Sep 12 17:13:13.932917 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 17:13:13.932955 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 12 15:59:19 -00 2025
Sep 12 17:13:13.932968 kernel: KASLR enabled
Sep 12 17:13:13.932974 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 12 17:13:13.932980 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Sep 12 17:13:13.932985 kernel: random: crng init done
Sep 12 17:13:13.932993 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:13:13.932999 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 12 17:13:13.933005 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 12 17:13:13.933013 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933020 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933026 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933032 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933038 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933046 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933054 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933060 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933067 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:13.933074 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 17:13:13.933080 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 12 17:13:13.933087 kernel: NUMA: Failed to initialise from firmware
Sep 12 17:13:13.933093 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 17:13:13.933100 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
Sep 12 17:13:13.933106 kernel: Zone ranges:
Sep 12 17:13:13.933112 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 12 17:13:13.933121 kernel: DMA32 empty
Sep 12 17:13:13.933127 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 12 17:13:13.933133 kernel: Movable zone start for each node
Sep 12 17:13:13.933140 kernel: Early memory node ranges
Sep 12 17:13:13.933146 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff]
Sep 12 17:13:13.933153 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 12 17:13:13.933160 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 12 17:13:13.933166 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 12 17:13:13.933173 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 12 17:13:13.933179 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 12 17:13:13.933185 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 12 17:13:13.933192 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 17:13:13.933200 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 12 17:13:13.933207 kernel: psci: probing for conduit method from ACPI.
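The firmware here exposes no NUMA topology, so the kernel fakes a single node spanning [mem 0x40000000-0x139ffffff]. As a quick sanity check (a standalone Python sketch, not part of the log; the constants are copied from the entries above), that span works out to exactly the 4096000K total that the "Memory:" line reports shortly afterwards:

```python
# Range of the single faked NUMA node, copied from the "NUMA: Faking a node" entry.
start, end = 0x0000000040000000, 0x0000000139ffffff

span = end - start + 1   # inclusive range, in bytes
print(span // 1024)      # 4096000 -> matches "Memory: .../4096000K available"
print(span // 4096)      # 1024000 4 KiB page frames in the span (before reservations)
```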
Sep 12 17:13:13.933213 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 17:13:13.933223 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 17:13:13.933230 kernel: psci: Trusted OS migration not required
Sep 12 17:13:13.933237 kernel: psci: SMC Calling Convention v1.1
Sep 12 17:13:13.933246 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 17:13:13.933253 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
Sep 12 17:13:13.933260 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
Sep 12 17:13:13.933267 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 17:13:13.933273 kernel: Detected PIPT I-cache on CPU0
Sep 12 17:13:13.933280 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 17:13:13.933287 kernel: CPU features: detected: Hardware dirty bit management
Sep 12 17:13:13.933294 kernel: CPU features: detected: Spectre-v4
Sep 12 17:13:13.933301 kernel: CPU features: detected: Spectre-BHB
Sep 12 17:13:13.933308 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 17:13:13.933316 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 17:13:13.933323 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 17:13:13.933329 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 17:13:13.933336 kernel: alternatives: applying boot alternatives
Sep 12 17:13:13.933345 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=1e63d3057914877efa0eb5f75703bd3a3d4c120bdf4a7ab97f41083e29183e56
Sep 12 17:13:13.933352 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:13:13.933359 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 17:13:13.933366 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:13:13.933372 kernel: Fallback order for Node 0: 0
Sep 12 17:13:13.933379 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Sep 12 17:13:13.933425 kernel: Policy zone: Normal
Sep 12 17:13:13.933435 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:13:13.933442 kernel: software IO TLB: area num 2.
Sep 12 17:13:13.933450 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Sep 12 17:13:13.933457 kernel: Memory: 3882744K/4096000K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39488K init, 897K bss, 213256K reserved, 0K cma-reserved)
Sep 12 17:13:13.933464 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:13:13.933471 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:13:13.933478 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:13:13.933485 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:13:13.933493 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:13:13.933499 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:13:13.933516 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
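The command line logged here drives the rest of the boot: root=LABEL=ROOT selects the root filesystem, mount.usr/verity.usr*/verity.usrhash pin the dm-verity-protected /usr partition, flatcar.first_boot=detected triggers Ignition, and flatcar.oem.id=hetzner selects the OEM profile. A minimal way to pull such parameters apart on a running system (a Python sketch reading the standard /proc/cmdline interface; it deliberately ignores quoted values containing spaces):

```python
# Split a kernel command line into bare flags and key=value parameters.
def parse_cmdline(cmdline: str) -> dict:
    params = {}
    for token in cmdline.split():
        key, sep, value = token.partition("=")
        params[key] = value if sep else True  # bare tokens become boolean flags
    return params

params = parse_cmdline(open("/proc/cmdline").read())
print(params.get("root"))            # "LABEL=ROOT" on this machine
print(params.get("flatcar.oem.id"))  # "hetzner"
```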
Sep 12 17:13:13.933525 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:13:13.933532 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 17:13:13.933539 kernel: GICv3: 256 SPIs implemented
Sep 12 17:13:13.933546 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 17:13:13.933552 kernel: Root IRQ handler: gic_handle_irq
Sep 12 17:13:13.933559 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 12 17:13:13.933566 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 12 17:13:13.933573 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 12 17:13:13.933580 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Sep 12 17:13:13.933587 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Sep 12 17:13:13.933594 kernel: GICv3: using LPI property table @0x00000001000e0000
Sep 12 17:13:13.933601 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Sep 12 17:13:13.933609 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:13:13.933616 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 17:13:13.933623 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 12 17:13:13.933630 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 12 17:13:13.933637 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 12 17:13:13.933644 kernel: Console: colour dummy device 80x25
Sep 12 17:13:13.933651 kernel: ACPI: Core revision 20230628
Sep 12 17:13:13.933659 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 12 17:13:13.933666 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:13:13.933673 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 12 17:13:13.933682 kernel: landlock: Up and running.
Sep 12 17:13:13.933689 kernel: SELinux: Initializing.
Sep 12 17:13:13.933696 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:13:13.933703 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:13:13.933710 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:13:13.933717 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:13:13.933724 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:13:13.933731 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:13:13.933739 kernel: Platform MSI: ITS@0x8080000 domain created
Sep 12 17:13:13.933748 kernel: PCI/MSI: ITS@0x8080000 domain created
Sep 12 17:13:13.933755 kernel: Remapping and enabling EFI services.
Sep 12 17:13:13.933762 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:13:13.933770 kernel: Detected PIPT I-cache on CPU1
Sep 12 17:13:13.933777 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 12 17:13:13.933785 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Sep 12 17:13:13.933792 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 17:13:13.933799 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 12 17:13:13.933806 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:13:13.933824 kernel: SMP: Total of 2 processors activated.
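Both vCPUs are online at this point. To cross-check the "Brought up 1 node, 2 CPUs" entry from userspace, the standard sysfs CPU mask is enough (a minimal sketch using only stock Linux interfaces, nothing Flatcar-specific):

```python
import os

print(os.cpu_count())  # 2 logical CPUs on this VM

# The kernel's own view of online CPUs, e.g. "0-1" for the two brought up here.
with open("/sys/devices/system/cpu/online") as f:
    print(f.read().strip())
```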
Sep 12 17:13:13.933835 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 17:13:13.933843 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 17:13:13.933857 kernel: CPU features: detected: Common not Private translations
Sep 12 17:13:13.933866 kernel: CPU features: detected: CRC32 instructions
Sep 12 17:13:13.933874 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 12 17:13:13.933882 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 17:13:13.933889 kernel: CPU features: detected: LSE atomic instructions
Sep 12 17:13:13.933896 kernel: CPU features: detected: Privileged Access Never
Sep 12 17:13:13.933904 kernel: CPU features: detected: RAS Extension Support
Sep 12 17:13:13.933914 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 17:13:13.933922 kernel: CPU: All CPU(s) started at EL1
Sep 12 17:13:13.933929 kernel: alternatives: applying system-wide alternatives
Sep 12 17:13:13.933937 kernel: devtmpfs: initialized
Sep 12 17:13:13.933944 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:13:13.933952 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:13:13.933960 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:13:13.933969 kernel: SMBIOS 3.0.0 present.
Sep 12 17:13:13.933977 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 12 17:13:13.933984 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:13:13.933992 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 17:13:13.934000 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 17:13:13.934008 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 17:13:13.934015 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:13:13.934023 kernel: audit: type=2000 audit(0.015:1): state=initialized audit_enabled=0 res=1
Sep 12 17:13:13.934031 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:13:13.934040 kernel: cpuidle: using governor menu
Sep 12 17:13:13.934048 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
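The DMI line is the identity later used to recognize this as a Hetzner KVM guest. The same strings are readable from sysfs without parsing dmesg; a small sketch (standard /sys/class/dmi/id attributes, some of which may be missing or root-only on other platforms):

```python
from pathlib import Path

dmi = Path("/sys/class/dmi/id")
for attr in ("sys_vendor", "product_name", "bios_version", "bios_date"):
    p = dmi / attr
    if p.exists():
        # These fields are what the kernel concatenates into the "DMI:" boot line.
        print(f"{attr}: {p.read_text().strip()}")
```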
Sep 12 17:13:13.934055 kernel: ASID allocator initialised with 32768 entries
Sep 12 17:13:13.934063 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:13:13.934070 kernel: Serial: AMBA PL011 UART driver
Sep 12 17:13:13.934077 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 17:13:13.934085 kernel: Modules: 0 pages in range for non-PLT usage
Sep 12 17:13:13.934092 kernel: Modules: 508992 pages in range for PLT usage
Sep 12 17:13:13.934100 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:13:13.934110 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:13:13.934117 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 17:13:13.934125 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 17:13:13.934132 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:13:13.934140 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:13:13.934147 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 17:13:13.934155 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 17:13:13.934162 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:13:13.934169 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:13:13.934179 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:13:13.934186 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:13:13.934194 kernel: ACPI: Interpreter enabled
Sep 12 17:13:13.934201 kernel: ACPI: Using GIC for interrupt routing
Sep 12 17:13:13.934208 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 17:13:13.934216 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 17:13:13.934223 kernel: printk: console [ttyAMA0] enabled
Sep 12 17:13:13.934231 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 17:13:13.934426 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:13:13.934517 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 17:13:13.934586 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 17:13:13.934651 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 17:13:13.934717 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 17:13:13.934727 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 17:13:13.934735 kernel: PCI host bridge to bus 0000:00
Sep 12 17:13:13.934812 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 17:13:13.937107 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 17:13:13.937173 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 17:13:13.937232 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 17:13:13.937328 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Sep 12 17:13:13.937476 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Sep 12 17:13:13.937555 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Sep 12 17:13:13.937632 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 17:13:13.937709 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.937779 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Sep 12 17:13:13.938959 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.939096 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Sep 12 17:13:13.939179 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.939256 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Sep 12 17:13:13.939336 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.939456 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Sep 12 17:13:13.939540 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.939609 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Sep 12 17:13:13.939688 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.939763 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Sep 12 17:13:13.940042 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.940125 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Sep 12 17:13:13.940208 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.940275 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Sep 12 17:13:13.940349 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:13.940434 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Sep 12 17:13:13.940525 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Sep 12 17:13:13.940592 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Sep 12 17:13:13.940673 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:13:13.940745 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Sep 12 17:13:13.941857 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 17:13:13.942034 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 17:13:13.942132 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 12 17:13:13.942204 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Sep 12 17:13:13.942288 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 12 17:13:13.942358 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Sep 12 17:13:13.942476 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 12 17:13:13.942567 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 12 17:13:13.942639 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 12 17:13:13.942728 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 12 17:13:13.942799 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 12 17:13:13.944077 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 12 17:13:13.944166 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Sep 12 17:13:13.944294 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 17:13:13.944378 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:13:13.944475 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Sep 12 17:13:13.944547 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 12 17:13:13.944615 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 17:13:13.944690 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 12 17:13:13.944760 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 12 17:13:13.944941 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 12 17:13:13.945028 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 12 17:13:13.945095 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 12 17:13:13.945160 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 12 17:13:13.945234 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 12 17:13:13.945299 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 12 17:13:13.945364 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 12 17:13:13.945491 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 12 17:13:13.945564 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 12 17:13:13.945634 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 12 17:13:13.945707 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 12 17:13:13.945775 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 12 17:13:13.946392 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 12 17:13:13.946492 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 12 17:13:13.946563 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 12 17:13:13.946630 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 12 17:13:13.946710 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 17:13:13.946780 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 12 17:13:13.948039 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 12 17:13:13.948137 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 17:13:13.948204 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 12 17:13:13.948271 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 12 17:13:13.948343 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 17:13:13.948432 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 12 17:13:13.948510 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 12 17:13:13.948581 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Sep 12 17:13:13.948653 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:13.948723 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Sep 12 17:13:13.948790 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:13.948931 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Sep 12 17:13:13.949008 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:13.949086 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Sep 12 17:13:13.949151 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:13.949223 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Sep 12 17:13:13.949289 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:13.949361 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:13.949948 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:13.950039 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:13.950106 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:13.950175 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:13.950240 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:13.950310 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Sep 12 17:13:13.950376 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:13.950474 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Sep 12 17:13:13.950551 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Sep 12 17:13:13.950623 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Sep 12 17:13:13.950691 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 12 17:13:13.950761 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Sep 12 17:13:13.951903 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 12 17:13:13.952013 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Sep 12 17:13:13.952084 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 12 17:13:13.952156 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Sep 12 17:13:13.952231 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 12 17:13:13.952302 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Sep 12 17:13:13.952370 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 12 17:13:13.952463 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Sep 12 17:13:13.952535 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 12 17:13:13.952607 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Sep 12 17:13:13.952676 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 12 17:13:13.952746 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Sep 12 17:13:13.952936 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 12 17:13:13.953026 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Sep 12 17:13:13.953093 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Sep 12 17:13:13.953172 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Sep 12 17:13:13.953249 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Sep 12 17:13:13.953318 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 17:13:13.953400 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Sep 12 17:13:13.953472 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:13:13.953548 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 12 17:13:13.954228 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 12 17:13:13.954300 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:13.954378 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Sep 12 17:13:13.954513 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:13:13.954593 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 12 17:13:13.954660 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 12 17:13:13.954725 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:13.954803 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 17:13:13.955263 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Sep 12 17:13:13.955343 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:13:13.955466 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 12 17:13:13.955551 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 12 17:13:13.955621 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:13.955701 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 17:13:13.955771 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:13:13.955866 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 12 17:13:13.955935 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 12 17:13:13.956000 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:13.956077 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Sep 12 17:13:13.956153 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:13:13.956220 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 12 17:13:13.956286 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 12 17:13:13.956351 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:13.956447 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Sep 12 17:13:13.956521 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Sep 12 17:13:13.956592 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:13:13.956659 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 12 17:13:13.956730 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:13.956795 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:13.956906 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Sep 12 17:13:13.956979 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Sep 12 17:13:13.957048 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Sep 12 17:13:13.957118 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:13:13.957187 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 12 17:13:13.957254 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:13.957328 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:13.957443 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:13:13.957522 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 12 17:13:13.957588 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:13.957656 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:13.957728 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:13:13.957794 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 12 17:13:13.957950 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 12 17:13:13.958029 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:13.958099 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 17:13:13.958159 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 17:13:13.958217 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 17:13:13.958292 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 12 17:13:13.958353 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 12 17:13:13.958431 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:13.958511 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 12 17:13:13.958573 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 12 17:13:13.958633 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:13.958702 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 12 17:13:13.958761 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 12 17:13:13.958869 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:13.958958 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 12 17:13:13.959021 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 12 17:13:13.959085 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:13.959169 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 12 17:13:13.959234 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 12 17:13:13.959296 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:13.959367 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 12 17:13:13.959491 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:13.959557 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:13.959638 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 12 17:13:13.959702 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:13.959771 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:13.960028 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 12 17:13:13.961967 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:13.962047 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:13.962127 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Sep 12 17:13:13.962192 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Sep 12 17:13:13.962252 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:13.962266 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 12 17:13:13.962274 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 12 17:13:13.962283 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 12 17:13:13.962291 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 12 17:13:13.962299 kernel: iommu: Default domain type: Translated
Sep 12 17:13:13.962307 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 17:13:13.962315 kernel: efivars: Registered efivars operations
Sep 12 17:13:13.962323 kernel: vgaarb: loaded
Sep 12 17:13:13.962331 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 17:13:13.962341 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:13:13.962350 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:13:13.962358 kernel: pnp: PnP ACPI init
Sep 12 17:13:13.962501 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 12 17:13:13.962516 kernel: pnp: PnP ACPI: found 1 devices
Sep 12 17:13:13.962524 kernel: NET: Registered PF_INET protocol family
Sep 12 17:13:13.962532 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:13:13.962540 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 17:13:13.962554 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:13:13.962563 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:13:13.962571 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 17:13:13.962579 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 17:13:13.962587 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:13:13.962595 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:13:13.962603 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:13:13.962691 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:13.962704 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:13:13.962715 kernel: kvm [1]: HYP mode not available
Sep 12 17:13:13.962724 kernel: Initialise system trusted keyrings
Sep 12 17:13:13.962732 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 17:13:13.962739 kernel: Key type asymmetric registered
Sep 12 17:13:13.962748 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:13:13.962756 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 17:13:13.962764 kernel: io scheduler mq-deadline registered
Sep 12 17:13:13.962772 kernel: io scheduler kyber registered
Sep 12 17:13:13.962780 kernel: io scheduler bfq registered
Sep 12 17:13:13.962790 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Sep 12 17:13:13.962919 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Sep 12 17:13:13.963007 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Sep 12 17:13:13.963076 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.963147 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Sep 12 17:13:13.963214 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Sep 12 17:13:13.963298 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.963374 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Sep 12 17:13:13.963463 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
Sep 12 17:13:13.963539 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.963614 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53
Sep 12 17:13:13.963684 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53
Sep 12 17:13:13.963757 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.963854 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54
Sep 12 17:13:13.963929 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54
Sep 12 17:13:13.963999 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.964073 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55
Sep 12 17:13:13.964141 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55
Sep 12 17:13:13.964211 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.964285 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56
Sep 12 17:13:13.964354 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56
Sep 12 17:13:13.964438 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.964513 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57
Sep 12 17:13:13.964581 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57
Sep 12 17:13:13.964652 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.964664 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38
Sep 12 17:13:13.964734 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
Sep 12 17:13:13.964802 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Sep 12 17:13:13.967070 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:13.967100 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 12 17:13:13.967109 kernel: ACPI: button: Power Button [PWRB]
Sep 12 17:13:13.967118 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 12 17:13:13.967212 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:13.967293 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:13.967305 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:13:13.967313 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Sep 12 17:13:13.967401 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Sep 12 17:13:13.967415 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Sep 12 17:13:13.967423 kernel: thunder_xcv, ver 1.0
Sep 12 17:13:13.967431 kernel: thunder_bgx, ver 1.0
Sep 12 17:13:13.967442 kernel: nicpf, ver 1.0
Sep 12 17:13:13.967450 kernel: nicvf, ver 1.0
Sep 12 17:13:13.967547 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 17:13:13.967613 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T17:13:13 UTC (1757697193)
Sep 12 17:13:13.967624 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:13:13.967632 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Sep 12 17:13:13.967640 kernel: watchdog: Delayed init of the lockup detector failed: -19
Sep 12 17:13:13.967648 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 17:13:13.967660 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:13:13.967668 kernel: Segment Routing with IPv6
Sep 12 17:13:13.967676 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:13:13.967684 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:13:13.967692 kernel: Key type dns_resolver registered
Sep 12 17:13:13.967700 kernel: registered taskstats version 1
Sep 12 17:13:13.967708 kernel: Loading compiled-in X.509 certificates
Sep 12 17:13:13.967716 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 2d576b5e69e6c5de2f731966fe8b55173c144d02'
Sep 12 17:13:13.967724 kernel: Key type .fscrypt registered
Sep 12 17:13:13.967733 kernel: Key type fscrypt-provisioning registered
Sep 12 17:13:13.967742 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:13:13.967750 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:13:13.967758 kernel: ima: No architecture policies found
Sep 12 17:13:13.967766 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 17:13:13.967774 kernel: clk: Disabling unused clocks
Sep 12 17:13:13.967782 kernel: Freeing unused kernel memory: 39488K
Sep 12 17:13:13.967789 kernel: Run /init as init process
Sep 12 17:13:13.967797 kernel: with arguments:
Sep 12 17:13:13.967808 kernel: /init
Sep 12 17:13:13.967899 kernel: with environment:
Sep 12 17:13:13.967908 kernel: HOME=/
Sep 12 17:13:13.967917 kernel: TERM=linux
Sep 12 17:13:13.967924 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:13:13.967934 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 12 17:13:13.967945 systemd[1]: Detected virtualization kvm.
Sep 12 17:13:13.967953 systemd[1]: Detected architecture arm64.
Sep 12 17:13:13.967965 systemd[1]: Running in initrd.
Sep 12 17:13:13.967973 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:13:13.967981 systemd[1]: Hostname set to <localhost>.
Sep 12 17:13:13.967990 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:13:13.967998 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:13:13.968007 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:13:13.968016 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:13:13.968025 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:13:13.968036 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:13:13.968045 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:13:13.968054 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:13:13.968064 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
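Everything the kernel enumerated above is paravirtual QEMU hardware: vendor ID 1af4 is virtio (GPU, network, console, balloon, SCSI, fs devices), and 1b36 is QEMU's own ID space (pcie-root-ports, the xHCI controller). The same tree is visible after boot through sysfs; a minimal sketch using only the standard /sys/bus/pci attributes (the example values in comments are taken from the log entries above):

```python
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    vendor = (dev / "vendor").read_text().strip()  # e.g. "0x1af4" (virtio)
    device = (dev / "device").read_text().strip()  # e.g. "0x1041" (virtio-net)
    pclass = (dev / "class").read_text().strip()   # e.g. "0x020000" (Ethernet)
    # Prints lines comparable to "pci 0000:01:00.0: [1af4:1041] ... class 0x020000".
    print(f"{dev.name} [{vendor[2:]}:{device[2:]}] class {pclass}")
```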
Sep 12 17:13:13.968072 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:13:13.968083 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:13:13.968092 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:13:13.968102 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:13:13.968111 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:13:13.968120 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:13:13.968128 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:13:13.968137 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:13:13.968145 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:13:13.968154 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:13:13.968163 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 12 17:13:13.968174 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:13:13.968184 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:13:13.968192 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:13:13.968201 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:13:13.968210 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:13:13.968218 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:13:13.968227 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:13:13.968236 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:13:13.968244 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:13:13.968255 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:13:13.968263 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:13.968271 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:13:13.968280 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:13:13.968288 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:13:13.968326 systemd-journald[236]: Collecting audit messages is disabled.
Sep 12 17:13:13.968350 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:13:13.968360 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:13.968371 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:13:13.968380 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:13.968399 kernel: Bridge firewalling registered
Sep 12 17:13:13.968408 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:13:13.968416 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:13:13.968425 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:13:13.968435 systemd-journald[236]: Journal started
Sep 12 17:13:13.968459 systemd-journald[236]: Runtime Journal (/run/log/journal/b9ce13b319cc4401bb978432e42588a6) is 8.0M, max 76.6M, 68.6M free.
Sep 12 17:13:13.920622 systemd-modules-load[237]: Inserted module 'overlay'
Sep 12 17:13:13.944644 systemd-modules-load[237]: Inserted module 'br_netfilter'
Sep 12 17:13:13.973857 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:13:13.977852 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:13:13.978161 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:13:13.986497 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:13:13.997374 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:13:13.999506 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:14.011155 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:13:14.013941 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:13:14.017139 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:13:14.033646 dracut-cmdline[271]: dracut-dracut-053
Sep 12 17:13:14.039128 dracut-cmdline[271]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=1e63d3057914877efa0eb5f75703bd3a3d4c120bdf4a7ab97f41083e29183e56
Sep 12 17:13:14.058732 systemd-resolved[273]: Positive Trust Anchors:
Sep 12 17:13:14.058754 systemd-resolved[273]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:13:14.058787 systemd-resolved[273]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:13:14.066138 systemd-resolved[273]: Defaulting to hostname 'linux'.
Sep 12 17:13:14.067621 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:13:14.069140 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:13:14.150857 kernel: SCSI subsystem initialized
Sep 12 17:13:14.155899 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:13:14.164864 kernel: iscsi: registered transport (tcp)
Sep 12 17:13:14.179870 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:13:14.179972 kernel: QLogic iSCSI HBA Driver
Sep 12 17:13:14.233801 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:13:14.242124 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:13:14.263850 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:13:14.265041 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:13:14.265087 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 12 17:13:14.317890 kernel: raid6: neonx8 gen() 15638 MB/s
Sep 12 17:13:14.334839 kernel: raid6: neonx4 gen() 13357 MB/s
Sep 12 17:13:14.351873 kernel: raid6: neonx2 gen() 13148 MB/s
Sep 12 17:13:14.368889 kernel: raid6: neonx1 gen() 10425 MB/s
Sep 12 17:13:14.385869 kernel: raid6: int64x8 gen() 6915 MB/s
Sep 12 17:13:14.402877 kernel: raid6: int64x4 gen() 7300 MB/s
Sep 12 17:13:14.419869 kernel: raid6: int64x2 gen() 6099 MB/s
Sep 12 17:13:14.436915 kernel: raid6: int64x1 gen() 5031 MB/s
Sep 12 17:13:14.437021 kernel: raid6: using algorithm neonx8 gen() 15638 MB/s
Sep 12 17:13:14.453885 kernel: raid6: .... xor() 11997 MB/s, rmw enabled
Sep 12 17:13:14.453970 kernel: raid6: using neon recovery algorithm
Sep 12 17:13:14.458863 kernel: xor: measuring software checksum speed
Sep 12 17:13:14.458920 kernel: 8regs : 9432 MB/sec
Sep 12 17:13:14.460019 kernel: 32regs : 19660 MB/sec
Sep 12 17:13:14.460052 kernel: arm64_neon : 26760 MB/sec
Sep 12 17:13:14.460072 kernel: xor: using function: arm64_neon (26760 MB/sec)
Sep 12 17:13:14.516883 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:13:14.533077 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:13:14.540116 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:13:14.568783 systemd-udevd[455]: Using default interface naming scheme 'v255'.
Sep 12 17:13:14.572753 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:13:14.585053 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:13:14.603634 dracut-pre-trigger[466]: rd.md=0: removing MD RAID activation
Sep 12 17:13:14.649179 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:13:14.665242 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:13:14.724168 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:13:14.734093 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:13:14.752694 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:13:14.756581 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:13:14.758867 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:13:14.759537 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:13:14.768105 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:13:14.785231 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:13:14.826381 kernel: scsi host0: Virtio SCSI HBA
Sep 12 17:13:14.838977 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 12 17:13:14.839074 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 12 17:13:14.855861 kernel: ACPI: bus type USB registered
Sep 12 17:13:14.856541 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:13:14.862556 kernel: usbcore: registered new interface driver usbfs
Sep 12 17:13:14.856673 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
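The Virtio SCSI HBA exposes two devices: a QEMU CD-ROM at 0:0:0:0 (the config-drive slot) and the QEMU HARDDISK at 0:0:0:1 that becomes sda below. To list them from userspace the same way, the standard /sys/class/scsi_device hierarchy is enough (a minimal sketch; vendor/model strings match the "scsi 0:0:0:N" entries above):

```python
from pathlib import Path

for sdev in sorted(Path("/sys/class/scsi_device").iterdir()):
    d = sdev / "device"
    vendor = (d / "vendor").read_text().strip()  # "QEMU"
    model = (d / "model").read_text().strip()    # "QEMU CD-ROM" or "QEMU HARDDISK"
    print(sdev.name, vendor, model)              # e.g. "0:0:0:1 QEMU QEMU HARDDISK"
```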
Sep 12 17:13:14.860734 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:14.867925 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:13:14.868188 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:14.869198 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:14.878836 kernel: usbcore: registered new interface driver hub
Sep 12 17:13:14.878881 kernel: usbcore: registered new device driver usb
Sep 12 17:13:14.881152 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:14.895906 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 12 17:13:14.897847 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 12 17:13:14.898092 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 12 17:13:14.907984 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 12 17:13:14.911859 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:13:14.913730 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 12 17:13:14.913850 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 12 17:13:14.916853 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:14.920875 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 12 17:13:14.921122 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:13:14.921236 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 12 17:13:14.921320 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 12 17:13:14.921437 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 12 17:13:14.921523 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:13:14.922427 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 12 17:13:14.923906 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 12 17:13:14.924085 kernel: hub 1-0:1.0: USB hub found
Sep 12 17:13:14.925531 kernel: hub 1-0:1.0: 4 ports detected
Sep 12 17:13:14.925701 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 12 17:13:14.928309 kernel: hub 2-0:1.0: USB hub found
Sep 12 17:13:14.927089 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:14.933040 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:13:14.933069 kernel: hub 2-0:1.0: 4 ports detected
Sep 12 17:13:14.933271 kernel: GPT:17805311 != 80003071
Sep 12 17:13:14.933283 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:13:14.933293 kernel: GPT:17805311 != 80003071
Sep 12 17:13:14.933310 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:13:14.933320 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:13:14.934909 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 12 17:13:14.969120 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:14.991989 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (507)
Sep 12 17:13:14.993870 kernel: BTRFS: device fsid 5a23a06a-00d4-4606-89bf-13e31a563129 devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (512)
Sep 12 17:13:14.997495 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
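The GPT complaints are expected on a first boot: the image carries its backup GPT header at LBA 17805311 (the size of the original disk image), while the provisioned disk actually ends at LBA 80003071; the disk-uuid step just below rewrites the headers. A quick check of the numbers from the log (a standalone sketch, constants copied from the entries above):

```python
disk_sectors = 80003072          # "sd 0:0:0:1: [sda] 80003072 512-byte logical blocks"
alt_header_lba = 17805311        # where the backup GPT header currently sits
expected_lba = disk_sectors - 1  # a backup GPT header belongs on the last LBA

print(alt_header_lba == expected_lba)                 # False -> the kernel warning
print((expected_lba - alt_header_lba) * 512 / 2**30)  # ~29.7 GiB not yet claimed by the GPT
```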
Sep 12 17:13:15.002768 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 12 17:13:15.018031 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 17:13:15.022963 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Sep 12 17:13:15.023763 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 12 17:13:15.031052 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 17:13:15.040910 disk-uuid[572]: Primary Header is updated. Sep 12 17:13:15.040910 disk-uuid[572]: Secondary Entries is updated. Sep 12 17:13:15.040910 disk-uuid[572]: Secondary Header is updated. Sep 12 17:13:15.170913 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 12 17:13:15.304838 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Sep 12 17:13:15.305879 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 12 17:13:15.306288 kernel: usbcore: registered new interface driver usbhid Sep 12 17:13:15.306315 kernel: usbhid: USB HID core driver Sep 12 17:13:15.413872 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Sep 12 17:13:15.543906 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Sep 12 17:13:15.596871 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Sep 12 17:13:16.054847 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:13:16.055650 disk-uuid[574]: The operation has completed successfully. Sep 12 17:13:16.107356 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 17:13:16.108205 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 17:13:16.135425 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 17:13:16.139579 sh[583]: Success Sep 12 17:13:16.150844 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Sep 12 17:13:16.219123 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 17:13:16.221777 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 17:13:16.223850 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 17:13:16.254876 kernel: BTRFS info (device dm-0): first mount of filesystem 5a23a06a-00d4-4606-89bf-13e31a563129 Sep 12 17:13:16.254954 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:13:16.254978 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 12 17:13:16.255999 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 17:13:16.256031 kernel: BTRFS info (device dm-0): using free space tree Sep 12 17:13:16.262843 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 17:13:16.264980 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 17:13:16.267301 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 17:13:16.275075 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
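verity-setup.service above assembles /dev/mapper/usr so that every block read from the read-only /usr image is hash-checked before it is handed to the filesystem (the kernel line shows sha256 via the accelerated sha256-ce implementation). A conceptual sketch of that check only: real dm-verity verifies 4 KiB blocks against a Merkle hash tree whose root hash is supplied on the kernel command line, whereas the flat digest list below merely stands in for that tree.

    import hashlib

    BLOCK = 4096  # dm-verity's default data block size

    def block_digests(data: bytes) -> list:
        # Trusted reference digests, built when the image is created.
        return [hashlib.sha256(data[i:i + BLOCK]).digest()
                for i in range(0, len(data), BLOCK)]

    def verified_read(data: bytes, index: int, trusted: list) -> bytes:
        # Refuse to return any block whose content no longer matches.
        block = data[index * BLOCK:(index + 1) * BLOCK]
        if hashlib.sha256(block).digest() != trusted[index]:
            raise IOError(f"verity: hash mismatch in block {index}")
        return block

    image = bytes(BLOCK * 4)           # stand-in for the read-only /usr image
    trusted = block_digests(image)     # shipped alongside the image
    verified_read(image, 2, trusted)   # clean read succeeds
    tampered = image[:BLOCK * 2] + b"\x01" + image[BLOCK * 2 + 1:]
    try:
        verified_read(tampered, 2, trusted)
    except IOError as err:
        print(err)  # verity: hash mismatch in block 2
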
Sep 12 17:13:16.280529 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 17:13:16.291412 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c Sep 12 17:13:16.291486 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:13:16.291498 kernel: BTRFS info (device sda6): using free space tree Sep 12 17:13:16.298861 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:13:16.298906 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 17:13:16.312296 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 12 17:13:16.313930 kernel: BTRFS info (device sda6): last unmount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c Sep 12 17:13:16.323536 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 17:13:16.332142 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 17:13:16.417701 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:13:16.426146 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:13:16.438606 ignition[675]: Ignition 2.19.0 Sep 12 17:13:16.438621 ignition[675]: Stage: fetch-offline Sep 12 17:13:16.438663 ignition[675]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:16.438686 ignition[675]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:16.438899 ignition[675]: parsed url from cmdline: "" Sep 12 17:13:16.438903 ignition[675]: no config URL provided Sep 12 17:13:16.438908 ignition[675]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:13:16.438915 ignition[675]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:13:16.438921 ignition[675]: failed to fetch config: resource requires networking Sep 12 17:13:16.439147 ignition[675]: Ignition finished successfully Sep 12 17:13:16.446910 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:13:16.455034 systemd-networkd[771]: lo: Link UP Sep 12 17:13:16.455047 systemd-networkd[771]: lo: Gained carrier Sep 12 17:13:16.456785 systemd-networkd[771]: Enumeration completed Sep 12 17:13:16.456941 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:13:16.457694 systemd[1]: Reached target network.target - Network. Sep 12 17:13:16.460641 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:16.460645 systemd-networkd[771]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:16.462194 systemd-networkd[771]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:16.462197 systemd-networkd[771]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:16.463162 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 17:13:16.465808 systemd-networkd[771]: eth0: Link UP Sep 12 17:13:16.465812 systemd-networkd[771]: eth0: Gained carrier Sep 12 17:13:16.465846 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
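The fetch-offline stage above logs parsed url from cmdline: "" before giving up: Ignition first looks for a config URL among the kernel arguments, then for /usr/lib/ignition/user.ign, and only then turns to the platform's metadata service, which needs networking. A small sketch of that first lookup; ignition.config.url is Ignition's documented argument name, and the sample command line here is illustrative rather than taken from this boot:

    def cmdline_value(cmdline: str, key: str) -> str:
        # Return the value of a key=value kernel argument, or '' if absent.
        for token in cmdline.split():
            k, sep, v = token.partition("=")
            if sep and k == key:
                return v
        return ""

    # On a live system this would be open("/proc/cmdline").read().
    sample = "console=tty0 root=LABEL=ROOT quiet"
    print(repr(cmdline_value(sample, "ignition.config.url")))  # '' -> no config URL provided
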
Sep 12 17:13:16.470198 systemd-networkd[771]: eth1: Link UP Sep 12 17:13:16.470201 systemd-networkd[771]: eth1: Gained carrier Sep 12 17:13:16.470212 systemd-networkd[771]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:16.481855 ignition[774]: Ignition 2.19.0 Sep 12 17:13:16.483001 ignition[774]: Stage: fetch Sep 12 17:13:16.483718 ignition[774]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:16.483734 ignition[774]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:16.483858 ignition[774]: parsed url from cmdline: "" Sep 12 17:13:16.483862 ignition[774]: no config URL provided Sep 12 17:13:16.483867 ignition[774]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:13:16.483877 ignition[774]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:13:16.483900 ignition[774]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 12 17:13:16.484805 ignition[774]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 12 17:13:16.499916 systemd-networkd[771]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:13:16.521941 systemd-networkd[771]: eth0: DHCPv4 address 188.245.115.118/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:13:16.685896 ignition[774]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 12 17:13:16.689562 ignition[774]: GET result: OK Sep 12 17:13:16.689663 ignition[774]: parsing config with SHA512: fa09e4f7a731f8772c751e829a9aff8da1dff6faaa7dcba47403cd814faa7b3aba9894aff0979b21f13c78c871893dc0cbcf93512536ce5823954b399f55b92a Sep 12 17:13:16.694404 unknown[774]: fetched base config from "system" Sep 12 17:13:16.694421 unknown[774]: fetched base config from "system" Sep 12 17:13:16.694947 ignition[774]: fetch: fetch complete Sep 12 17:13:16.694428 unknown[774]: fetched user config from "hetzner" Sep 12 17:13:16.694971 ignition[774]: fetch: fetch passed Sep 12 17:13:16.695029 ignition[774]: Ignition finished successfully Sep 12 17:13:16.698552 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 17:13:16.706098 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 17:13:16.718699 ignition[781]: Ignition 2.19.0 Sep 12 17:13:16.718708 ignition[781]: Stage: kargs Sep 12 17:13:16.718935 ignition[781]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:16.718947 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:16.719976 ignition[781]: kargs: kargs passed Sep 12 17:13:16.720032 ignition[781]: Ignition finished successfully Sep 12 17:13:16.723665 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 17:13:16.731156 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 17:13:16.749771 ignition[789]: Ignition 2.19.0 Sep 12 17:13:16.749790 ignition[789]: Stage: disks Sep 12 17:13:16.752489 ignition[789]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:16.752503 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:16.753561 ignition[789]: disks: disks passed Sep 12 17:13:16.753630 ignition[789]: Ignition finished successfully Sep 12 17:13:16.757234 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 17:13:16.758076 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
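The fetch stage's first GET above fails with "network is unreachable" because DHCP has not configured the interfaces yet; attempt #2 succeeds once the leases arrive, and the received config is identified by its SHA512. A sketch of that retry-then-hash flow, using the userdata URL from the log; the retry count and delay below are arbitrary choices, not Ignition's actual backoff:

    import hashlib
    import time
    import urllib.error
    import urllib.request

    URL = "http://169.254.169.254/hetzner/v1/userdata"  # from the GET lines above

    def fetch_userdata(attempts: int = 5, delay: float = 2.0) -> bytes:
        for attempt in range(1, attempts + 1):
            try:
                with urllib.request.urlopen(URL, timeout=10) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError) as err:
                # e.g. "dial tcp ... network is unreachable" while DHCP is pending
                print(f"GET attempt #{attempt} failed: {err}")
                time.sleep(delay)
        raise RuntimeError("giving up: no userdata")

    # Only meaningful on a cloud instance with a metadata service:
    # config = fetch_userdata()
    # print("parsing config with SHA512:", hashlib.sha512(config).hexdigest())
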
Sep 12 17:13:16.759830 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 17:13:16.762557 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:13:16.763670 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:13:16.764698 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:13:16.769035 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 17:13:16.790229 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Sep 12 17:13:16.795241 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 17:13:16.803998 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 17:13:16.855121 kernel: EXT4-fs (sda9): mounted filesystem fc6c61a7-153d-4e7f-95c0-bffdb4824d71 r/w with ordered data mode. Quota mode: none. Sep 12 17:13:16.856396 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 17:13:16.857964 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 17:13:16.866063 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:13:16.869996 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 17:13:16.874054 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 17:13:16.875805 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 17:13:16.877334 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:13:16.882441 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 17:13:16.886855 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (805) Sep 12 17:13:16.888849 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c Sep 12 17:13:16.888905 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:13:16.888918 kernel: BTRFS info (device sda6): using free space tree Sep 12 17:13:16.890741 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 17:13:16.896927 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:13:16.897023 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 17:13:16.900688 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:13:16.953986 initrd-setup-root[833]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 17:13:16.958279 coreos-metadata[807]: Sep 12 17:13:16.958 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 12 17:13:16.960467 coreos-metadata[807]: Sep 12 17:13:16.960 INFO Fetch successful Sep 12 17:13:16.962326 coreos-metadata[807]: Sep 12 17:13:16.960 INFO wrote hostname ci-4081-3-6-e-c5bf4513f4 to /sysroot/etc/hostname Sep 12 17:13:16.964109 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 17:13:16.967246 initrd-setup-root[840]: cut: /sysroot/etc/group: No such file or directory Sep 12 17:13:16.972091 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 17:13:16.976584 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 17:13:17.079387 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 17:13:17.087236 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
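flatcar-metadata-hostname.service above does two small things: GET the hostname from the metadata endpoint and write it into the not-yet-pivoted root at /sysroot/etc/hostname. A sketch of the same two steps, pointed at a scratch path so it is harmless to run:

    import urllib.request

    METADATA_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"  # from the log
    TARGET = "/tmp/hostname-demo"  # the real agent writes /sysroot/etc/hostname

    def write_hostname(url: str = METADATA_URL, target: str = TARGET) -> None:
        with urllib.request.urlopen(url, timeout=10) as resp:
            hostname = resp.read().decode().strip()
        with open(target, "w") as f:
            f.write(hostname + "\n")
        print(f"wrote hostname {hostname} to {target}")

    # write_hostname()  # requires a reachable metadata service
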
Sep 12 17:13:17.093560 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 17:13:17.102946 kernel: BTRFS info (device sda6): last unmount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c Sep 12 17:13:17.123432 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 17:13:17.130849 ignition[923]: INFO : Ignition 2.19.0 Sep 12 17:13:17.130849 ignition[923]: INFO : Stage: mount Sep 12 17:13:17.130849 ignition[923]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:17.130849 ignition[923]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:17.134654 ignition[923]: INFO : mount: mount passed Sep 12 17:13:17.134654 ignition[923]: INFO : Ignition finished successfully Sep 12 17:13:17.134760 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 17:13:17.144002 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 17:13:17.254553 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 17:13:17.262083 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:13:17.284785 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (934) Sep 12 17:13:17.284888 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c Sep 12 17:13:17.284914 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:13:17.285637 kernel: BTRFS info (device sda6): using free space tree Sep 12 17:13:17.290884 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:13:17.290972 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 17:13:17.294987 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:13:17.316852 ignition[951]: INFO : Ignition 2.19.0 Sep 12 17:13:17.316852 ignition[951]: INFO : Stage: files Sep 12 17:13:17.320914 ignition[951]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:17.320914 ignition[951]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:17.320914 ignition[951]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:13:17.323611 ignition[951]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:13:17.323611 ignition[951]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:13:17.328139 ignition[951]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:13:17.329513 ignition[951]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:13:17.331306 unknown[951]: wrote ssh authorized keys file for user: core Sep 12 17:13:17.332284 ignition[951]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:13:17.335038 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 12 17:13:17.336195 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Sep 12 17:13:17.450296 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:13:17.509999 systemd-networkd[771]: eth1: Gained IPv6LL Sep 12 17:13:17.702696 systemd-networkd[771]: eth0: Gained IPv6LL Sep 12 17:13:17.763632 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 12 17:13:17.763632 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 17:13:17.766583 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Sep 12 17:13:18.055289 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:13:18.676409 ignition[951]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 17:13:18.676409 ignition[951]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:13:18.681855 ignition[951]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:13:18.681855 ignition[951]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:13:18.681855 ignition[951]: INFO : files: files passed Sep 12 17:13:18.681855 ignition[951]: INFO : Ignition finished successfully Sep 12 17:13:18.681218 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:13:18.691620 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:13:18.695052 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:13:18.700196 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:13:18.700316 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:13:18.719693 initrd-setup-root-after-ignition[979]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:13:18.719693 initrd-setup-root-after-ignition[979]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:13:18.722892 initrd-setup-root-after-ignition[983]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:13:18.724685 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:13:18.725725 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:13:18.731214 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:13:18.770420 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:13:18.770650 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:13:18.774740 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:13:18.775744 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:13:18.777978 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 17:13:18.783033 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:13:18.801245 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:13:18.807054 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:13:18.820031 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:13:18.820863 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:13:18.823379 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:13:18.825839 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:13:18.826028 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
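Each Ignition operation above is logged twice, as op(N): [started] and op(N): [finished]; "files passed" means every started op also finished. A short sketch that audits a journal excerpt for ops that never completed (the three sample lines are abridged from the log above, with the last one deliberately left unfinished):

    import re

    log = """ignition[951]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
    ignition[951]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
    ignition[951]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
    """

    started, finished = set(), set()
    for m in re.finditer(r"op\(([0-9a-f]+)\): \[(started|finished)\]", log):
        (started if m.group(2) == "started" else finished).add(m.group(1))

    print("unfinished ops:", started - finished)  # {'f'} in this truncated sample
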
Sep 12 17:13:18.827973 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:13:18.829616 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:13:18.830843 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:13:18.832950 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:13:18.834092 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:13:18.836213 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:13:18.836891 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:13:18.838098 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:13:18.839396 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 17:13:18.840592 systemd[1]: Stopped target swap.target - Swaps. Sep 12 17:13:18.841697 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:13:18.841846 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:13:18.843799 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:13:18.844476 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:13:18.845899 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 17:13:18.845985 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:13:18.847233 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:13:18.847388 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:13:18.849044 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:13:18.849162 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:13:18.850631 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:13:18.850723 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:13:18.851667 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 17:13:18.851763 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 17:13:18.862156 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:13:18.865499 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:13:18.867997 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 17:13:18.868159 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:13:18.869953 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:13:18.870323 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:13:18.881182 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:13:18.882413 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 12 17:13:18.887430 ignition[1003]: INFO : Ignition 2.19.0 Sep 12 17:13:18.888908 ignition[1003]: INFO : Stage: umount Sep 12 17:13:18.888908 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:13:18.888908 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:13:18.895895 ignition[1003]: INFO : umount: umount passed Sep 12 17:13:18.895895 ignition[1003]: INFO : Ignition finished successfully Sep 12 17:13:18.895212 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 17:13:18.898483 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:13:18.898606 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:13:18.899588 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:13:18.899677 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:13:18.901580 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:13:18.901647 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:13:18.903072 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 17:13:18.903132 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:13:18.903944 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 17:13:18.903987 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 17:13:18.904991 systemd[1]: Stopped target network.target - Network. Sep 12 17:13:18.905873 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:13:18.905928 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:13:18.906953 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:13:18.907780 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:13:18.911915 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:13:18.914516 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 17:13:18.915920 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:13:18.917892 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:13:18.917969 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:13:18.919375 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 17:13:18.919425 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:13:18.920376 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:13:18.920430 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:13:18.921431 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:13:18.921474 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:13:18.922515 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:13:18.922563 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:13:18.923742 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:13:18.924642 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:13:18.931672 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:13:18.931848 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:13:18.933047 systemd-networkd[771]: eth0: DHCPv6 lease lost Sep 12 17:13:18.935107 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Sep 12 17:13:18.935172 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:13:18.936901 systemd-networkd[771]: eth1: DHCPv6 lease lost Sep 12 17:13:18.938281 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:13:18.939064 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:13:18.941078 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 17:13:18.941128 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:13:18.949046 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:13:18.951378 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:13:18.951456 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:13:18.955047 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:13:18.955119 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:13:18.956919 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:13:18.956986 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 17:13:18.958357 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:13:18.970030 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:13:18.970152 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:13:18.995145 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:13:18.995473 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:13:19.000064 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 17:13:19.000182 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:13:19.000911 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 17:13:19.000945 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:13:19.002023 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:13:19.002071 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:13:19.003664 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:13:19.003713 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:13:19.005641 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:13:19.005695 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:13:19.018650 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:13:19.020395 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:13:19.020510 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:13:19.022477 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 17:13:19.022572 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:13:19.027406 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:13:19.027464 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:13:19.028690 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Sep 12 17:13:19.028739 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:13:19.029937 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:13:19.030047 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:13:19.031696 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:13:19.039737 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:13:19.048964 systemd[1]: Switching root. Sep 12 17:13:19.091451 systemd-journald[236]: Journal stopped Sep 12 17:13:19.972668 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). Sep 12 17:13:19.972743 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:13:19.972760 kernel: SELinux: policy capability open_perms=1 Sep 12 17:13:19.972770 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:13:19.972779 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:13:19.972788 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:13:19.972798 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:13:19.972811 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:13:19.972835 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:13:19.972844 kernel: audit: type=1403 audit(1757697199.249:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:13:19.972857 systemd[1]: Successfully loaded SELinux policy in 36.808ms. Sep 12 17:13:19.972881 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.902ms. Sep 12 17:13:19.972899 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 12 17:13:19.972911 systemd[1]: Detected virtualization kvm. Sep 12 17:13:19.972922 systemd[1]: Detected architecture arm64. Sep 12 17:13:19.972933 systemd[1]: Detected first boot. Sep 12 17:13:19.972944 systemd[1]: Hostname set to <ci-4081-3-6-e-c5bf4513f4>. Sep 12 17:13:19.972954 systemd[1]: Initializing machine ID from VM UUID. Sep 12 17:13:19.972965 zram_generator::config[1046]: No configuration found. Sep 12 17:13:19.972978 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:13:19.972988 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:13:19.972998 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:13:19.973008 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:13:19.973019 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:13:19.973029 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:13:19.973040 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:13:19.973050 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:13:19.973062 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 17:13:19.973073 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:13:19.973084 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:13:19.973094 systemd[1]: Created slice user.slice - User and Session Slice.
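The long parenthesized string after "systemd 255 running in system mode" above encodes compile-time options: a leading + means the feature is built in, - means it was compiled out, and key=value entries are defaults. A sketch that splits such a string apart (abridged to a subset of the flags logged above):

    features = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT "
                "-GNUTLS +OPENSSL -ACL +BLKID +CURL -FIDO2 +TPM2 -SYSVINIT "
                "default-hierarchy=unified")  # abridged from the log line above

    enabled = {t[1:] for t in features.split() if t.startswith("+")}
    disabled = {t[1:] for t in features.split() if t.startswith("-")}
    settings = dict(t.split("=", 1) for t in features.split()
                    if "=" in t and t[0] not in "+-")

    print("AppArmor built in?", "APPARMOR" in enabled)         # False on this image
    print("cgroup hierarchy:", settings["default-hierarchy"])  # unified
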
Sep 12 17:13:19.973105 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:13:19.973116 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:13:19.973127 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:13:19.973137 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:13:19.973148 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 17:13:19.973159 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:13:19.973173 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 12 17:13:19.973184 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:13:19.973194 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:13:19.973205 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:13:19.973215 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:13:19.973228 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:13:19.973238 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:13:19.973253 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:13:19.973263 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:13:19.973274 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:13:19.973284 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:13:19.973295 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:13:19.973306 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:13:19.973316 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:13:19.973335 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:13:19.973349 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:13:19.973359 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 17:13:19.973374 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:13:19.973384 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:13:19.973394 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:13:19.973404 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:13:19.973416 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 17:13:19.973427 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:13:19.973440 systemd[1]: Reached target machines.target - Containers. Sep 12 17:13:19.973450 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 17:13:19.973460 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:19.973471 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Sep 12 17:13:19.973481 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:13:19.973492 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:19.973506 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:13:19.973518 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:13:19.973529 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:13:19.973540 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:13:19.973551 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:13:19.973563 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 17:13:19.973574 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:13:19.973585 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:13:19.973597 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:13:19.973608 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:13:19.973618 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:13:19.973629 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:13:19.973640 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:13:19.973652 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:13:19.973662 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:13:19.973672 systemd[1]: Stopped verity-setup.service. Sep 12 17:13:19.973685 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:13:19.973695 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:13:19.973706 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:13:19.973716 kernel: fuse: init (API version 7.39) Sep 12 17:13:19.973726 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:13:19.973738 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 17:13:19.973748 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:13:19.973759 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:13:19.973769 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:13:19.973779 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 17:13:19.973790 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:19.973800 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:19.975832 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:13:19.975890 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:13:19.975911 kernel: loop: module loaded Sep 12 17:13:19.975958 systemd-journald[1108]: Collecting audit messages is disabled. Sep 12 17:13:19.975987 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 17:13:19.975999 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 17:13:19.976012 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Sep 12 17:13:19.976026 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:13:19.976038 systemd-journald[1108]: Journal started Sep 12 17:13:19.976060 systemd-journald[1108]: Runtime Journal (/run/log/journal/b9ce13b319cc4401bb978432e42588a6) is 8.0M, max 76.6M, 68.6M free. Sep 12 17:13:19.733602 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:13:19.757751 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 17:13:19.758549 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:13:19.978197 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:13:19.980782 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:13:19.981704 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:13:19.984140 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:13:19.996839 kernel: ACPI: bus type drm_connector registered Sep 12 17:13:19.998038 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:13:19.998205 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:13:20.005197 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:13:20.014050 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:13:20.019983 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:13:20.020680 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:13:20.020716 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:13:20.025277 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Sep 12 17:13:20.038570 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:13:20.043415 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:13:20.044215 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:20.047590 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:13:20.052422 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 17:13:20.054009 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:13:20.058081 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:13:20.059967 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:13:20.061189 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:13:20.065209 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:13:20.070050 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:13:20.072626 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 17:13:20.073469 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Sep 12 17:13:20.076282 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:13:20.077599 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 17:13:20.099410 systemd-journald[1108]: Time spent on flushing to /var/log/journal/b9ce13b319cc4401bb978432e42588a6 is 59.311ms for 1123 entries. Sep 12 17:13:20.099410 systemd-journald[1108]: System Journal (/var/log/journal/b9ce13b319cc4401bb978432e42588a6) is 8.0M, max 584.8M, 576.8M free. Sep 12 17:13:20.185788 systemd-journald[1108]: Received client request to flush runtime journal. Sep 12 17:13:20.185915 kernel: loop0: detected capacity change from 0 to 211168 Sep 12 17:13:20.105249 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:13:20.107941 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:13:20.188877 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:13:20.120788 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Sep 12 17:13:20.157362 systemd-tmpfiles[1160]: ACLs are not supported, ignoring. Sep 12 17:13:20.157373 systemd-tmpfiles[1160]: ACLs are not supported, ignoring. Sep 12 17:13:20.167413 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:13:20.174185 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:13:20.178130 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:13:20.194923 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:13:20.200570 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:13:20.201967 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:13:20.206129 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Sep 12 17:13:20.219155 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Sep 12 17:13:20.230586 udevadm[1180]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Sep 12 17:13:20.235020 kernel: loop1: detected capacity change from 0 to 114328 Sep 12 17:13:20.253781 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:13:20.263172 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:13:20.264971 kernel: loop2: detected capacity change from 0 to 8 Sep 12 17:13:20.280926 systemd-tmpfiles[1183]: ACLs are not supported, ignoring. Sep 12 17:13:20.280946 systemd-tmpfiles[1183]: ACLs are not supported, ignoring. Sep 12 17:13:20.285980 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:13:20.291856 kernel: loop3: detected capacity change from 0 to 114432 Sep 12 17:13:20.325861 kernel: loop4: detected capacity change from 0 to 211168 Sep 12 17:13:20.354847 kernel: loop5: detected capacity change from 0 to 114328 Sep 12 17:13:20.370853 kernel: loop6: detected capacity change from 0 to 8 Sep 12 17:13:20.374850 kernel: loop7: detected capacity change from 0 to 114432 Sep 12 17:13:20.399107 (sd-merge)[1188]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Sep 12 17:13:20.399919 (sd-merge)[1188]: Merged extensions into '/usr'. 
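The journal flush line above, 59.311 ms for 1123 entries, works out to roughly 53 microseconds per entry. The arithmetic, for the record:

    flush_ms = 59.311   # time journald spent flushing, from the log above
    entries = 1123      # entries flushed in that time

    per_entry_us = flush_ms * 1000 / entries
    rate = entries / (flush_ms / 1000)
    print(f"{per_entry_us:.1f} us/entry, ~{rate:,.0f} entries/s")
    # -> 52.8 us/entry, ~18,934 entries/s
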
Sep 12 17:13:20.407163 systemd[1]: Reloading requested from client PID 1159 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:13:20.407186 systemd[1]: Reloading... Sep 12 17:13:20.518940 zram_generator::config[1211]: No configuration found. Sep 12 17:13:20.661889 ldconfig[1154]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:13:20.718053 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 17:13:20.767359 systemd[1]: Reloading finished in 359 ms. Sep 12 17:13:20.799516 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 17:13:20.801394 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:13:20.812093 systemd[1]: Starting ensure-sysext.service... Sep 12 17:13:20.814886 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:13:20.823134 systemd[1]: Reloading requested from client PID 1252 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:13:20.823297 systemd[1]: Reloading... Sep 12 17:13:20.865133 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:13:20.865418 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:13:20.866076 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 17:13:20.866313 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Sep 12 17:13:20.866377 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Sep 12 17:13:20.870660 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:13:20.870675 systemd-tmpfiles[1253]: Skipping /boot Sep 12 17:13:20.882771 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:13:20.882786 systemd-tmpfiles[1253]: Skipping /boot Sep 12 17:13:20.892856 zram_generator::config[1276]: No configuration found. Sep 12 17:13:21.015931 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 17:13:21.062902 systemd[1]: Reloading finished in 239 ms. Sep 12 17:13:21.083902 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:13:21.090695 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:13:21.111372 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 17:13:21.119176 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:13:21.135217 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:13:21.141777 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:13:21.151039 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:13:21.163057 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
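The systemd-tmpfiles warnings during the reload above ("Duplicate line for path \"/root\", ignoring") mean two tmpfiles.d fragments declare the same path; the first definition wins and later ones are dropped. A sketch of that first-wins duplicate check; the sample entries are modeled on, not copied from, the fragments named in the log:

    # Each tmpfiles.d line is: Type Path Mode User Group Age Argument
    lines = [
        ("provision.conf", "d /root 0700 root root -"),
        ("systemd.conf", "d /var/lib/systemd 0755 root root -"),
        ("systemd-flatcar.conf", "d /root 0755 root root -"),  # duplicate path
    ]

    seen = {}
    for source, line in lines:
        path = line.split()[1]
        if path in seen:
            print(f"{source}: Duplicate line for path \"{path}\", ignoring "
                  f"(first defined in {seen[path]})")
        else:
            seen[path] = source
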
Sep 12 17:13:21.170206 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:21.173570 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:21.180112 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:13:21.184200 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:13:21.187007 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:21.187993 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:13:21.188769 augenrules[1340]: No rules Sep 12 17:13:21.193602 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:21.197717 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:21.197945 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:21.208974 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:21.218188 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:21.219095 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:21.224208 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:13:21.228564 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 17:13:21.231174 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:13:21.233285 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:13:21.233868 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:13:21.236506 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:13:21.236684 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:13:21.238778 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:21.239249 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:21.252474 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:13:21.252668 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:13:21.256661 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:21.256996 systemd-udevd[1335]: Using default interface naming scheme 'v255'. Sep 12 17:13:21.264438 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:21.275262 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:13:21.280159 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:13:21.285423 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:13:21.287673 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:21.289035 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Sep 12 17:13:21.291699 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:13:21.295783 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:21.297145 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:21.301538 systemd[1]: Finished ensure-sysext.service. Sep 12 17:13:21.302311 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:13:21.310965 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:13:21.326020 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:13:21.334381 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 17:13:21.335731 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:13:21.354365 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:13:21.354554 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:13:21.366105 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:13:21.366310 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:13:21.367528 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:13:21.367875 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:13:21.371606 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:13:21.371684 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:13:21.450228 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 17:13:21.481089 systemd-networkd[1376]: lo: Link UP Sep 12 17:13:21.481097 systemd-networkd[1376]: lo: Gained carrier Sep 12 17:13:21.482374 systemd-networkd[1376]: Enumeration completed Sep 12 17:13:21.482508 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:13:21.485795 systemd-networkd[1376]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:21.485802 systemd-networkd[1376]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:21.486786 systemd-networkd[1376]: eth1: Link UP Sep 12 17:13:21.486908 systemd-networkd[1376]: eth1: Gained carrier Sep 12 17:13:21.486964 systemd-networkd[1376]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:21.495236 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:13:21.508913 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 17:13:21.509291 systemd-resolved[1333]: Positive Trust Anchors: Sep 12 17:13:21.509302 systemd-resolved[1333]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:13:21.509378 systemd-resolved[1333]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:13:21.510263 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:13:21.513648 systemd-networkd[1376]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:21.520034 systemd-networkd[1376]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:13:21.520435 systemd-resolved[1333]: Using system hostname 'ci-4081-3-6-e-c5bf4513f4'. Sep 12 17:13:21.522158 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:21.523837 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:13:21.524898 systemd[1]: Reached target network.target - Network. Sep 12 17:13:21.525738 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:13:21.588905 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:13:21.619444 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 17:13:21.620128 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:21.629901 systemd-networkd[1376]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:21.632090 systemd-networkd[1376]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:21.632994 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:21.633306 systemd-networkd[1376]: eth0: Link UP Sep 12 17:13:21.633393 systemd-networkd[1376]: eth0: Gained carrier Sep 12 17:13:21.633458 systemd-networkd[1376]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:21.636299 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:21.638649 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:13:21.638952 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:21.642628 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:13:21.644563 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:21.644607 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
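The DS record above under "Positive Trust Anchors" is the IANA root-zone KSK (key tag 20326), the built-in DNSSEC trust anchor of systemd-resolved; the long list of negative anchors names private and reverse zones that are exempted from DNSSEC validation. The live resolver state can be inspected with resolvectl (the standard tool shipped with this systemd; commands are a sketch, not from the log):

    resolvectl status            # per-link DNS servers, domains, DNSSEC mode
    resolvectl query example.com # resolve through systemd-resolved directly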
Sep 12 17:13:21.649988 systemd-networkd[1376]: eth0: DHCPv4 address 188.245.115.118/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:13:21.652033 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:21.652216 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:21.652922 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:21.671482 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:13:21.675488 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1393) Sep 12 17:13:21.672542 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:13:21.679450 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:13:21.681929 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:13:21.689280 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:13:21.689770 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:13:21.737861 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Sep 12 17:13:21.737997 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 17:13:21.738029 kernel: [drm] features: -context_init Sep 12 17:13:21.749117 kernel: [drm] number of scanouts: 1 Sep 12 17:13:21.749254 kernel: [drm] number of cap sets: 0 Sep 12 17:13:21.755319 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Sep 12 17:13:21.755445 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 17:13:21.758489 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:13:21.771959 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 17:13:21.791207 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:13:21.793484 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:13:21.801143 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 17:13:21.809260 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:13:21.813051 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:13:21.826929 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:13:21.888908 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:13:21.895871 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 12 17:13:21.904294 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 12 17:13:21.918693 lvm[1439]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:13:21.955881 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 12 17:13:21.957107 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:13:21.957990 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:13:21.960066 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
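Both leases above are single-address /32 assignments (10.0.0.3/32 on eth1, 188.245.115.118/32 on eth0), with the eth0 gateway 172.31.1.1 outside the assigned prefix, a Hetzner convention; networkd copes by installing the gateway as an on-link route. A static equivalent, as a sketch (addresses copied from the log, file name hypothetical):

    # /etc/systemd/network/10-eth0-static.network (illustrative only)
    [Match]
    Name=eth0

    [Network]
    Address=188.245.115.118/32

    [Route]
    Gateway=172.31.1.1
    GatewayOnLink=yes

Matching on a stable property such as MACAddress= instead of Name= would also avoid the repeated "potentially unpredictable interface name" warning above.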
Sep 12 17:13:21.961169 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:13:21.962454 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:13:21.963496 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:13:21.964452 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:13:21.965273 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:13:21.965331 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:13:21.965880 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:13:21.968938 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:13:21.972828 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:13:21.983170 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:13:21.986565 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 12 17:13:21.987983 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:13:21.988859 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:13:21.989506 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:13:21.990346 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:13:21.990379 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:13:21.998213 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:13:22.005084 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:13:22.007679 lvm[1443]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:13:22.008161 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:13:22.014834 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:13:22.021109 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:13:22.023089 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:13:22.026736 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:13:22.032086 jq[1447]: false Sep 12 17:13:22.041002 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:13:22.047050 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 12 17:13:22.050406 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:13:22.062043 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:13:22.069073 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:13:22.070560 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:13:22.071929 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
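dbus.socket, docker.socket and sshd.socket above are socket units: systemd opens the listening socket itself and starts the matching service only when the first client connects. The mapping can be listed at runtime (standard systemctl, a sketch):

    systemctl list-sockets   # LISTEN address, socket unit, and the service it activates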
Sep 12 17:13:22.074008 coreos-metadata[1445]: Sep 12 17:13:22.073 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 17:13:22.075086 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:13:22.078709 coreos-metadata[1445]: Sep 12 17:13:22.078 INFO Fetch successful Sep 12 17:13:22.078709 coreos-metadata[1445]: Sep 12 17:13:22.078 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 17:13:22.082115 coreos-metadata[1445]: Sep 12 17:13:22.080 INFO Fetch successful Sep 12 17:13:22.080992 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:13:22.083897 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 12 17:13:22.100377 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:13:22.100588 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:13:22.104371 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:13:22.104574 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:13:22.130457 extend-filesystems[1448]: Found loop4 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found loop5 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found loop6 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found loop7 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda1 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda2 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda3 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found usr Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda4 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda6 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda7 Sep 12 17:13:22.130457 extend-filesystems[1448]: Found sda9 Sep 12 17:13:22.130457 extend-filesystems[1448]: Checking size of /dev/sda9 Sep 12 17:13:22.136030 dbus-daemon[1446]: [system] SELinux support is enabled Sep 12 17:13:22.156306 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:13:22.174361 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:13:22.174396 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:13:22.178083 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:13:22.178119 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:13:22.187686 jq[1459]: true Sep 12 17:13:22.192121 extend-filesystems[1448]: Resized partition /dev/sda9 Sep 12 17:13:22.196008 tar[1465]: linux-arm64/LICENSE Sep 12 17:13:22.196008 tar[1465]: linux-arm64/helm Sep 12 17:13:22.196394 (ntainerd)[1476]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:13:22.206101 extend-filesystems[1491]: resize2fs 1.47.1 (20-May-2024) Sep 12 17:13:22.207874 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:13:22.208060 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
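The coreos-metadata fetches above use Hetzner's link-local metadata service; the same documents can be retrieved by hand from the URLs shown in the log:

    curl -s http://169.254.169.254/hetzner/v1/metadata
    curl -s http://169.254.169.254/hetzner/v1/metadata/private-networks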
Sep 12 17:13:22.217056 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 17:13:22.234952 update_engine[1457]: I20250912 17:13:22.232686 1457 main.cc:92] Flatcar Update Engine starting Sep 12 17:13:22.238095 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:13:22.239649 update_engine[1457]: I20250912 17:13:22.239575 1457 update_check_scheduler.cc:74] Next update check in 2m11s Sep 12 17:13:22.248466 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:13:22.271171 jq[1492]: true Sep 12 17:13:22.278975 systemd-logind[1456]: New seat seat0. Sep 12 17:13:22.283199 systemd-logind[1456]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 17:13:22.283901 systemd-logind[1456]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Sep 12 17:13:22.284288 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:13:22.301049 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:13:22.302165 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:13:22.417873 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1385) Sep 12 17:13:22.428018 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 17:13:22.455843 containerd[1476]: time="2025-09-12T17:13:22.453898600Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 12 17:13:22.456184 bash[1519]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:13:22.456259 extend-filesystems[1491]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:13:22.456259 extend-filesystems[1491]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 17:13:22.456259 extend-filesystems[1491]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 17:13:22.458863 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:13:22.459170 extend-filesystems[1448]: Resized filesystem in /dev/sda9 Sep 12 17:13:22.459170 extend-filesystems[1448]: Found sr0 Sep 12 17:13:22.461173 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:13:22.465867 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:13:22.494205 systemd[1]: Starting sshkeys.service... Sep 12 17:13:22.523915 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:13:22.533354 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:13:22.566639 containerd[1476]: time="2025-09-12T17:13:22.566567520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.571497 containerd[1476]: time="2025-09-12T17:13:22.571431200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:22.571497 containerd[1476]: time="2025-09-12T17:13:22.571487720Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." 
type=io.containerd.event.v1 Sep 12 17:13:22.571649 containerd[1476]: time="2025-09-12T17:13:22.571520040Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 12 17:13:22.571743 containerd[1476]: time="2025-09-12T17:13:22.571719120Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 12 17:13:22.571782 containerd[1476]: time="2025-09-12T17:13:22.571746120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.571860 containerd[1476]: time="2025-09-12T17:13:22.571836600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:22.571888 containerd[1476]: time="2025-09-12T17:13:22.571858720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572095 containerd[1476]: time="2025-09-12T17:13:22.572068600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572095 containerd[1476]: time="2025-09-12T17:13:22.572090560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572151 containerd[1476]: time="2025-09-12T17:13:22.572107680Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572151 containerd[1476]: time="2025-09-12T17:13:22.572118480Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572216 containerd[1476]: time="2025-09-12T17:13:22.572195680Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572465 containerd[1476]: time="2025-09-12T17:13:22.572439160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572595 containerd[1476]: time="2025-09-12T17:13:22.572571840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:22.572628 containerd[1476]: time="2025-09-12T17:13:22.572594920Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 12 17:13:22.572704 containerd[1476]: time="2025-09-12T17:13:22.572682200Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Sep 12 17:13:22.572752 containerd[1476]: time="2025-09-12T17:13:22.572735600Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:13:22.606297 coreos-metadata[1528]: Sep 12 17:13:22.606 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 17:13:22.609162 coreos-metadata[1528]: Sep 12 17:13:22.607 INFO Fetch successful Sep 12 17:13:22.618561 locksmithd[1497]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:13:22.636272 unknown[1528]: wrote ssh authorized keys file for user: core Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648277480Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648391240Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648414400Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648433000Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648450560Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.648675280Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649166280Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649327240Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649348240Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649362480Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649387040Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649403640Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649416960Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649453 containerd[1476]: time="2025-09-12T17:13:22.649432240Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649454520Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649469040Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." 
type=io.containerd.service.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649482360Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649496120Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649533640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649549400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649562600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649577520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649789 containerd[1476]: time="2025-09-12T17:13:22.649590640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649969 containerd[1476]: time="2025-09-12T17:13:22.649883000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649969 containerd[1476]: time="2025-09-12T17:13:22.649903120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649969 containerd[1476]: time="2025-09-12T17:13:22.649919240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649969 containerd[1476]: time="2025-09-12T17:13:22.649933240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.649969 containerd[1476]: time="2025-09-12T17:13:22.649962200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.650053 containerd[1476]: time="2025-09-12T17:13:22.649978920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.650053 containerd[1476]: time="2025-09-12T17:13:22.649993320Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.650053 containerd[1476]: time="2025-09-12T17:13:22.650008040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.652918 containerd[1476]: time="2025-09-12T17:13:22.651981240Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 12 17:13:22.652918 containerd[1476]: time="2025-09-12T17:13:22.652029680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.652918 containerd[1476]: time="2025-09-12T17:13:22.652073920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.652918 containerd[1476]: time="2025-09-12T17:13:22.652086640Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." 
type=io.containerd.internal.v1 Sep 12 17:13:22.652918 containerd[1476]: time="2025-09-12T17:13:22.652906520Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.652956720Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.652970600Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.652993080Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.653016160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.653035400Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.653048200Z" level=info msg="NRI interface is disabled by configuration." Sep 12 17:13:22.653375 containerd[1476]: time="2025-09-12T17:13:22.653062600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Sep 12 17:13:22.654857 containerd[1476]: time="2025-09-12T17:13:22.653548200Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 12 17:13:22.654857 containerd[1476]: time="2025-09-12T17:13:22.653896320Z" level=info msg="Connect containerd service" Sep 12 17:13:22.654857 containerd[1476]: time="2025-09-12T17:13:22.653962280Z" level=info msg="using legacy CRI server" Sep 12 17:13:22.654857 containerd[1476]: time="2025-09-12T17:13:22.653974800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:13:22.656797 containerd[1476]: time="2025-09-12T17:13:22.655942480Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 12 17:13:22.657191 containerd[1476]: time="2025-09-12T17:13:22.657084000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657500280Z" level=info msg="Start subscribing containerd event" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657641160Z" level=info msg="Start recovering state" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657759320Z" level=info msg="Start event monitor" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657783480Z" level=info msg="Start snapshots syncer" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657878600Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.657895680Z" level=info msg="Start streaming server" Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.658087920Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:13:22.659864 containerd[1476]: time="2025-09-12T17:13:22.658135440Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:13:22.660090 containerd[1476]: time="2025-09-12T17:13:22.659995720Z" level=info msg="containerd successfully booted in 0.208056s" Sep 12 17:13:22.662650 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:13:22.684350 update-ssh-keys[1537]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:13:22.684727 sshd_keygen[1478]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:13:22.686257 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:13:22.695933 systemd[1]: Finished sshkeys.service. Sep 12 17:13:22.717838 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:13:22.728372 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:13:22.737739 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:13:22.738031 systemd[1]: Finished issuegen.service - Generate /run/issue. 
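Two details of the containerd boot above are worth noting. The cni load error is expected at this stage: no CNI plugin has written a config into /etc/cni/net.d yet, and the CRI plugin retries once one appears. And the config dump shows the runc handler running with SystemdCgroup:true, which in /etc/containerd/config.toml corresponds to the standard stanza (a sketch of the usual file, not the literal one from this host):

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"
      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
        SystemdCgroup = true

Keeping this in sync with the kubelet's cgroup driver (see the kubelet note below) avoids the classic cgroupfs/systemd driver mismatch.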
Sep 12 17:13:22.746740 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:13:22.760869 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:13:22.769882 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:13:22.779204 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 12 17:13:22.780207 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:13:22.932667 tar[1465]: linux-arm64/README.md Sep 12 17:13:22.944541 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:13:23.462159 systemd-networkd[1376]: eth1: Gained IPv6LL Sep 12 17:13:23.463021 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:23.464227 systemd-networkd[1376]: eth0: Gained IPv6LL Sep 12 17:13:23.465736 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:23.468130 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:13:23.470623 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:13:23.479293 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:23.485614 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:13:23.525878 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:13:24.325170 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:24.325443 (kubelet)[1577]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:24.328131 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:13:24.332928 systemd[1]: Startup finished in 847ms (kernel) + 5.559s (initrd) + 5.119s (userspace) = 11.526s. Sep 12 17:13:24.883962 kubelet[1577]: E0912 17:13:24.883873 1577 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:24.886319 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:24.886480 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:34.948739 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:13:34.963272 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:35.089008 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:35.102486 (kubelet)[1596]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:35.154730 kubelet[1596]: E0912 17:13:35.154616 1596 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:35.158561 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:35.158900 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
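The kubelet failure above (and its repeats below) is the normal pre-bootstrap state: /var/lib/kubelet/config.yaml is only written when kubeadm init or kubeadm join runs, so until then the unit exits and systemd reschedules it. For reference, the generated file is a KubeletConfiguration object; a minimal hand-written stand-in would look like this (illustrative only, not this host's eventual config):

    # /var/lib/kubelet/config.yaml (normally produced by kubeadm)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd    # matches SystemdCgroup = true in containerd above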
Sep 12 17:13:37.412567 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:13:37.419339 systemd[1]: Started sshd@0-188.245.115.118:22-139.178.89.65:47806.service - OpenSSH per-connection server daemon (139.178.89.65:47806). Sep 12 17:13:38.396090 sshd[1604]: Accepted publickey for core from 139.178.89.65 port 47806 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:38.399502 sshd[1604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:38.410500 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:13:38.421343 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:13:38.426608 systemd-logind[1456]: New session 1 of user core. Sep 12 17:13:38.438772 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:13:38.446386 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:13:38.460042 (systemd)[1608]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:13:38.576730 systemd[1608]: Queued start job for default target default.target. Sep 12 17:13:38.585689 systemd[1608]: Created slice app.slice - User Application Slice. Sep 12 17:13:38.586073 systemd[1608]: Reached target paths.target - Paths. Sep 12 17:13:38.586109 systemd[1608]: Reached target timers.target - Timers. Sep 12 17:13:38.588657 systemd[1608]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:13:38.607110 systemd[1608]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:13:38.607295 systemd[1608]: Reached target sockets.target - Sockets. Sep 12 17:13:38.607434 systemd[1608]: Reached target basic.target - Basic System. Sep 12 17:13:38.607494 systemd[1608]: Reached target default.target - Main User Target. Sep 12 17:13:38.607531 systemd[1608]: Startup finished in 139ms. Sep 12 17:13:38.607664 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:13:38.615250 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:13:39.322184 systemd[1]: Started sshd@1-188.245.115.118:22-139.178.89.65:47810.service - OpenSSH per-connection server daemon (139.178.89.65:47810). Sep 12 17:13:40.316466 sshd[1619]: Accepted publickey for core from 139.178.89.65 port 47810 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:40.319269 sshd[1619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:40.326751 systemd-logind[1456]: New session 2 of user core. Sep 12 17:13:40.339223 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:13:41.009715 sshd[1619]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:41.016179 systemd[1]: sshd@1-188.245.115.118:22-139.178.89.65:47810.service: Deactivated successfully. Sep 12 17:13:41.019003 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:13:41.021666 systemd-logind[1456]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:13:41.023287 systemd-logind[1456]: Removed session 2. Sep 12 17:13:41.186334 systemd[1]: Started sshd@2-188.245.115.118:22-139.178.89.65:41458.service - OpenSSH per-connection server daemon (139.178.89.65:41458). 
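Each sshd@N-<local>:22-<peer>:<port>.service above is one per-connection instance spawned from the socket-activated sshd, so open SSH connections show up as individual units (a sketch):

    systemctl list-units 'sshd@*'   # one unit per open SSH connection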
Sep 12 17:13:42.195169 sshd[1626]: Accepted publickey for core from 139.178.89.65 port 41458 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:42.197528 sshd[1626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:42.205084 systemd-logind[1456]: New session 3 of user core. Sep 12 17:13:42.210223 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:13:42.882091 sshd[1626]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:42.886417 systemd[1]: sshd@2-188.245.115.118:22-139.178.89.65:41458.service: Deactivated successfully. Sep 12 17:13:42.888446 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:13:42.889512 systemd-logind[1456]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:13:42.890843 systemd-logind[1456]: Removed session 3. Sep 12 17:13:43.061249 systemd[1]: Started sshd@3-188.245.115.118:22-139.178.89.65:41470.service - OpenSSH per-connection server daemon (139.178.89.65:41470). Sep 12 17:13:44.051298 sshd[1633]: Accepted publickey for core from 139.178.89.65 port 41470 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:44.053536 sshd[1633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:44.060777 systemd-logind[1456]: New session 4 of user core. Sep 12 17:13:44.073268 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:13:44.739623 sshd[1633]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:44.746038 systemd-logind[1456]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:13:44.746135 systemd[1]: sshd@3-188.245.115.118:22-139.178.89.65:41470.service: Deactivated successfully. Sep 12 17:13:44.748610 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:13:44.752062 systemd-logind[1456]: Removed session 4. Sep 12 17:13:44.918323 systemd[1]: Started sshd@4-188.245.115.118:22-139.178.89.65:41472.service - OpenSSH per-connection server daemon (139.178.89.65:41472). Sep 12 17:13:45.198612 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:13:45.210216 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:45.903441 sshd[1640]: Accepted publickey for core from 139.178.89.65 port 41472 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:45.905957 sshd[1640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:45.911855 systemd-logind[1456]: New session 5 of user core. Sep 12 17:13:45.922196 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:13:46.064911 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:46.077530 (kubelet)[1651]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:46.130636 kubelet[1651]: E0912 17:13:46.130551 1651 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:46.134044 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:46.134289 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
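The "restart counter is at N" messages are systemd's Restart= logic re-arming the failed kubelet; when debugging such a loop, the counter and the most recent failure are easy to pull out (standard tooling, a sketch):

    systemctl show kubelet --property=NRestarts
    journalctl -u kubelet -n 20 --no-pager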
Sep 12 17:13:46.439419 sudo[1658]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:13:46.439767 sudo[1658]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:46.457082 sudo[1658]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:46.618050 sshd[1640]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:46.623889 systemd[1]: sshd@4-188.245.115.118:22-139.178.89.65:41472.service: Deactivated successfully. Sep 12 17:13:46.627154 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:13:46.628459 systemd-logind[1456]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:13:46.629849 systemd-logind[1456]: Removed session 5. Sep 12 17:13:46.800333 systemd[1]: Started sshd@5-188.245.115.118:22-139.178.89.65:41478.service - OpenSSH per-connection server daemon (139.178.89.65:41478). Sep 12 17:13:47.787453 sshd[1663]: Accepted publickey for core from 139.178.89.65 port 41478 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:47.789948 sshd[1663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:47.795458 systemd-logind[1456]: New session 6 of user core. Sep 12 17:13:47.803227 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:13:48.315261 sudo[1667]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:13:48.315565 sudo[1667]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:48.320666 sudo[1667]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:48.326622 sudo[1666]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 12 17:13:48.327389 sudo[1666]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:48.343363 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 12 17:13:48.358936 auditctl[1670]: No rules Sep 12 17:13:48.359327 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:13:48.359517 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:48.366496 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 17:13:48.392117 augenrules[1688]: No rules Sep 12 17:13:48.393733 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:48.396256 sudo[1666]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:48.556590 sshd[1663]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:48.561638 systemd[1]: sshd@5-188.245.115.118:22-139.178.89.65:41478.service: Deactivated successfully. Sep 12 17:13:48.564194 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:13:48.567680 systemd-logind[1456]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:13:48.569140 systemd-logind[1456]: Removed session 6. Sep 12 17:13:48.742351 systemd[1]: Started sshd@6-188.245.115.118:22-139.178.89.65:41492.service - OpenSSH per-connection server daemon (139.178.89.65:41492). Sep 12 17:13:49.727760 sshd[1696]: Accepted publickey for core from 139.178.89.65 port 41492 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:49.729843 sshd[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:49.734883 systemd-logind[1456]: New session 7 of user core. 
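The sudo sequence above deletes the shipped rule files from /etc/audit/rules.d/ and restarts audit-rules.service, after which both auditctl and augenrules report an empty ruleset. The same state can be checked or rebuilt by hand (standard audit userspace, a sketch):

    auditctl -l         # prints "No rules" when the kernel ruleset is empty
    augenrules --load   # recompile /etc/audit/rules.d/*.rules and load them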
Sep 12 17:13:49.743165 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:13:50.254442 sudo[1699]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:13:50.254744 sudo[1699]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:50.593428 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:13:50.593448 (dockerd)[1714]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:13:50.869173 dockerd[1714]: time="2025-09-12T17:13:50.868621280Z" level=info msg="Starting up" Sep 12 17:13:52.508745 dockerd[1714]: time="2025-09-12T17:13:52.508631960Z" level=info msg="Loading containers: start." Sep 12 17:13:53.487883 kernel: Initializing XFRM netlink socket Sep 12 17:13:53.515561 systemd-timesyncd[1382]: Network configuration changed, trying to establish connection. Sep 12 17:13:53.535809 systemd-timesyncd[1382]: Contacted time server 185.13.148.71:123 (2.flatcar.pool.ntp.org). Sep 12 17:13:53.536374 systemd-timesyncd[1382]: Initial clock synchronization to Fri 2025-09-12 17:13:53.278284 UTC. Sep 12 17:13:53.582092 systemd-networkd[1376]: docker0: Link UP Sep 12 17:13:53.661351 dockerd[1714]: time="2025-09-12T17:13:53.661258920Z" level=info msg="Loading containers: done." Sep 12 17:13:53.683077 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1449615464-merged.mount: Deactivated successfully. Sep 12 17:13:53.699481 dockerd[1714]: time="2025-09-12T17:13:53.699180200Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:13:53.699481 dockerd[1714]: time="2025-09-12T17:13:53.699526520Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 12 17:13:53.699901 dockerd[1714]: time="2025-09-12T17:13:53.699785240Z" level=info msg="Daemon has completed initialization" Sep 12 17:13:53.803057 dockerd[1714]: time="2025-09-12T17:13:53.801510440Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:13:53.802546 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:13:55.155587 containerd[1476]: time="2025-09-12T17:13:55.155151072Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 12 17:13:55.798718 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3236727151.mount: Deactivated successfully. Sep 12 17:13:56.198006 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:13:56.204074 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:56.338300 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
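Docker's "Not using native diff" warning above is expected on this kernel: with CONFIG_OVERLAY_FS_REDIRECT_DIR enabled, the daemon falls back to its own diff path for overlay2, which mainly affects image-build performance. The active storage driver can be confirmed with the standard CLI (a sketch):

    docker info --format '{{.Driver}}'   # expected output: overlay2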
Sep 12 17:13:56.349002 (kubelet)[1915]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:56.427587 kubelet[1915]: E0912 17:13:56.427507 1915 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:56.432201 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:56.432466 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:56.792619 containerd[1476]: time="2025-09-12T17:13:56.792558061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:56.793842 containerd[1476]: time="2025-09-12T17:13:56.793756114Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=27390326" Sep 12 17:13:56.794843 containerd[1476]: time="2025-09-12T17:13:56.794530770Z" level=info msg="ImageCreate event name:\"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:56.798831 containerd[1476]: time="2025-09-12T17:13:56.798323558Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:56.799906 containerd[1476]: time="2025-09-12T17:13:56.799846299Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"27386827\" in 1.644621317s" Sep 12 17:13:56.800001 containerd[1476]: time="2025-09-12T17:13:56.799905915Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\"" Sep 12 17:13:56.801522 containerd[1476]: time="2025-09-12T17:13:56.801486477Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 12 17:13:57.978670 containerd[1476]: time="2025-09-12T17:13:57.978556403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:57.980656 containerd[1476]: time="2025-09-12T17:13:57.980588248Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=23547937" Sep 12 17:13:57.984037 containerd[1476]: time="2025-09-12T17:13:57.982883145Z" level=info msg="ImageCreate event name:\"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:57.989861 containerd[1476]: time="2025-09-12T17:13:57.987455365Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 
17:13:57.991448 containerd[1476]: time="2025-09-12T17:13:57.991396983Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"25135832\" in 1.189871603s" Sep 12 17:13:57.991607 containerd[1476]: time="2025-09-12T17:13:57.991590093Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\"" Sep 12 17:13:57.993441 containerd[1476]: time="2025-09-12T17:13:57.993397478Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 12 17:13:59.055201 containerd[1476]: time="2025-09-12T17:13:59.055038636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:59.057321 containerd[1476]: time="2025-09-12T17:13:59.057253312Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=18295997" Sep 12 17:13:59.058828 containerd[1476]: time="2025-09-12T17:13:59.058736373Z" level=info msg="ImageCreate event name:\"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:59.063908 containerd[1476]: time="2025-09-12T17:13:59.062636162Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:13:59.064536 containerd[1476]: time="2025-09-12T17:13:59.064486270Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"19883910\" in 1.071038305s" Sep 12 17:13:59.064649 containerd[1476]: time="2025-09-12T17:13:59.064630453Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\"" Sep 12 17:13:59.065506 containerd[1476]: time="2025-09-12T17:13:59.065300029Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 12 17:14:00.094385 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount635392097.mount: Deactivated successfully. 
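The kubelet exits with status 1 on each scheduled restart above because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-managed node that file is only written during bootstrap, so the crash loop is expected until then. A Go sketch of the failure mode (path and message shape taken from the log; the kubelet's real config loader does considerably more than this):

    package main

    import (
    	"fmt"
    	"os"
    )

    func main() {
    	const path = "/var/lib/kubelet/config.yaml"
    	if _, err := os.ReadFile(path); err != nil {
    		// Mirrors the logged error chain: "open ...: no such file or directory".
    		fmt.Fprintf(os.Stderr, "failed to load kubelet config file, path: %s, error: %v\n", path, err)
    		os.Exit(1) // systemd records status=1/FAILURE and schedules the next restart
    	}
    }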
Sep 12 17:14:00.513872 containerd[1476]: time="2025-09-12T17:14:00.512172300Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:00.518196 containerd[1476]: time="2025-09-12T17:14:00.518119454Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=28240132" Sep 12 17:14:00.519810 containerd[1476]: time="2025-09-12T17:14:00.519753537Z" level=info msg="ImageCreate event name:\"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:00.523424 containerd[1476]: time="2025-09-12T17:14:00.523371660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:00.524181 containerd[1476]: time="2025-09-12T17:14:00.524126610Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"28239125\" in 1.458445734s" Sep 12 17:14:00.524181 containerd[1476]: time="2025-09-12T17:14:00.524178767Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\"" Sep 12 17:14:00.525755 containerd[1476]: time="2025-09-12T17:14:00.525716303Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 12 17:14:01.118138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3088856913.mount: Deactivated successfully. 
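The pull records pair byte counts with wall-clock durations, so effective registry throughput falls out directly; the kube-proxy image above, 28239125 bytes in 1.458445734s, works out to roughly 18.5 MiB/s. A quick check of that arithmetic:

    package main

    import "fmt"

    func main() {
    	// Figures from the kube-proxy pull above: image size in bytes, duration in seconds.
    	const bytes = 28239125.0
    	const secs = 1.458445734
    	fmt.Printf("%.1f MiB/s\n", bytes/secs/(1<<20)) // ≈ 18.5 MiB/s
    }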
Sep 12 17:14:01.855842 containerd[1476]: time="2025-09-12T17:14:01.854432072Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:01.856353 containerd[1476]: time="2025-09-12T17:14:01.856278192Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Sep 12 17:14:01.856621 containerd[1476]: time="2025-09-12T17:14:01.856593166Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:01.860647 containerd[1476]: time="2025-09-12T17:14:01.860579052Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:01.862380 containerd[1476]: time="2025-09-12T17:14:01.862316402Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.336543067s" Sep 12 17:14:01.862380 containerd[1476]: time="2025-09-12T17:14:01.862372801Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Sep 12 17:14:01.863386 containerd[1476]: time="2025-09-12T17:14:01.863148446Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:14:02.402224 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount893324743.mount: Deactivated successfully. 
Sep 12 17:14:02.413869 containerd[1476]: time="2025-09-12T17:14:02.413586265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:02.416092 containerd[1476]: time="2025-09-12T17:14:02.416009404Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Sep 12 17:14:02.417860 containerd[1476]: time="2025-09-12T17:14:02.417126330Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:02.421063 containerd[1476]: time="2025-09-12T17:14:02.421012609Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:02.422171 containerd[1476]: time="2025-09-12T17:14:02.422130880Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 558.933715ms" Sep 12 17:14:02.422323 containerd[1476]: time="2025-09-12T17:14:02.422302365Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 12 17:14:02.423000 containerd[1476]: time="2025-09-12T17:14:02.422980433Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 12 17:14:02.969546 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3300535566.mount: Deactivated successfully. Sep 12 17:14:04.569471 containerd[1476]: time="2025-09-12T17:14:04.569384644Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:04.571793 containerd[1476]: time="2025-09-12T17:14:04.571251920Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465913" Sep 12 17:14:04.572993 containerd[1476]: time="2025-09-12T17:14:04.572948412Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:04.582786 containerd[1476]: time="2025-09-12T17:14:04.582704333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:04.585857 containerd[1476]: time="2025-09-12T17:14:04.585762295Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.162656531s" Sep 12 17:14:04.586396 containerd[1476]: time="2025-09-12T17:14:04.586097516Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Sep 12 17:14:06.448474 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Sep 12 17:14:06.457355 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:06.614288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:06.616997 (kubelet)[2082]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:14:06.667843 kubelet[2082]: E0912 17:14:06.667477 2082 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:14:06.670681 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:14:06.670928 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:14:07.820102 update_engine[1457]: I20250912 17:14:07.819974 1457 update_attempter.cc:509] Updating boot flags... Sep 12 17:14:07.887659 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2098) Sep 12 17:14:07.974843 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2098) Sep 12 17:14:08.081832 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2098) Sep 12 17:14:09.591273 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:09.601071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:09.669369 systemd[1]: Reloading requested from client PID 2118 ('systemctl') (unit session-7.scope)... Sep 12 17:14:09.669408 systemd[1]: Reloading... Sep 12 17:14:09.826844 zram_generator::config[2158]: No configuration found. Sep 12 17:14:09.949236 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 17:14:10.026251 systemd[1]: Reloading finished in 353 ms. Sep 12 17:14:10.090107 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:10.094666 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:10.099488 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:14:10.099893 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:10.107508 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:10.285203 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:10.296077 (kubelet)[2208]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:14:10.347873 kubelet[2208]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:14:10.347873 kubelet[2208]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:14:10.347873 kubelet[2208]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:14:10.347873 kubelet[2208]: I0912 17:14:10.345985 2208 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:14:10.878772 kubelet[2208]: I0912 17:14:10.878670 2208 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 17:14:10.878772 kubelet[2208]: I0912 17:14:10.878739 2208 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:14:10.879192 kubelet[2208]: I0912 17:14:10.879141 2208 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 17:14:10.909056 kubelet[2208]: E0912 17:14:10.908996 2208 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://188.245.115.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 12 17:14:10.909645 kubelet[2208]: I0912 17:14:10.909339 2208 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:14:10.922562 kubelet[2208]: E0912 17:14:10.922507 2208 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 12 17:14:10.922562 kubelet[2208]: I0912 17:14:10.922553 2208 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 12 17:14:10.925130 kubelet[2208]: I0912 17:14:10.925097 2208 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:14:10.926795 kubelet[2208]: I0912 17:14:10.926713 2208 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:14:10.927063 kubelet[2208]: I0912 17:14:10.926773 2208 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-e-c5bf4513f4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:14:10.927204 kubelet[2208]: I0912 17:14:10.927120 2208 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:14:10.927204 kubelet[2208]: I0912 17:14:10.927131 2208 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 17:14:10.927428 kubelet[2208]: I0912 17:14:10.927384 2208 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:14:10.931345 kubelet[2208]: I0912 17:14:10.931281 2208 kubelet.go:480] "Attempting to sync node with API server" Sep 12 17:14:10.931452 kubelet[2208]: I0912 17:14:10.931424 2208 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:14:10.931501 kubelet[2208]: I0912 17:14:10.931462 2208 kubelet.go:386] "Adding apiserver pod source" Sep 12 17:14:10.931501 kubelet[2208]: I0912 17:14:10.931486 2208 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:14:10.936404 kubelet[2208]: E0912 17:14:10.936098 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://188.245.115.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-e-c5bf4513f4&limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 17:14:10.936663 kubelet[2208]: E0912 17:14:10.936620 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://188.245.115.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 17:14:10.936840 kubelet[2208]: I0912 17:14:10.936717 2208 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 12 17:14:10.940222 kubelet[2208]: I0912 17:14:10.940145 2208 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 17:14:10.940376 kubelet[2208]: W0912 17:14:10.940310 2208 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 17:14:10.944116 kubelet[2208]: I0912 17:14:10.944065 2208 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:14:10.944219 kubelet[2208]: I0912 17:14:10.944131 2208 server.go:1289] "Started kubelet" Sep 12 17:14:10.947050 kubelet[2208]: I0912 17:14:10.946715 2208 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:14:10.947797 kubelet[2208]: I0912 17:14:10.947714 2208 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:14:10.949863 kubelet[2208]: I0912 17:14:10.948353 2208 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:14:10.949863 kubelet[2208]: I0912 17:14:10.948398 2208 server.go:317] "Adding debug handlers to kubelet server" Sep 12 17:14:10.951185 kubelet[2208]: I0912 17:14:10.951153 2208 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:14:10.955146 kubelet[2208]: E0912 17:14:10.951452 2208 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://188.245.115.118:6443/api/v1/namespaces/default/events\": dial tcp 188.245.115.118:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-e-c5bf4513f4.186498564934168a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-e-c5bf4513f4,UID:ci-4081-3-6-e-c5bf4513f4,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-e-c5bf4513f4,},FirstTimestamp:2025-09-12 17:14:10.944095882 +0000 UTC m=+0.641631832,LastTimestamp:2025-09-12 17:14:10.944095882 +0000 UTC m=+0.641631832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-e-c5bf4513f4,}" Sep 12 17:14:10.956722 kubelet[2208]: I0912 17:14:10.956665 2208 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:14:10.960772 kubelet[2208]: E0912 17:14:10.960731 2208 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" Sep 12 17:14:10.960772 kubelet[2208]: I0912 17:14:10.960776 2208 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:14:10.961085 kubelet[2208]: I0912 17:14:10.961056 2208 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:14:10.961165 kubelet[2208]: I0912 17:14:10.961147 2208 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:14:10.961680 kubelet[2208]: E0912 17:14:10.961639 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://188.245.115.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 17:14:10.962497 kubelet[2208]: E0912 17:14:10.962257 2208 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": dial tcp 188.245.115.118:6443: connect: connection refused" interval="200ms" Sep 12 17:14:10.962778 kubelet[2208]: I0912 17:14:10.962748 2208 factory.go:223] Registration of the systemd container factory successfully Sep 12 17:14:10.963067 kubelet[2208]: I0912 17:14:10.963034 2208 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:14:10.964443 kubelet[2208]: I0912 17:14:10.964408 2208 factory.go:223] Registration of the containerd container factory successfully Sep 12 17:14:10.965195 kubelet[2208]: E0912 17:14:10.965170 2208 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:14:10.989159 kubelet[2208]: I0912 17:14:10.989098 2208 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:14:10.989159 kubelet[2208]: I0912 17:14:10.989122 2208 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:14:10.989159 kubelet[2208]: I0912 17:14:10.989148 2208 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:14:10.995993 kubelet[2208]: I0912 17:14:10.995007 2208 policy_none.go:49] "None policy: Start" Sep 12 17:14:10.995993 kubelet[2208]: I0912 17:14:10.995056 2208 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:14:10.995993 kubelet[2208]: I0912 17:14:10.995071 2208 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:14:10.998719 kubelet[2208]: I0912 17:14:10.998657 2208 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 17:14:11.003763 kubelet[2208]: I0912 17:14:11.003680 2208 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 17:14:11.003763 kubelet[2208]: I0912 17:14:11.003751 2208 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 17:14:11.004079 kubelet[2208]: I0912 17:14:11.003784 2208 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 17:14:11.004079 kubelet[2208]: I0912 17:14:11.003792 2208 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 17:14:11.004079 kubelet[2208]: E0912 17:14:11.003870 2208 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:14:11.005484 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Sep 12 17:14:11.006113 kubelet[2208]: E0912 17:14:11.006076 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://188.245.115.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 17:14:11.016978 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:14:11.021854 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:14:11.035562 kubelet[2208]: E0912 17:14:11.034725 2208 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 17:14:11.035562 kubelet[2208]: I0912 17:14:11.035164 2208 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:14:11.035562 kubelet[2208]: I0912 17:14:11.035189 2208 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:14:11.035961 kubelet[2208]: I0912 17:14:11.035681 2208 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:14:11.039244 kubelet[2208]: E0912 17:14:11.039188 2208 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 17:14:11.040248 kubelet[2208]: E0912 17:14:11.040220 2208 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-e-c5bf4513f4\" not found" Sep 12 17:14:11.127457 systemd[1]: Created slice kubepods-burstable-pod23ceac683a3ba4e96382b176df7bd8b1.slice - libcontainer container kubepods-burstable-pod23ceac683a3ba4e96382b176df7bd8b1.slice. Sep 12 17:14:11.139426 kubelet[2208]: I0912 17:14:11.139213 2208 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.141054 kubelet[2208]: E0912 17:14:11.140031 2208 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://188.245.115.118:6443/api/v1/nodes\": dial tcp 188.245.115.118:6443: connect: connection refused" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.143986 kubelet[2208]: E0912 17:14:11.143935 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.144808 systemd[1]: Created slice kubepods-burstable-pod26e302cf255a1af75e6fb084ae572445.slice - libcontainer container kubepods-burstable-pod26e302cf255a1af75e6fb084ae572445.slice. Sep 12 17:14:11.148409 kubelet[2208]: E0912 17:14:11.148144 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.151797 systemd[1]: Created slice kubepods-burstable-pod901616d75bfff2fd1b4614ccc55b4858.slice - libcontainer container kubepods-burstable-pod901616d75bfff2fd1b4614ccc55b4858.slice. 
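The kubepods-burstable-pod<hash>.slice units above are how the systemd cgroup driver ("CgroupDriver":"systemd" in the nodeConfig earlier) names one cgroup per static pod, keyed by the pod UID derived from its manifest. A sketch of the naming rule; the dash-to-underscore escape is part of the scheme but is a no-op here, since these manifest-hash UIDs contain no dashes:

    package main

    import (
    	"fmt"
    	"strings"
    )

    func main() {
    	// UID of the kube-apiserver static pod, taken from the slice name above.
    	uid := "23ceac683a3ba4e96382b176df7bd8b1"
    	name := "kubepods-burstable-pod" + strings.ReplaceAll(uid, "-", "_") + ".slice"
    	fmt.Println(name) // kubepods-burstable-pod23ceac683a3ba4e96382b176df7bd8b1.slice
    }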
Sep 12 17:14:11.154078 kubelet[2208]: E0912 17:14:11.154031 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.163313 kubelet[2208]: E0912 17:14:11.163248 2208 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": dial tcp 188.245.115.118:6443: connect: connection refused" interval="400ms" Sep 12 17:14:11.262917 kubelet[2208]: I0912 17:14:11.262703 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.262917 kubelet[2208]: I0912 17:14:11.262779 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.262917 kubelet[2208]: I0912 17:14:11.262888 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.263635 kubelet[2208]: I0912 17:14:11.262950 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.263635 kubelet[2208]: I0912 17:14:11.263058 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/901616d75bfff2fd1b4614ccc55b4858-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-e-c5bf4513f4\" (UID: \"901616d75bfff2fd1b4614ccc55b4858\") " pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.263635 kubelet[2208]: I0912 17:14:11.263137 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.263635 kubelet[2208]: I0912 17:14:11.263210 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " 
pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.263635 kubelet[2208]: I0912 17:14:11.263258 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.264001 kubelet[2208]: I0912 17:14:11.263293 2208 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.343547 kubelet[2208]: I0912 17:14:11.343455 2208 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.344233 kubelet[2208]: E0912 17:14:11.344076 2208 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://188.245.115.118:6443/api/v1/nodes\": dial tcp 188.245.115.118:6443: connect: connection refused" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.447339 containerd[1476]: time="2025-09-12T17:14:11.447059720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-e-c5bf4513f4,Uid:23ceac683a3ba4e96382b176df7bd8b1,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:11.450358 containerd[1476]: time="2025-09-12T17:14:11.450283459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-e-c5bf4513f4,Uid:26e302cf255a1af75e6fb084ae572445,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:11.455864 containerd[1476]: time="2025-09-12T17:14:11.455781063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-e-c5bf4513f4,Uid:901616d75bfff2fd1b4614ccc55b4858,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:11.564329 kubelet[2208]: E0912 17:14:11.564243 2208 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": dial tcp 188.245.115.118:6443: connect: connection refused" interval="800ms" Sep 12 17:14:11.747693 kubelet[2208]: I0912 17:14:11.747526 2208 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.748435 kubelet[2208]: E0912 17:14:11.748124 2208 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://188.245.115.118:6443/api/v1/nodes\": dial tcp 188.245.115.118:6443: connect: connection refused" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:11.967849 kubelet[2208]: E0912 17:14:11.967760 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://188.245.115.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 17:14:11.982049 kubelet[2208]: E0912 17:14:11.981945 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://188.245.115.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 17:14:12.040530 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1256607522.mount: Deactivated successfully. Sep 12 17:14:12.049661 containerd[1476]: time="2025-09-12T17:14:12.049560762Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:14:12.057358 containerd[1476]: time="2025-09-12T17:14:12.057203133Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Sep 12 17:14:12.058062 containerd[1476]: time="2025-09-12T17:14:12.058020072Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:14:12.059860 containerd[1476]: time="2025-09-12T17:14:12.059781459Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:14:12.062851 containerd[1476]: time="2025-09-12T17:14:12.061437992Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:14:12.067010 containerd[1476]: time="2025-09-12T17:14:12.066936051Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 17:14:12.070053 containerd[1476]: time="2025-09-12T17:14:12.069963509Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 17:14:12.072178 containerd[1476]: time="2025-09-12T17:14:12.072096454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:14:12.074300 containerd[1476]: time="2025-09-12T17:14:12.074216157Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 623.794509ms" Sep 12 17:14:12.079175 containerd[1476]: time="2025-09-12T17:14:12.079056771Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 623.133932ms" Sep 12 17:14:12.082376 containerd[1476]: time="2025-09-12T17:14:12.082272999Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 634.232939ms" Sep 12 17:14:12.141918 kubelet[2208]: E0912 17:14:12.141797 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://188.245.115.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-e-c5bf4513f4&limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 17:14:12.165861 kubelet[2208]: E0912 17:14:12.164085 2208 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://188.245.115.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 188.245.115.118:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 17:14:12.261747 containerd[1476]: time="2025-09-12T17:14:12.261591746Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:12.261747 containerd[1476]: time="2025-09-12T17:14:12.261752398Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:12.262030 containerd[1476]: time="2025-09-12T17:14:12.261784903Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.262030 containerd[1476]: time="2025-09-12T17:14:12.261950740Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.271571 containerd[1476]: time="2025-09-12T17:14:12.271275288Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:12.271571 containerd[1476]: time="2025-09-12T17:14:12.271351187Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:12.271571 containerd[1476]: time="2025-09-12T17:14:12.271377191Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.271571 containerd[1476]: time="2025-09-12T17:14:12.271499555Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.279605 containerd[1476]: time="2025-09-12T17:14:12.279134427Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:12.279605 containerd[1476]: time="2025-09-12T17:14:12.279232621Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:12.279605 containerd[1476]: time="2025-09-12T17:14:12.279280043Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.279605 containerd[1476]: time="2025-09-12T17:14:12.279420234Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:12.309637 systemd[1]: Started cri-containerd-52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775.scope - libcontainer container 52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775. Sep 12 17:14:12.323734 systemd[1]: Started cri-containerd-e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65.scope - libcontainer container e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65. Sep 12 17:14:12.332731 systemd[1]: Started cri-containerd-9b30347329ab37b335b82f843ec1fd602ac4c1424ae24cae62894c077ad43779.scope - libcontainer container 9b30347329ab37b335b82f843ec1fd602ac4c1424ae24cae62894c077ad43779. Sep 12 17:14:12.365367 kubelet[2208]: E0912 17:14:12.365292 2208 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": dial tcp 188.245.115.118:6443: connect: connection refused" interval="1.6s" Sep 12 17:14:12.401643 containerd[1476]: time="2025-09-12T17:14:12.401202920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-e-c5bf4513f4,Uid:26e302cf255a1af75e6fb084ae572445,Namespace:kube-system,Attempt:0,} returns sandbox id \"52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775\"" Sep 12 17:14:12.404493 containerd[1476]: time="2025-09-12T17:14:12.404003041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-e-c5bf4513f4,Uid:901616d75bfff2fd1b4614ccc55b4858,Namespace:kube-system,Attempt:0,} returns sandbox id \"e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65\"" Sep 12 17:14:12.413087 containerd[1476]: time="2025-09-12T17:14:12.413039269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-e-c5bf4513f4,Uid:23ceac683a3ba4e96382b176df7bd8b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"9b30347329ab37b335b82f843ec1fd602ac4c1424ae24cae62894c077ad43779\"" Sep 12 17:14:12.414886 containerd[1476]: time="2025-09-12T17:14:12.414730062Z" level=info msg="CreateContainer within sandbox \"52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:14:12.416453 containerd[1476]: time="2025-09-12T17:14:12.416281741Z" level=info msg="CreateContainer within sandbox \"e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:14:12.420545 containerd[1476]: time="2025-09-12T17:14:12.420496539Z" level=info msg="CreateContainer within sandbox \"9b30347329ab37b335b82f843ec1fd602ac4c1424ae24cae62894c077ad43779\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:14:12.451098 containerd[1476]: time="2025-09-12T17:14:12.451001887Z" level=info msg="CreateContainer within sandbox \"52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761\"" Sep 12 17:14:12.453379 containerd[1476]: time="2025-09-12T17:14:12.452749714Z" level=info msg="CreateContainer within sandbox \"9b30347329ab37b335b82f843ec1fd602ac4c1424ae24cae62894c077ad43779\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"7af2b9821c98ac94f109dec579c80f2fe9a0ff8ef854463d97d88088fef8b3d1\"" Sep 12 17:14:12.454016 containerd[1476]: time="2025-09-12T17:14:12.453979809Z" level=info msg="StartContainer for \"7af2b9821c98ac94f109dec579c80f2fe9a0ff8ef854463d97d88088fef8b3d1\"" Sep 12 17:14:12.455952 containerd[1476]: time="2025-09-12T17:14:12.454222582Z" level=info msg="CreateContainer within sandbox \"e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a\"" Sep 12 17:14:12.456125 containerd[1476]: time="2025-09-12T17:14:12.454367479Z" level=info msg="StartContainer for \"2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761\"" Sep 12 17:14:12.467055 containerd[1476]: time="2025-09-12T17:14:12.466973306Z" level=info msg="StartContainer for \"3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a\"" Sep 12 17:14:12.498191 systemd[1]: Started cri-containerd-2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761.scope - libcontainer container 2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761. Sep 12 17:14:12.512145 systemd[1]: Started cri-containerd-7af2b9821c98ac94f109dec579c80f2fe9a0ff8ef854463d97d88088fef8b3d1.scope - libcontainer container 7af2b9821c98ac94f109dec579c80f2fe9a0ff8ef854463d97d88088fef8b3d1. Sep 12 17:14:12.535198 systemd[1]: Started cri-containerd-3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a.scope - libcontainer container 3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a. Sep 12 17:14:12.553000 kubelet[2208]: I0912 17:14:12.552395 2208 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:12.554836 kubelet[2208]: E0912 17:14:12.554247 2208 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://188.245.115.118:6443/api/v1/nodes\": dial tcp 188.245.115.118:6443: connect: connection refused" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:12.602330 containerd[1476]: time="2025-09-12T17:14:12.602150481Z" level=info msg="StartContainer for \"2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761\" returns successfully" Sep 12 17:14:12.604358 containerd[1476]: time="2025-09-12T17:14:12.602311811Z" level=info msg="StartContainer for \"7af2b9821c98ac94f109dec579c80f2fe9a0ff8ef854463d97d88088fef8b3d1\" returns successfully" Sep 12 17:14:12.615675 containerd[1476]: time="2025-09-12T17:14:12.615457824Z" level=info msg="StartContainer for \"3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a\" returns successfully" Sep 12 17:14:13.027989 kubelet[2208]: E0912 17:14:13.027735 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:13.027989 kubelet[2208]: E0912 17:14:13.027738 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:13.036921 kubelet[2208]: E0912 17:14:13.034662 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:14.029885 kubelet[2208]: E0912 17:14:14.029759 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node 
info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:14.031263 kubelet[2208]: E0912 17:14:14.030806 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:14.157193 kubelet[2208]: I0912 17:14:14.157116 2208 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.031964 kubelet[2208]: E0912 17:14:15.030690 2208 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.334310 kubelet[2208]: E0912 17:14:15.334092 2208 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-e-c5bf4513f4\" not found" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.506864 kubelet[2208]: I0912 17:14:15.505420 2208 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.562674 kubelet[2208]: I0912 17:14:15.562552 2208 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.615352 kubelet[2208]: E0912 17:14:15.615166 2208 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.615352 kubelet[2208]: I0912 17:14:15.615217 2208 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.622435 kubelet[2208]: E0912 17:14:15.622364 2208 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-e-c5bf4513f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.622435 kubelet[2208]: I0912 17:14:15.622418 2208 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.635168 kubelet[2208]: E0912 17:14:15.635105 2208 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:15.945983 kubelet[2208]: I0912 17:14:15.943786 2208 apiserver.go:52] "Watching apiserver" Sep 12 17:14:15.961255 kubelet[2208]: I0912 17:14:15.961198 2208 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:14:17.493213 kubelet[2208]: I0912 17:14:17.492849 2208 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:17.766809 systemd[1]: Reloading requested from client PID 2496 ('systemctl') (unit session-7.scope)... Sep 12 17:14:17.767343 systemd[1]: Reloading... Sep 12 17:14:17.919871 zram_generator::config[2536]: No configuration found. Sep 12 17:14:18.045735 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Sep 12 17:14:18.136187 systemd[1]: Reloading finished in 368 ms. Sep 12 17:14:18.191790 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:18.214977 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:14:18.215346 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:18.215444 systemd[1]: kubelet.service: Consumed 1.190s CPU time, 129.4M memory peak, 0B memory swap peak. Sep 12 17:14:18.223330 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:14:18.421399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:14:18.433673 (kubelet)[2581]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:14:18.507126 kubelet[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:14:18.507126 kubelet[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:14:18.507126 kubelet[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:14:18.507707 kubelet[2581]: I0912 17:14:18.507182 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:14:18.526234 kubelet[2581]: I0912 17:14:18.525576 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 17:14:18.526234 kubelet[2581]: I0912 17:14:18.525641 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:14:18.526234 kubelet[2581]: I0912 17:14:18.526026 2581 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 17:14:18.533863 kubelet[2581]: I0912 17:14:18.532276 2581 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 12 17:14:18.538056 kubelet[2581]: I0912 17:14:18.537945 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:14:18.545583 kubelet[2581]: E0912 17:14:18.545468 2581 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 12 17:14:18.545583 kubelet[2581]: I0912 17:14:18.545521 2581 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 12 17:14:18.549850 kubelet[2581]: I0912 17:14:18.549481 2581 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:14:18.549850 kubelet[2581]: I0912 17:14:18.549785 2581 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:14:18.550145 kubelet[2581]: I0912 17:14:18.549881 2581 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-e-c5bf4513f4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:14:18.550145 kubelet[2581]: I0912 17:14:18.550129 2581 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:14:18.550145 kubelet[2581]: I0912 17:14:18.550141 2581 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 17:14:18.550279 kubelet[2581]: I0912 17:14:18.550196 2581 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:14:18.550892 kubelet[2581]: I0912 17:14:18.550423 2581 kubelet.go:480] "Attempting to sync node with API server" Sep 12 17:14:18.550892 kubelet[2581]: I0912 17:14:18.550457 2581 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:14:18.551425 kubelet[2581]: I0912 17:14:18.551173 2581 kubelet.go:386] "Adding apiserver pod source" Sep 12 17:14:18.551425 kubelet[2581]: I0912 17:14:18.551203 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:14:18.560436 kubelet[2581]: I0912 17:14:18.560388 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 12 17:14:18.562886 kubelet[2581]: I0912 17:14:18.561395 2581 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 17:14:18.585845 kubelet[2581]: I0912 17:14:18.585020 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:14:18.585845 kubelet[2581]: I0912 17:14:18.585099 2581 server.go:1289] "Started kubelet" Sep 12 17:14:18.589056 kubelet[2581]: I0912 17:14:18.588956 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:14:18.590659 kubelet[2581]: 
I0912 17:14:18.590617 2581 server.go:317] "Adding debug handlers to kubelet server" Sep 12 17:14:18.602081 kubelet[2581]: I0912 17:14:18.601976 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:14:18.604289 kubelet[2581]: I0912 17:14:18.604180 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:14:18.605541 kubelet[2581]: I0912 17:14:18.605488 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:14:18.609385 kubelet[2581]: I0912 17:14:18.608730 2581 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:14:18.615042 kubelet[2581]: I0912 17:14:18.614147 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:14:18.615042 kubelet[2581]: I0912 17:14:18.614314 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:14:18.615042 kubelet[2581]: I0912 17:14:18.614455 2581 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:14:18.621968 kubelet[2581]: E0912 17:14:18.621919 2581 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:14:18.626859 kubelet[2581]: I0912 17:14:18.626726 2581 factory.go:223] Registration of the systemd container factory successfully Sep 12 17:14:18.628101 kubelet[2581]: I0912 17:14:18.628049 2581 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:14:18.635574 kubelet[2581]: I0912 17:14:18.635505 2581 factory.go:223] Registration of the containerd container factory successfully Sep 12 17:14:18.648546 kubelet[2581]: I0912 17:14:18.648343 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 17:14:18.651623 kubelet[2581]: I0912 17:14:18.651458 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 17:14:18.651623 kubelet[2581]: I0912 17:14:18.651509 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 17:14:18.651623 kubelet[2581]: I0912 17:14:18.651550 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
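
[Editor's note] The deprecation warnings above say the flagged parameters belong in the file passed via the kubelet's --config flag, and the NodeConfig dump shows the live values (cgroupDriver "systemd", the hard eviction thresholds). Below is a minimal sketch of generating such a config file; the runtime endpoint is an assumed containerd default, not a value printed in this log, while the plugin dir matches the path the kubelet probes later in this log. The --pod-infra-container-image flag has no config-file counterpart; per the message above, the sandbox image moves to the CRI runtime's own config.

```python
import json

# Sketch: move the deprecated kubelet flags seen above into the file passed
# via --config (see the kubernetes.io link in the log). JSON is a subset of
# YAML, so json.dump yields a file the kubelet can parse.
kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    "cgroupDriver": "systemd",  # matches CgroupDriver in the NodeConfig dump above
    # Assumed value; replaces --container-runtime-endpoint:
    "containerRuntimeEndpoint": "unix:///run/containerd/containerd.sock",
    # Matches the plugin dir probed later in this log; replaces --volume-plugin-dir:
    "volumePluginDir": "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/",
    "evictionHard": {  # mirrors HardEvictionThresholds in the NodeConfig dump
        "memory.available": "100Mi",
        "nodefs.available": "10%",
        "nodefs.inodesFree": "5%",
        "imagefs.available": "15%",
        "imagefs.inodesFree": "5%",
    },
}

with open("kubelet-config.json", "w") as f:
    json.dump(kubelet_config, f, indent=2)
```
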
Sep 12 17:14:18.651623 kubelet[2581]: I0912 17:14:18.651559 2581 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 17:14:18.651949 kubelet[2581]: E0912 17:14:18.651663 2581 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:14:18.713137 kubelet[2581]: I0912 17:14:18.713074 2581 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:14:18.713137 kubelet[2581]: I0912 17:14:18.713105 2581 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:14:18.713137 kubelet[2581]: I0912 17:14:18.713143 2581 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713372 2581 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713385 2581 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713416 2581 policy_none.go:49] "None policy: Start" Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713425 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713435 2581 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:14:18.713571 kubelet[2581]: I0912 17:14:18.713573 2581 state_mem.go:75] "Updated machine memory state" Sep 12 17:14:18.720188 kubelet[2581]: E0912 17:14:18.720083 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 17:14:18.720449 kubelet[2581]: I0912 17:14:18.720428 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:14:18.720520 kubelet[2581]: I0912 17:14:18.720452 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:14:18.725501 kubelet[2581]: I0912 17:14:18.724104 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:14:18.734457 kubelet[2581]: E0912 17:14:18.734403 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 17:14:18.755356 kubelet[2581]: I0912 17:14:18.753658 2581 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.759479 kubelet[2581]: I0912 17:14:18.757862 2581 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.760197 kubelet[2581]: I0912 17:14:18.758428 2581 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.777722 kubelet[2581]: E0912 17:14:18.777627 2581 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-e-c5bf4513f4\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815127 kubelet[2581]: I0912 17:14:18.815054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/901616d75bfff2fd1b4614ccc55b4858-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-e-c5bf4513f4\" (UID: \"901616d75bfff2fd1b4614ccc55b4858\") " pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815894 kubelet[2581]: I0912 17:14:18.815447 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815894 kubelet[2581]: I0912 17:14:18.815504 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815894 kubelet[2581]: I0912 17:14:18.815548 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815894 kubelet[2581]: I0912 17:14:18.815588 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.815894 kubelet[2581]: I0912 17:14:18.815630 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23ceac683a3ba4e96382b176df7bd8b1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-e-c5bf4513f4\" (UID: \"23ceac683a3ba4e96382b176df7bd8b1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.816260 kubelet[2581]: I0912 17:14:18.815670 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" 
(UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.816260 kubelet[2581]: I0912 17:14:18.815706 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.816260 kubelet[2581]: I0912 17:14:18.815746 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/26e302cf255a1af75e6fb084ae572445-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-e-c5bf4513f4\" (UID: \"26e302cf255a1af75e6fb084ae572445\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.830515 kubelet[2581]: I0912 17:14:18.830471 2581 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.845574 kubelet[2581]: I0912 17:14:18.845287 2581 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:18.845574 kubelet[2581]: I0912 17:14:18.845449 2581 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:14:19.556350 kubelet[2581]: I0912 17:14:19.556264 2581 apiserver.go:52] "Watching apiserver" Sep 12 17:14:19.614866 kubelet[2581]: I0912 17:14:19.614768 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:14:19.766354 kubelet[2581]: I0912 17:14:19.766130 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-e-c5bf4513f4" podStartSLOduration=2.766097238 podStartE2EDuration="2.766097238s" podCreationTimestamp="2025-09-12 17:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.742444896 +0000 UTC m=+1.298268687" watchObservedRunningTime="2025-09-12 17:14:19.766097238 +0000 UTC m=+1.321920989" Sep 12 17:14:19.795767 kubelet[2581]: I0912 17:14:19.795646 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-e-c5bf4513f4" podStartSLOduration=1.795613882 podStartE2EDuration="1.795613882s" podCreationTimestamp="2025-09-12 17:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.769022897 +0000 UTC m=+1.324846687" watchObservedRunningTime="2025-09-12 17:14:19.795613882 +0000 UTC m=+1.351437713" Sep 12 17:14:19.816939 kubelet[2581]: I0912 17:14:19.816658 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-e-c5bf4513f4" podStartSLOduration=1.816630355 podStartE2EDuration="1.816630355s" podCreationTimestamp="2025-09-12 17:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.796084225 +0000 UTC m=+1.351908016" 
watchObservedRunningTime="2025-09-12 17:14:19.816630355 +0000 UTC m=+1.372454146" Sep 12 17:14:24.586151 kubelet[2581]: I0912 17:14:24.586057 2581 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:14:24.587240 kubelet[2581]: I0912 17:14:24.587148 2581 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:14:24.587341 containerd[1476]: time="2025-09-12T17:14:24.586654371Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 17:14:25.068998 systemd[1]: Created slice kubepods-besteffort-podea2bc899_9462_4d2d_a24f_406ea4397a10.slice - libcontainer container kubepods-besteffort-podea2bc899_9462_4d2d_a24f_406ea4397a10.slice. Sep 12 17:14:25.166543 kubelet[2581]: I0912 17:14:25.166457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ea2bc899-9462-4d2d-a24f-406ea4397a10-kube-proxy\") pod \"kube-proxy-g98hh\" (UID: \"ea2bc899-9462-4d2d-a24f-406ea4397a10\") " pod="kube-system/kube-proxy-g98hh" Sep 12 17:14:25.166543 kubelet[2581]: I0912 17:14:25.166537 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ea2bc899-9462-4d2d-a24f-406ea4397a10-xtables-lock\") pod \"kube-proxy-g98hh\" (UID: \"ea2bc899-9462-4d2d-a24f-406ea4397a10\") " pod="kube-system/kube-proxy-g98hh" Sep 12 17:14:25.166543 kubelet[2581]: I0912 17:14:25.166560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea2bc899-9462-4d2d-a24f-406ea4397a10-lib-modules\") pod \"kube-proxy-g98hh\" (UID: \"ea2bc899-9462-4d2d-a24f-406ea4397a10\") " pod="kube-system/kube-proxy-g98hh" Sep 12 17:14:25.166803 kubelet[2581]: I0912 17:14:25.166584 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c947h\" (UniqueName: \"kubernetes.io/projected/ea2bc899-9462-4d2d-a24f-406ea4397a10-kube-api-access-c947h\") pod \"kube-proxy-g98hh\" (UID: \"ea2bc899-9462-4d2d-a24f-406ea4397a10\") " pod="kube-system/kube-proxy-g98hh" Sep 12 17:14:25.281546 kubelet[2581]: E0912 17:14:25.281344 2581 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 12 17:14:25.281546 kubelet[2581]: E0912 17:14:25.281398 2581 projected.go:194] Error preparing data for projected volume kube-api-access-c947h for pod kube-system/kube-proxy-g98hh: configmap "kube-root-ca.crt" not found Sep 12 17:14:25.281546 kubelet[2581]: E0912 17:14:25.281511 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2bc899-9462-4d2d-a24f-406ea4397a10-kube-api-access-c947h podName:ea2bc899-9462-4d2d-a24f-406ea4397a10 nodeName:}" failed. No retries permitted until 2025-09-12 17:14:25.781477036 +0000 UTC m=+7.337300827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c947h" (UniqueName: "kubernetes.io/projected/ea2bc899-9462-4d2d-a24f-406ea4397a10-kube-api-access-c947h") pod "kube-proxy-g98hh" (UID: "ea2bc899-9462-4d2d-a24f-406ea4397a10") : configmap "kube-root-ca.crt" not found Sep 12 17:14:25.781846 systemd[1]: Created slice kubepods-besteffort-pod46b1f06e_7701_4bc5_b45a_e7496d8fb665.slice - libcontainer container kubepods-besteffort-pod46b1f06e_7701_4bc5_b45a_e7496d8fb665.slice. Sep 12 17:14:25.874727 kubelet[2581]: I0912 17:14:25.873403 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5q8\" (UniqueName: \"kubernetes.io/projected/46b1f06e-7701-4bc5-b45a-e7496d8fb665-kube-api-access-tb5q8\") pod \"tigera-operator-755d956888-2pj5v\" (UID: \"46b1f06e-7701-4bc5-b45a-e7496d8fb665\") " pod="tigera-operator/tigera-operator-755d956888-2pj5v" Sep 12 17:14:25.874727 kubelet[2581]: I0912 17:14:25.873580 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/46b1f06e-7701-4bc5-b45a-e7496d8fb665-var-lib-calico\") pod \"tigera-operator-755d956888-2pj5v\" (UID: \"46b1f06e-7701-4bc5-b45a-e7496d8fb665\") " pod="tigera-operator/tigera-operator-755d956888-2pj5v" Sep 12 17:14:25.980339 containerd[1476]: time="2025-09-12T17:14:25.980264673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g98hh,Uid:ea2bc899-9462-4d2d-a24f-406ea4397a10,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:26.018678 containerd[1476]: time="2025-09-12T17:14:26.018461266Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:26.018678 containerd[1476]: time="2025-09-12T17:14:26.018553434Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:26.018678 containerd[1476]: time="2025-09-12T17:14:26.018596517Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:26.019394 containerd[1476]: time="2025-09-12T17:14:26.019253415Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:26.049334 systemd[1]: Started cri-containerd-4755e23b633812c7d3f90628c5d68338d07ce694f97fff32b2141b9fe20272e1.scope - libcontainer container 4755e23b633812c7d3f90628c5d68338d07ce694f97fff32b2141b9fe20272e1. 
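
[Editor's note] The MountVolume.SetUp failure above is benign ordering noise: kube-root-ca.crt does not exist yet, so the volume manager schedules a retry ("durationBeforeRetry 500ms"). A small sketch of the implied retry schedule follows; the doubling factor and the roughly two-minute cap are assumptions based on kubelet's default exponential backoff, not values printed in this log.

```python
# Sketch of the volume-manager retry schedule implied by
# "durationBeforeRetry 500ms" above: exponential backoff per failed attempt.
# factor=2 and the ~2m cap are assumed defaults, not logged values.
def backoff_schedule(initial=0.5, factor=2.0, cap=122.0, attempts=10):
    delay = initial
    for attempt in range(1, attempts + 1):
        yield attempt, min(delay, cap)
        delay *= factor

for attempt, delay in backoff_schedule():
    print(f"attempt {attempt}: retry after {delay:.1f}s")
```
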
Sep 12 17:14:26.089455 containerd[1476]: time="2025-09-12T17:14:26.089376866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g98hh,Uid:ea2bc899-9462-4d2d-a24f-406ea4397a10,Namespace:kube-system,Attempt:0,} returns sandbox id \"4755e23b633812c7d3f90628c5d68338d07ce694f97fff32b2141b9fe20272e1\"" Sep 12 17:14:26.092071 containerd[1476]: time="2025-09-12T17:14:26.092012418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-2pj5v,Uid:46b1f06e-7701-4bc5-b45a-e7496d8fb665,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:14:26.102312 containerd[1476]: time="2025-09-12T17:14:26.102239758Z" level=info msg="CreateContainer within sandbox \"4755e23b633812c7d3f90628c5d68338d07ce694f97fff32b2141b9fe20272e1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:14:26.127695 containerd[1476]: time="2025-09-12T17:14:26.127479459Z" level=info msg="CreateContainer within sandbox \"4755e23b633812c7d3f90628c5d68338d07ce694f97fff32b2141b9fe20272e1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d6190f27aec6e79de1f19acc79e4b25d40ccf35175e000f0dfe90f2954eee801\"" Sep 12 17:14:26.133395 containerd[1476]: time="2025-09-12T17:14:26.132108027Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:26.133395 containerd[1476]: time="2025-09-12T17:14:26.132318845Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:26.133395 containerd[1476]: time="2025-09-12T17:14:26.132338527Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:26.133395 containerd[1476]: time="2025-09-12T17:14:26.132549706Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:26.134861 containerd[1476]: time="2025-09-12T17:14:26.134606727Z" level=info msg="StartContainer for \"d6190f27aec6e79de1f19acc79e4b25d40ccf35175e000f0dfe90f2954eee801\"" Sep 12 17:14:26.170137 systemd[1]: Started cri-containerd-5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7.scope - libcontainer container 5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7. Sep 12 17:14:26.194021 systemd[1]: Started cri-containerd-d6190f27aec6e79de1f19acc79e4b25d40ccf35175e000f0dfe90f2954eee801.scope - libcontainer container d6190f27aec6e79de1f19acc79e4b25d40ccf35175e000f0dfe90f2954eee801. 
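
[Editor's note] The containerd lines above show the CRI object hierarchy: RunPodSandbox returns a sandbox id, CreateContainer inside that sandbox returns a container id, then StartContainer runs it. A hypothetical helper for correlating those ids while reading a log like this one; the regexes follow the message text verbatim, including the escaped quotes.

```python
import re

# Hypothetical helper: map each RunPodSandbox pod name to its sandbox id, and
# each CreateContainer result to the sandbox it was created in. Patterns match
# the containerd message format shown above, escaped quotes (\") included.
SANDBOX_RE = re.compile(
    r'RunPodSandbox for &PodSandboxMetadata\{Name:([^,]+),.*'
    r'returns sandbox id \\"([0-9a-f]+)\\"')
CONTAINER_RE = re.compile(
    r'CreateContainer within sandbox \\"([0-9a-f]+)\\".*'
    r'returns container id \\"([0-9a-f]+)\\"')

def index_containerd_log(lines):
    sandboxes, containers = {}, {}
    for line in lines:
        if (m := SANDBOX_RE.search(line)):
            sandboxes[m.group(2)] = m.group(1)   # sandbox id -> pod name
        elif (m := CONTAINER_RE.search(line)):
            containers[m.group(2)] = m.group(1)  # container id -> sandbox id
    return sandboxes, containers
```
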
Sep 12 17:14:26.254166 containerd[1476]: time="2025-09-12T17:14:26.253334775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-2pj5v,Uid:46b1f06e-7701-4bc5-b45a-e7496d8fb665,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7\"" Sep 12 17:14:26.260104 containerd[1476]: time="2025-09-12T17:14:26.259215132Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:14:26.275869 containerd[1476]: time="2025-09-12T17:14:26.275729026Z" level=info msg="StartContainer for \"d6190f27aec6e79de1f19acc79e4b25d40ccf35175e000f0dfe90f2954eee801\" returns successfully" Sep 12 17:14:26.831272 kubelet[2581]: I0912 17:14:26.831160 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-g98hh" podStartSLOduration=1.831130782 podStartE2EDuration="1.831130782s" podCreationTimestamp="2025-09-12 17:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:26.758727611 +0000 UTC m=+8.314551402" watchObservedRunningTime="2025-09-12 17:14:26.831130782 +0000 UTC m=+8.386954533" Sep 12 17:14:28.302563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount625188338.mount: Deactivated successfully. Sep 12 17:14:30.755883 containerd[1476]: time="2025-09-12T17:14:30.755313093Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:30.756784 containerd[1476]: time="2025-09-12T17:14:30.755908735Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 17:14:30.760183 containerd[1476]: time="2025-09-12T17:14:30.759994702Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:30.765936 containerd[1476]: time="2025-09-12T17:14:30.765796069Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:30.767278 containerd[1476]: time="2025-09-12T17:14:30.766786019Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 4.507505601s" Sep 12 17:14:30.767278 containerd[1476]: time="2025-09-12T17:14:30.766858184Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 17:14:30.776165 containerd[1476]: time="2025-09-12T17:14:30.775975224Z" level=info msg="CreateContainer within sandbox \"5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:14:30.833925 containerd[1476]: time="2025-09-12T17:14:30.833411176Z" level=info msg="CreateContainer within sandbox \"5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id 
\"286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7\"" Sep 12 17:14:30.835579 containerd[1476]: time="2025-09-12T17:14:30.835461960Z" level=info msg="StartContainer for \"286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7\"" Sep 12 17:14:30.897215 systemd[1]: Started cri-containerd-286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7.scope - libcontainer container 286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7. Sep 12 17:14:30.941221 containerd[1476]: time="2025-09-12T17:14:30.941034692Z" level=info msg="StartContainer for \"286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7\" returns successfully" Sep 12 17:14:38.420979 sudo[1699]: pam_unix(sudo:session): session closed for user root Sep 12 17:14:38.583232 sshd[1696]: pam_unix(sshd:session): session closed for user core Sep 12 17:14:38.592201 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:14:38.592948 systemd[1]: session-7.scope: Consumed 7.222s CPU time, 155.1M memory peak, 0B memory swap peak. Sep 12 17:14:38.593656 systemd[1]: sshd@6-188.245.115.118:22-139.178.89.65:41492.service: Deactivated successfully. Sep 12 17:14:38.606745 systemd-logind[1456]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:14:38.609546 systemd-logind[1456]: Removed session 7. Sep 12 17:14:47.973095 kubelet[2581]: I0912 17:14:47.972980 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-2pj5v" podStartSLOduration=18.460789569 podStartE2EDuration="22.972949301s" podCreationTimestamp="2025-09-12 17:14:25 +0000 UTC" firstStartedPulling="2025-09-12 17:14:26.256715913 +0000 UTC m=+7.812539704" lastFinishedPulling="2025-09-12 17:14:30.768875645 +0000 UTC m=+12.324699436" observedRunningTime="2025-09-12 17:14:31.77127439 +0000 UTC m=+13.327098181" watchObservedRunningTime="2025-09-12 17:14:47.972949301 +0000 UTC m=+29.528773092" Sep 12 17:14:47.990774 systemd[1]: Created slice kubepods-besteffort-podca8579a1_98e5_49a5_a3d6_801c9126d52e.slice - libcontainer container kubepods-besteffort-podca8579a1_98e5_49a5_a3d6_801c9126d52e.slice. 
Sep 12 17:14:48.028234 kubelet[2581]: I0912 17:14:48.028027 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8579a1-98e5-49a5-a3d6-801c9126d52e-tigera-ca-bundle\") pod \"calico-typha-574bd669bb-psb45\" (UID: \"ca8579a1-98e5-49a5-a3d6-801c9126d52e\") " pod="calico-system/calico-typha-574bd669bb-psb45" Sep 12 17:14:48.028234 kubelet[2581]: I0912 17:14:48.028153 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ca8579a1-98e5-49a5-a3d6-801c9126d52e-typha-certs\") pod \"calico-typha-574bd669bb-psb45\" (UID: \"ca8579a1-98e5-49a5-a3d6-801c9126d52e\") " pod="calico-system/calico-typha-574bd669bb-psb45" Sep 12 17:14:48.028234 kubelet[2581]: I0912 17:14:48.028178 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dsl5\" (UniqueName: \"kubernetes.io/projected/ca8579a1-98e5-49a5-a3d6-801c9126d52e-kube-api-access-8dsl5\") pod \"calico-typha-574bd669bb-psb45\" (UID: \"ca8579a1-98e5-49a5-a3d6-801c9126d52e\") " pod="calico-system/calico-typha-574bd669bb-psb45" Sep 12 17:14:48.301134 containerd[1476]: time="2025-09-12T17:14:48.300879346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-574bd669bb-psb45,Uid:ca8579a1-98e5-49a5-a3d6-801c9126d52e,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:48.353298 containerd[1476]: time="2025-09-12T17:14:48.353084244Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:48.353298 containerd[1476]: time="2025-09-12T17:14:48.353314850Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:48.353657 containerd[1476]: time="2025-09-12T17:14:48.353345771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:48.353799 containerd[1476]: time="2025-09-12T17:14:48.353743703Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:48.407229 systemd[1]: Started cri-containerd-733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc.scope - libcontainer container 733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc. Sep 12 17:14:48.504787 systemd[1]: Created slice kubepods-besteffort-pod4a9b6252_90cf_4255_bcfd_200708a162a1.slice - libcontainer container kubepods-besteffort-pod4a9b6252_90cf_4255_bcfd_200708a162a1.slice. 
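
[Editor's note] The repeated driver-call failures that follow come from the kubelet probing its FlexVolume plugin dir, finding nodeagent~uds/uds absent ("executable file not found in $PATH"), and then failing to parse the empty output as JSON ("unexpected end of JSON input"). For context, a hypothetical minimal FlexVolume driver showing the contract the kubelet expects; installing such a stub at that path is not implied by this log, which merely shows the probe failing harmlessly on every plugin rescan.

```python
#!/usr/bin/env python3
# Hypothetical minimal FlexVolume driver. The kubelet invokes the executable
# at <volume-plugin-dir>/<vendor~driver>/<driver> with a command ("init",
# "mount", "unmount", ...) and expects a JSON object on stdout. The missing
# nodeagent~uds/uds binary yields empty output, hence the repeated
# "unexpected end of JSON input" errors below.
import json
import sys

def main():
    cmd = sys.argv[1] if len(sys.argv) > 1 else ""
    if cmd == "init":
        # "init" must report driver capabilities.
        print(json.dumps({"status": "Success",
                          "capabilities": {"attach": False}}))
    else:
        # Unimplemented commands report "Not supported".
        print(json.dumps({"status": "Not supported"}))

if __name__ == "__main__":
    main()
```
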
Sep 12 17:14:48.531939 kubelet[2581]: I0912 17:14:48.531836 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9b6252-90cf-4255-bcfd-200708a162a1-tigera-ca-bundle\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.531939 kubelet[2581]: I0912 17:14:48.531928 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-var-lib-calico\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.531939 kubelet[2581]: I0912 17:14:48.531956 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-xtables-lock\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.532366 kubelet[2581]: I0912 17:14:48.531979 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-policysync\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.532366 kubelet[2581]: I0912 17:14:48.531999 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-cni-bin-dir\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.532366 kubelet[2581]: I0912 17:14:48.532023 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-flexvol-driver-host\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.532366 kubelet[2581]: I0912 17:14:48.532051 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-lib-modules\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.532366 kubelet[2581]: I0912 17:14:48.532069 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-var-run-calico\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.533466 kubelet[2581]: I0912 17:14:48.532091 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-cni-log-dir\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.533466 kubelet[2581]: I0912 17:14:48.532110 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vlgk\" (UniqueName: \"kubernetes.io/projected/4a9b6252-90cf-4255-bcfd-200708a162a1-kube-api-access-6vlgk\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.533466 kubelet[2581]: I0912 17:14:48.532126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4a9b6252-90cf-4255-bcfd-200708a162a1-cni-net-dir\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.533466 kubelet[2581]: I0912 17:14:48.532143 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4a9b6252-90cf-4255-bcfd-200708a162a1-node-certs\") pod \"calico-node-nxdct\" (UID: \"4a9b6252-90cf-4255-bcfd-200708a162a1\") " pod="calico-system/calico-node-nxdct" Sep 12 17:14:48.634769 kubelet[2581]: E0912 17:14:48.634503 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.634769 kubelet[2581]: W0912 17:14:48.634560 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.634769 kubelet[2581]: E0912 17:14:48.634617 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.636750 kubelet[2581]: E0912 17:14:48.636687 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.636750 kubelet[2581]: W0912 17:14:48.636733 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.636750 kubelet[2581]: E0912 17:14:48.636774 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.637337 kubelet[2581]: E0912 17:14:48.637297 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.637337 kubelet[2581]: W0912 17:14:48.637323 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.637337 kubelet[2581]: E0912 17:14:48.637339 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:48.637692 kubelet[2581]: E0912 17:14:48.637665 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.637692 kubelet[2581]: W0912 17:14:48.637685 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.637692 kubelet[2581]: E0912 17:14:48.637699 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.638621 kubelet[2581]: E0912 17:14:48.638582 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.638621 kubelet[2581]: W0912 17:14:48.638614 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.638745 kubelet[2581]: E0912 17:14:48.638629 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.640026 kubelet[2581]: E0912 17:14:48.639981 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.640026 kubelet[2581]: W0912 17:14:48.640020 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.640281 kubelet[2581]: E0912 17:14:48.640041 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.640490 kubelet[2581]: E0912 17:14:48.640357 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.640490 kubelet[2581]: W0912 17:14:48.640376 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.640490 kubelet[2581]: E0912 17:14:48.640387 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.640774 kubelet[2581]: E0912 17:14:48.640617 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.640774 kubelet[2581]: W0912 17:14:48.640628 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.640774 kubelet[2581]: E0912 17:14:48.640638 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:48.641004 kubelet[2581]: E0912 17:14:48.640910 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.641004 kubelet[2581]: W0912 17:14:48.640921 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.641004 kubelet[2581]: E0912 17:14:48.640931 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.641982 kubelet[2581]: E0912 17:14:48.641417 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.641982 kubelet[2581]: W0912 17:14:48.641431 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.641982 kubelet[2581]: E0912 17:14:48.641444 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.641982 kubelet[2581]: E0912 17:14:48.641874 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.641982 kubelet[2581]: W0912 17:14:48.641886 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.641982 kubelet[2581]: E0912 17:14:48.641898 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.642413 kubelet[2581]: E0912 17:14:48.642389 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.642413 kubelet[2581]: W0912 17:14:48.642407 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.643028 kubelet[2581]: E0912 17:14:48.642420 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.643028 kubelet[2581]: E0912 17:14:48.642743 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.643028 kubelet[2581]: W0912 17:14:48.642754 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.643028 kubelet[2581]: E0912 17:14:48.642765 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:48.643183 kubelet[2581]: E0912 17:14:48.643163 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.643218 kubelet[2581]: W0912 17:14:48.643184 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.643218 kubelet[2581]: E0912 17:14:48.643196 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.643990 kubelet[2581]: E0912 17:14:48.643960 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.643990 kubelet[2581]: W0912 17:14:48.643983 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.644100 kubelet[2581]: E0912 17:14:48.644003 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.647112 kubelet[2581]: E0912 17:14:48.647024 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.647112 kubelet[2581]: W0912 17:14:48.647081 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.647112 kubelet[2581]: E0912 17:14:48.647124 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.649061 kubelet[2581]: E0912 17:14:48.648445 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.649061 kubelet[2581]: W0912 17:14:48.648474 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.649061 kubelet[2581]: E0912 17:14:48.648502 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.658892 kubelet[2581]: E0912 17:14:48.658572 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.658892 kubelet[2581]: W0912 17:14:48.658617 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.658892 kubelet[2581]: E0912 17:14:48.658648 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:48.677840 kubelet[2581]: E0912 17:14:48.676562 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.677840 kubelet[2581]: W0912 17:14:48.676618 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.677840 kubelet[2581]: E0912 17:14:48.676971 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.702635 containerd[1476]: time="2025-09-12T17:14:48.702550870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-574bd669bb-psb45,Uid:ca8579a1-98e5-49a5-a3d6-801c9126d52e,Namespace:calico-system,Attempt:0,} returns sandbox id \"733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc\"" Sep 12 17:14:48.711862 containerd[1476]: time="2025-09-12T17:14:48.711752134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 17:14:48.816415 containerd[1476]: time="2025-09-12T17:14:48.816332175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nxdct,Uid:4a9b6252-90cf-4255-bcfd-200708a162a1,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:48.864919 kubelet[2581]: E0912 17:14:48.864122 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:14:48.895961 containerd[1476]: time="2025-09-12T17:14:48.894202649Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:14:48.896482 containerd[1476]: time="2025-09-12T17:14:48.896345790Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:14:48.896482 containerd[1476]: time="2025-09-12T17:14:48.896411672Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:48.900064 containerd[1476]: time="2025-09-12T17:14:48.898482852Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:14:48.917358 kubelet[2581]: E0912 17:14:48.917046 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.917358 kubelet[2581]: W0912 17:14:48.917099 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.917358 kubelet[2581]: E0912 17:14:48.917145 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:48.918349 kubelet[2581]: E0912 17:14:48.918132 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.918781 kubelet[2581]: W0912 17:14:48.918161 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.919181 kubelet[2581]: E0912 17:14:48.918674 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.926135 kubelet[2581]: E0912 17:14:48.924985 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.928916 kubelet[2581]: W0912 17:14:48.926796 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.928916 kubelet[2581]: E0912 17:14:48.926898 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.933031 kubelet[2581]: E0912 17:14:48.932902 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.935407 kubelet[2581]: W0912 17:14:48.933966 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.935407 kubelet[2581]: E0912 17:14:48.934461 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.940179 kubelet[2581]: E0912 17:14:48.939796 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.940179 kubelet[2581]: W0912 17:14:48.939872 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.940179 kubelet[2581]: E0912 17:14:48.939917 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:48.941842 kubelet[2581]: E0912 17:14:48.941449 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:48.942147 kubelet[2581]: W0912 17:14:48.942077 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:48.942675 kubelet[2581]: E0912 17:14:48.942374 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 12 17:14:48.944754 kubelet[2581]: E0912 17:14:48.944704 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:48.945156 systemd[1]: Started cri-containerd-07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47.scope - libcontainer container 07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47.
Sep 12 17:14:48.946449 kubelet[2581]: W0912 17:14:48.946305 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:48.948330 kubelet[2581]: E0912 17:14:48.947484 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
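The three-line pattern that repeats throughout this window is the kubelet's FlexVolume probe. On every scan of the plugin directory the kubelet execs the driver binary (/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds) with the argument init and parses whatever the process writes to stdout as a JSON status object; because the uds executable is absent, stdout is empty and the decode fails with "unexpected end of JSON input". As an illustration of that call convention only, here is a hypothetical minimal driver that would answer the init probe; the struct fields mirror the documented FlexVolume status JSON, not the actual nodeagent~uds driver missing on this host:

```go
// Hypothetical minimal FlexVolume driver: the kubelet invokes the binary as
// "<driver> init" and expects a JSON status object on stdout. Printing
// nothing is exactly what produces "unexpected end of JSON input" above.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape the kubelet's driver-call.go unmarshals.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println(`{"status":"Failure","message":"no command given"}`)
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Declare success and opt out of attach/detach support.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		// mount/unmount and friends are out of scope for this sketch.
		out, _ := json.Marshal(driverStatus{Status: "Not supported"})
		fmt.Println(string(out))
	}
}
```

The errors are noisy but non-fatal: plugins.go merely skips the nodeagent~uds directory and retries on the next probe, which is why the same triplet recurs until whatever component is expected to install that driver does so.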
Sep 12 17:14:48.974708 kubelet[2581]: I0912 17:14:48.974193 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5670fa1-14ff-4b4b-93e9-778888c14647-kubelet-dir\") pod \"csi-node-driver-cmgr5\" (UID: \"a5670fa1-14ff-4b4b-93e9-778888c14647\") " pod="calico-system/csi-node-driver-cmgr5"
Sep 12 17:14:48.975706 kubelet[2581]: I0912 17:14:48.975230 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a5670fa1-14ff-4b4b-93e9-778888c14647-varrun\") pod \"csi-node-driver-cmgr5\" (UID: \"a5670fa1-14ff-4b4b-93e9-778888c14647\") " pod="calico-system/csi-node-driver-cmgr5"
Sep 12 17:14:48.977674 kubelet[2581]: I0912 17:14:48.977421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a5670fa1-14ff-4b4b-93e9-778888c14647-registration-dir\") pod \"csi-node-driver-cmgr5\" (UID: \"a5670fa1-14ff-4b4b-93e9-778888c14647\") " pod="calico-system/csi-node-driver-cmgr5"
Sep 12 17:14:48.978466 kubelet[2581]: I0912 17:14:48.978304 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flqt4\" (UniqueName: \"kubernetes.io/projected/a5670fa1-14ff-4b4b-93e9-778888c14647-kube-api-access-flqt4\") pod \"csi-node-driver-cmgr5\" (UID: \"a5670fa1-14ff-4b4b-93e9-778888c14647\") " pod="calico-system/csi-node-driver-cmgr5"
Sep 12 17:14:48.979266 kubelet[2581]: I0912 17:14:48.979049 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a5670fa1-14ff-4b4b-93e9-778888c14647-socket-dir\") pod \"csi-node-driver-cmgr5\" (UID: \"a5670fa1-14ff-4b4b-93e9-778888c14647\") " pod="calico-system/csi-node-driver-cmgr5"
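The five VerifyControllerAttachedVolume entries above are the standard volume set for a CSI node plugin: the kubelet's pod-volume root (kubelet-dir), the plugin registration directory (registration-dir), a directory for the driver's own CSI socket (socket-dir), host /var/run (varrun), and a projected service-account token (kube-api-access-flqt4). A rough sketch of how such hostPath volumes could be declared with the k8s.io/api types follows; the concrete host paths are assumptions based on common kubelet defaults and are not stated anywhere in this log:

```go
// Hypothetical volume declarations for a CSI node-driver pod such as
// csi-node-driver-cmgr5. Paths are assumed defaults, not read from the log.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	dir := corev1.HostPathDirectory
	volumes := []corev1.Volume{
		// Pod volume mount points managed by the kubelet.
		{Name: "kubelet-dir", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet", Type: &dir}}},
		// Where the node-driver-registrar drops the registration socket.
		{Name: "registration-dir", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet/plugins_registry", Type: &dir}}},
		// Directory holding the driver's CSI endpoint socket (path assumed).
		{Name: "socket-dir", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet/plugins/csi.tigera.io", Type: &dir}}},
		// Host /var/run, mounted for the driver's Unix domain sockets.
		{Name: "varrun", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/run", Type: &dir}}},
	}
	// kube-api-access-flqt4 is a projected token volume the control plane
	// injects automatically, so it is not declared here.
	for _, v := range volumes {
		fmt.Printf("%-16s -> %s\n", v.Name, v.HostPath.Path)
	}
}
```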
Sep 12 17:14:49.073192 containerd[1476]: time="2025-09-12T17:14:49.073115576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nxdct,Uid:4a9b6252-90cf-4255-bcfd-200708a162a1,Namespace:calico-system,Attempt:0,} returns sandbox id \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\""
Sep 12 17:14:49.081268 kubelet[2581]: E0912 17:14:49.080952 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:49.081268 kubelet[2581]: W0912 17:14:49.081008 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:49.081268 kubelet[2581]: E0912 17:14:49.081063 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:49.155784 systemd[1]: run-containerd-runc-k8s.io-733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc-runc.Qsa8va.mount: Deactivated successfully.
Sep 12 17:14:50.446544 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2522115253.mount: Deactivated successfully.
Sep 12 17:14:50.655141 kubelet[2581]: E0912 17:14:50.654808 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647"
Sep 12 17:14:51.026092 containerd[1476]: time="2025-09-12T17:14:51.026001455Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.029570 containerd[1476]: time="2025-09-12T17:14:51.028963090Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 12 17:14:51.029570 containerd[1476]: time="2025-09-12T17:14:51.029409622Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.042844 containerd[1476]: time="2025-09-12T17:14:51.041775655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.044671 containerd[1476]: time="2025-09-12T17:14:51.044608087Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.33274879s"
Sep 12 17:14:51.044671 containerd[1476]: time="2025-09-12T17:14:51.044672169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 12 17:14:51.047488 containerd[1476]: time="2025-09-12T17:14:51.047338156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 17:14:51.072936 containerd[1476]: time="2025-09-12T17:14:51.072848363Z" level=info msg="CreateContainer within sandbox \"733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 17:14:51.101545 containerd[1476]: time="2025-09-12T17:14:51.101380567Z" level=info msg="CreateContainer within sandbox \"733595de882fee109217928cf1cd184f2ce1f3c71ed5ef52a512d45900fef0cc\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ba51796cac9ce439d2074e5686fd566c69aa901757d6024898fba94943360f49\""
Sep 12 17:14:51.104223 containerd[1476]: time="2025-09-12T17:14:51.104142797Z" level=info msg="StartContainer for \"ba51796cac9ce439d2074e5686fd566c69aa901757d6024898fba94943360f49\""
Sep 12 17:14:51.155666 systemd[1]: Started cri-containerd-ba51796cac9ce439d2074e5686fd566c69aa901757d6024898fba94943360f49.scope - libcontainer container ba51796cac9ce439d2074e5686fd566c69aa901757d6024898fba94943360f49.
Sep 12 17:14:51.268548 containerd[1476]: time="2025-09-12T17:14:51.268457964Z" level=info msg="StartContainer for \"ba51796cac9ce439d2074e5686fd566c69aa901757d6024898fba94943360f49\" returns successfully"
Sep 12 17:14:51.896694 kubelet[2581]: E0912 17:14:51.896061 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:51.896694 kubelet[2581]: W0912 17:14:51.896124 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:51.896694 kubelet[2581]: E0912 17:14:51.896170 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Sep 12 17:14:52.653859 kubelet[2581]: E0912 17:14:52.652959 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:14:52.836750 kubelet[2581]: I0912 17:14:52.836556 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:14:52.922570 kubelet[2581]: E0912 17:14:52.922347 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.922570 kubelet[2581]: W0912 17:14:52.922402 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.922570 kubelet[2581]: E0912 17:14:52.922453 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.923587 kubelet[2581]: E0912 17:14:52.923330 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.923587 kubelet[2581]: W0912 17:14:52.923361 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.923587 kubelet[2581]: E0912 17:14:52.923502 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.924022 kubelet[2581]: E0912 17:14:52.923996 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.924022 kubelet[2581]: W0912 17:14:52.924023 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.924105 kubelet[2581]: E0912 17:14:52.924057 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.924476 kubelet[2581]: E0912 17:14:52.924456 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.924537 kubelet[2581]: W0912 17:14:52.924479 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.924537 kubelet[2581]: E0912 17:14:52.924500 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.924982 kubelet[2581]: E0912 17:14:52.924959 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.924982 kubelet[2581]: W0912 17:14:52.924978 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.925115 kubelet[2581]: E0912 17:14:52.924992 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.925291 kubelet[2581]: E0912 17:14:52.925276 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.925291 kubelet[2581]: W0912 17:14:52.925290 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.925388 kubelet[2581]: E0912 17:14:52.925301 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.926101 kubelet[2581]: E0912 17:14:52.926079 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.926101 kubelet[2581]: W0912 17:14:52.926100 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.926179 kubelet[2581]: E0912 17:14:52.926119 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.926555 kubelet[2581]: E0912 17:14:52.926538 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.926555 kubelet[2581]: W0912 17:14:52.926555 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.926663 kubelet[2581]: E0912 17:14:52.926567 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.927666 kubelet[2581]: E0912 17:14:52.927627 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.927666 kubelet[2581]: W0912 17:14:52.927650 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.927666 kubelet[2581]: E0912 17:14:52.927664 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.928001 kubelet[2581]: E0912 17:14:52.927975 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.928001 kubelet[2581]: W0912 17:14:52.927990 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.928001 kubelet[2581]: E0912 17:14:52.928001 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.929059 kubelet[2581]: E0912 17:14:52.929027 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.929059 kubelet[2581]: W0912 17:14:52.929055 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.929200 kubelet[2581]: E0912 17:14:52.929072 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.929354 kubelet[2581]: E0912 17:14:52.929339 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.929354 kubelet[2581]: W0912 17:14:52.929353 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.929459 kubelet[2581]: E0912 17:14:52.929364 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.929568 kubelet[2581]: E0912 17:14:52.929554 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.929568 kubelet[2581]: W0912 17:14:52.929567 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.929568 kubelet[2581]: E0912 17:14:52.929578 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.929967 kubelet[2581]: E0912 17:14:52.929929 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.929967 kubelet[2581]: W0912 17:14:52.929949 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.929967 kubelet[2581]: E0912 17:14:52.929961 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.930683 kubelet[2581]: E0912 17:14:52.930664 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.930683 kubelet[2581]: W0912 17:14:52.930681 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.930796 kubelet[2581]: E0912 17:14:52.930694 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.949361 kubelet[2581]: E0912 17:14:52.949214 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.949361 kubelet[2581]: W0912 17:14:52.949263 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.949361 kubelet[2581]: E0912 17:14:52.949307 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.950931 kubelet[2581]: E0912 17:14:52.950482 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.950931 kubelet[2581]: W0912 17:14:52.950510 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.950931 kubelet[2581]: E0912 17:14:52.950534 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.952575 kubelet[2581]: E0912 17:14:52.952133 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.952575 kubelet[2581]: W0912 17:14:52.952169 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.952575 kubelet[2581]: E0912 17:14:52.952196 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.953327 kubelet[2581]: E0912 17:14:52.953056 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.953327 kubelet[2581]: W0912 17:14:52.953084 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.953327 kubelet[2581]: E0912 17:14:52.953107 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.955524 kubelet[2581]: E0912 17:14:52.955016 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.955524 kubelet[2581]: W0912 17:14:52.955047 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.955524 kubelet[2581]: E0912 17:14:52.955070 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.956023 kubelet[2581]: E0912 17:14:52.956002 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.956727 kubelet[2581]: W0912 17:14:52.956392 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.957064 kubelet[2581]: E0912 17:14:52.956871 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.957987 kubelet[2581]: E0912 17:14:52.957701 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.957987 kubelet[2581]: W0912 17:14:52.957718 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.957987 kubelet[2581]: E0912 17:14:52.957748 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.958573 kubelet[2581]: E0912 17:14:52.958542 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.959033 kubelet[2581]: W0912 17:14:52.958721 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.959033 kubelet[2581]: E0912 17:14:52.958752 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.959229 kubelet[2581]: E0912 17:14:52.959216 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.959288 kubelet[2581]: W0912 17:14:52.959275 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.959368 kubelet[2581]: E0912 17:14:52.959354 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.959711 kubelet[2581]: E0912 17:14:52.959694 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.959798 kubelet[2581]: W0912 17:14:52.959784 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.959981 kubelet[2581]: E0912 17:14:52.959891 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.960255 kubelet[2581]: E0912 17:14:52.960241 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.960445 kubelet[2581]: W0912 17:14:52.960332 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.960445 kubelet[2581]: E0912 17:14:52.960350 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.960773 kubelet[2581]: E0912 17:14:52.960755 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.961014 kubelet[2581]: W0912 17:14:52.960918 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.961014 kubelet[2581]: E0912 17:14:52.960938 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.961411 kubelet[2581]: E0912 17:14:52.961283 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.961411 kubelet[2581]: W0912 17:14:52.961300 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.961411 kubelet[2581]: E0912 17:14:52.961310 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.961627 kubelet[2581]: E0912 17:14:52.961600 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.961811 kubelet[2581]: W0912 17:14:52.961675 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.961811 kubelet[2581]: E0912 17:14:52.961693 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:52.962123 kubelet[2581]: E0912 17:14:52.962109 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.962210 kubelet[2581]: W0912 17:14:52.962196 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.962366 kubelet[2581]: E0912 17:14:52.962272 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.962665 kubelet[2581]: E0912 17:14:52.962646 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.963171 kubelet[2581]: W0912 17:14:52.962733 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.963171 kubelet[2581]: E0912 17:14:52.962753 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.963341 kubelet[2581]: E0912 17:14:52.963323 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.963401 kubelet[2581]: W0912 17:14:52.963341 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.963401 kubelet[2581]: E0912 17:14:52.963356 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:52.963585 kubelet[2581]: E0912 17:14:52.963574 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:52.963630 kubelet[2581]: W0912 17:14:52.963586 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:52.963630 kubelet[2581]: E0912 17:14:52.963599 2581 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:53.657905 containerd[1476]: time="2025-09-12T17:14:53.657715707Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:53.660451 containerd[1476]: time="2025-09-12T17:14:53.660345528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 17:14:53.663199 containerd[1476]: time="2025-09-12T17:14:53.662057569Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:53.667116 containerd[1476]: time="2025-09-12T17:14:53.667002165Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:53.668872 containerd[1476]: time="2025-09-12T17:14:53.668758926Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 2.621335287s" Sep 12 17:14:53.668872 containerd[1476]: time="2025-09-12T17:14:53.668875809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 17:14:53.677480 containerd[1476]: time="2025-09-12T17:14:53.677396369Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:14:53.730806 containerd[1476]: time="2025-09-12T17:14:53.730684380Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866\"" Sep 12 17:14:53.733884 containerd[1476]: time="2025-09-12T17:14:53.732526063Z" level=info msg="StartContainer for \"d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866\"" Sep 12 17:14:53.780263 systemd[1]: run-containerd-runc-k8s.io-d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866-runc.0cMWjN.mount: Deactivated successfully. Sep 12 17:14:53.790276 systemd[1]: Started cri-containerd-d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866.scope - libcontainer container d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866. 
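[Editor's note] The long run of driver-call.go errors above all decode the same failure: the kubelet probes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ for FlexVolume drivers, execs each driver binary with the argument "init", and parses its stdout as JSON. Here the nodeagent~uds/uds binary does not exist, so the output is empty and Go's encoding/json returns its literal "unexpected end of JSON input" error. A minimal sketch of that decode step follows; DriverStatus is a hypothetical stand-in for illustration, not the kubelet's exact type.

package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus is a hypothetical stand-in for the JSON a FlexVolume
// driver is expected to print in response to "init".
type DriverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// A missing driver binary produces no stdout at all, so the
	// kubelet-side unmarshal sees an empty byte slice.
	var st DriverStatus
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // prints: unexpected end of JSON input

	// A working driver would have printed something like:
	ok := []byte(`{"status":"Success","capabilities":{"attach":false}}`)
	if err := json.Unmarshal(ok, &st); err == nil {
		fmt.Println(st.Status) // prints: Success
	}
}

The same three-line pattern (unmarshal error, "executable file not found in $PATH" warning, "Error dynamically probing plugins") recurs because the kubelet re-probes the plugin directory repeatedly; each probe of the empty nodeagent~uds directory replays the identical failure.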
Sep 12 17:14:53.842890 containerd[1476]: time="2025-09-12T17:14:53.841358818Z" level=info msg="StartContainer for \"d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866\" returns successfully" Sep 12 17:14:53.885415 kubelet[2581]: I0912 17:14:53.885263 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-574bd669bb-psb45" podStartSLOduration=4.550018512 podStartE2EDuration="6.885229007s" podCreationTimestamp="2025-09-12 17:14:47 +0000 UTC" firstStartedPulling="2025-09-12 17:14:48.711235799 +0000 UTC m=+30.267059590" lastFinishedPulling="2025-09-12 17:14:51.046446294 +0000 UTC m=+32.602270085" observedRunningTime="2025-09-12 17:14:51.878176268 +0000 UTC m=+33.434000059" watchObservedRunningTime="2025-09-12 17:14:53.885229007 +0000 UTC m=+35.441052798" Sep 12 17:14:53.887406 systemd[1]: cri-containerd-d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866.scope: Deactivated successfully. Sep 12 17:14:53.931876 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866-rootfs.mount: Deactivated successfully. Sep 12 17:14:54.017863 containerd[1476]: time="2025-09-12T17:14:54.017449457Z" level=info msg="shim disconnected" id=d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866 namespace=k8s.io Sep 12 17:14:54.017863 containerd[1476]: time="2025-09-12T17:14:54.017579820Z" level=warning msg="cleaning up after shim disconnected" id=d912580e4d8f32d1f16c2405f6a35d67fe2e0fbbddb03262833b8338759da866 namespace=k8s.io Sep 12 17:14:54.017863 containerd[1476]: time="2025-09-12T17:14:54.017591940Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 17:14:54.652916 kubelet[2581]: E0912 17:14:54.652510 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:14:54.864227 containerd[1476]: time="2025-09-12T17:14:54.864156048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:14:56.654338 kubelet[2581]: E0912 17:14:56.653405 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:14:58.052342 containerd[1476]: time="2025-09-12T17:14:58.050574038Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:58.053103 containerd[1476]: time="2025-09-12T17:14:58.052974765Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 17:14:58.056351 containerd[1476]: time="2025-09-12T17:14:58.056250950Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:58.062440 containerd[1476]: time="2025-09-12T17:14:58.061230168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 12 17:14:58.062440 containerd[1476]: time="2025-09-12T17:14:58.062215827Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.197984017s" Sep 12 17:14:58.062440 containerd[1476]: time="2025-09-12T17:14:58.062271148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 17:14:58.072328 containerd[1476]: time="2025-09-12T17:14:58.072249625Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:14:58.095302 containerd[1476]: time="2025-09-12T17:14:58.095210117Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb\"" Sep 12 17:14:58.097941 containerd[1476]: time="2025-09-12T17:14:58.096625905Z" level=info msg="StartContainer for \"4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb\"" Sep 12 17:14:58.149318 systemd[1]: Started cri-containerd-4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb.scope - libcontainer container 4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb. Sep 12 17:14:58.200503 containerd[1476]: time="2025-09-12T17:14:58.200275827Z" level=info msg="StartContainer for \"4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb\" returns successfully" Sep 12 17:14:58.658319 kubelet[2581]: E0912 17:14:58.657330 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:14:58.890117 containerd[1476]: time="2025-09-12T17:14:58.888311299Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:14:58.899281 systemd[1]: cri-containerd-4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb.scope: Deactivated successfully. Sep 12 17:14:58.950056 kubelet[2581]: I0912 17:14:58.947994 2581 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 17:14:58.976259 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb-rootfs.mount: Deactivated successfully. 
Sep 12 17:14:59.043328 containerd[1476]: time="2025-09-12T17:14:59.043090162Z" level=info msg="shim disconnected" id=4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb namespace=k8s.io Sep 12 17:14:59.045410 containerd[1476]: time="2025-09-12T17:14:59.044513629Z" level=warning msg="cleaning up after shim disconnected" id=4110ffc70e2f17444c2c25690e14dbdec3c0097d74faad340172ba2819e448bb namespace=k8s.io Sep 12 17:14:59.045410 containerd[1476]: time="2025-09-12T17:14:59.045045679Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 17:14:59.061837 systemd[1]: Created slice kubepods-burstable-podc0d7b056_4972_4c60_a388_fd78bc10058f.slice - libcontainer container kubepods-burstable-podc0d7b056_4972_4c60_a388_fd78bc10058f.slice. Sep 12 17:14:59.092491 systemd[1]: Created slice kubepods-burstable-pod01137908_9b3f_43f3_895b_bb6ad5c520d9.slice - libcontainer container kubepods-burstable-pod01137908_9b3f_43f3_895b_bb6ad5c520d9.slice. Sep 12 17:14:59.113833 kubelet[2581]: I0912 17:14:59.113532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01137908-9b3f-43f3-895b-bb6ad5c520d9-config-volume\") pod \"coredns-674b8bbfcf-tcms6\" (UID: \"01137908-9b3f-43f3-895b-bb6ad5c520d9\") " pod="kube-system/coredns-674b8bbfcf-tcms6" Sep 12 17:14:59.115124 kubelet[2581]: I0912 17:14:59.113855 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnbf\" (UniqueName: \"kubernetes.io/projected/c0d7b056-4972-4c60-a388-fd78bc10058f-kube-api-access-nxnbf\") pod \"coredns-674b8bbfcf-tgfmg\" (UID: \"c0d7b056-4972-4c60-a388-fd78bc10058f\") " pod="kube-system/coredns-674b8bbfcf-tgfmg" Sep 12 17:14:59.115124 kubelet[2581]: I0912 17:14:59.113891 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxbw\" (UniqueName: \"kubernetes.io/projected/01137908-9b3f-43f3-895b-bb6ad5c520d9-kube-api-access-ttxbw\") pod \"coredns-674b8bbfcf-tcms6\" (UID: \"01137908-9b3f-43f3-895b-bb6ad5c520d9\") " pod="kube-system/coredns-674b8bbfcf-tcms6" Sep 12 17:14:59.115124 kubelet[2581]: I0912 17:14:59.113995 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0d7b056-4972-4c60-a388-fd78bc10058f-config-volume\") pod \"coredns-674b8bbfcf-tgfmg\" (UID: \"c0d7b056-4972-4c60-a388-fd78bc10058f\") " pod="kube-system/coredns-674b8bbfcf-tgfmg" Sep 12 17:14:59.130418 systemd[1]: Created slice kubepods-besteffort-pod829c1f4c_85d8_405a_a4f5_f939b129360b.slice - libcontainer container kubepods-besteffort-pod829c1f4c_85d8_405a_a4f5_f939b129360b.slice. Sep 12 17:14:59.163591 systemd[1]: Created slice kubepods-besteffort-pod7997260a_d514_4a5c_b1af_8a79861928e5.slice - libcontainer container kubepods-besteffort-pod7997260a_d514_4a5c_b1af_8a79861928e5.slice. Sep 12 17:14:59.180397 systemd[1]: Created slice kubepods-besteffort-pod75922f2c_6510_4ef8_a348_e60d70df3666.slice - libcontainer container kubepods-besteffort-pod75922f2c_6510_4ef8_a348_e60d70df3666.slice. Sep 12 17:14:59.197213 systemd[1]: Created slice kubepods-besteffort-podc181bfd7_3816_4413_900a_22a73e4de4f5.slice - libcontainer container kubepods-besteffort-podc181bfd7_3816_4413_900a_22a73e4de4f5.slice. 
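[Editor's note] The "Created slice kubepods-burstable-pod<uid>.slice" lines reflect the kubelet's systemd cgroup driver: each pod gets a slice grouped by QoS class (burstable, besteffort), with the dashes in the pod UID mapped to underscores because systemd reserves '-' as the slice-hierarchy separator. A small illustrative reconstruction of that name derivation (not the kubelet's actual code):

package main

import (
	"fmt"
	"strings"
)

// podSliceName derives the systemd slice unit for a pod the way the
// log lines above suggest: a QoS-class prefix plus the pod UID with
// '-' replaced by '_'.
func podSliceName(qos, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, escaped)
}

func main() {
	// UID taken from the coredns-674b8bbfcf-tgfmg entries above.
	fmt.Println(podSliceName("burstable", "c0d7b056-4972-4c60-a388-fd78bc10058f"))
	// Output: kubepods-burstable-podc0d7b056_4972_4c60_a388_fd78bc10058f.slice
}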
Sep 12 17:14:59.228000 kubelet[2581]: I0912 17:14:59.227174 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829c1f4c-85d8-405a-a4f5-f939b129360b-config\") pod \"goldmane-54d579b49d-96m69\" (UID: \"829c1f4c-85d8-405a-a4f5-f939b129360b\") " pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:14:59.228000 kubelet[2581]: I0912 17:14:59.227260 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rcs\" (UniqueName: \"kubernetes.io/projected/829c1f4c-85d8-405a-a4f5-f939b129360b-kube-api-access-b7rcs\") pod \"goldmane-54d579b49d-96m69\" (UID: \"829c1f4c-85d8-405a-a4f5-f939b129360b\") " pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:14:59.228000 kubelet[2581]: I0912 17:14:59.227294 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvws\" (UniqueName: \"kubernetes.io/projected/c181bfd7-3816-4413-900a-22a73e4de4f5-kube-api-access-cbvws\") pod \"calico-apiserver-6dcc7b598-kld6v\" (UID: \"c181bfd7-3816-4413-900a-22a73e4de4f5\") " pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" Sep 12 17:14:59.228000 kubelet[2581]: I0912 17:14:59.227373 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frxg5\" (UniqueName: \"kubernetes.io/projected/5cfa2648-c93e-445e-9e1f-6da367fb2890-kube-api-access-frxg5\") pod \"calico-apiserver-6dcc7b598-zt2zj\" (UID: \"5cfa2648-c93e-445e-9e1f-6da367fb2890\") " pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" Sep 12 17:14:59.228000 kubelet[2581]: I0912 17:14:59.227415 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/829c1f4c-85d8-405a-a4f5-f939b129360b-goldmane-key-pair\") pod \"goldmane-54d579b49d-96m69\" (UID: \"829c1f4c-85d8-405a-a4f5-f939b129360b\") " pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:14:59.228347 kubelet[2581]: I0912 17:14:59.227444 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvk84\" (UniqueName: \"kubernetes.io/projected/7997260a-d514-4a5c-b1af-8a79861928e5-kube-api-access-kvk84\") pod \"calico-kube-controllers-bfcb5d966-vxpgp\" (UID: \"7997260a-d514-4a5c-b1af-8a79861928e5\") " pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" Sep 12 17:14:59.228347 kubelet[2581]: I0912 17:14:59.227472 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c181bfd7-3816-4413-900a-22a73e4de4f5-calico-apiserver-certs\") pod \"calico-apiserver-6dcc7b598-kld6v\" (UID: \"c181bfd7-3816-4413-900a-22a73e4de4f5\") " pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" Sep 12 17:14:59.228347 kubelet[2581]: I0912 17:14:59.227500 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-ca-bundle\") pod \"whisker-54d6bb5877-49g7j\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " pod="calico-system/whisker-54d6bb5877-49g7j" Sep 12 17:14:59.228347 kubelet[2581]: I0912 17:14:59.227536 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" 
(UniqueName: \"kubernetes.io/secret/5cfa2648-c93e-445e-9e1f-6da367fb2890-calico-apiserver-certs\") pod \"calico-apiserver-6dcc7b598-zt2zj\" (UID: \"5cfa2648-c93e-445e-9e1f-6da367fb2890\") " pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" Sep 12 17:14:59.228347 kubelet[2581]: I0912 17:14:59.227556 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829c1f4c-85d8-405a-a4f5-f939b129360b-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-96m69\" (UID: \"829c1f4c-85d8-405a-a4f5-f939b129360b\") " pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:14:59.228487 kubelet[2581]: I0912 17:14:59.227582 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7997260a-d514-4a5c-b1af-8a79861928e5-tigera-ca-bundle\") pod \"calico-kube-controllers-bfcb5d966-vxpgp\" (UID: \"7997260a-d514-4a5c-b1af-8a79861928e5\") " pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" Sep 12 17:14:59.228487 kubelet[2581]: I0912 17:14:59.227605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvgm\" (UniqueName: \"kubernetes.io/projected/75922f2c-6510-4ef8-a348-e60d70df3666-kube-api-access-lmvgm\") pod \"whisker-54d6bb5877-49g7j\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " pod="calico-system/whisker-54d6bb5877-49g7j" Sep 12 17:14:59.228487 kubelet[2581]: I0912 17:14:59.227662 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-backend-key-pair\") pod \"whisker-54d6bb5877-49g7j\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " pod="calico-system/whisker-54d6bb5877-49g7j" Sep 12 17:14:59.230644 systemd[1]: Created slice kubepods-besteffort-pod5cfa2648_c93e_445e_9e1f_6da367fb2890.slice - libcontainer container kubepods-besteffort-pod5cfa2648_c93e_445e_9e1f_6da367fb2890.slice. 
Sep 12 17:14:59.373266 containerd[1476]: time="2025-09-12T17:14:59.371566828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tgfmg,Uid:c0d7b056-4972-4c60-a388-fd78bc10058f,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:59.435278 containerd[1476]: time="2025-09-12T17:14:59.433310086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tcms6,Uid:01137908-9b3f-43f3-895b-bb6ad5c520d9,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:59.494085 containerd[1476]: time="2025-09-12T17:14:59.493017505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54d6bb5877-49g7j,Uid:75922f2c-6510-4ef8-a348-e60d70df3666,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:59.529241 containerd[1476]: time="2025-09-12T17:14:59.529166275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-kld6v,Uid:c181bfd7-3816-4413-900a-22a73e4de4f5,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:14:59.560130 containerd[1476]: time="2025-09-12T17:14:59.559227968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-zt2zj,Uid:5cfa2648-c93e-445e-9e1f-6da367fb2890,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:14:59.756055 containerd[1476]: time="2025-09-12T17:14:59.754461533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-96m69,Uid:829c1f4c-85d8-405a-a4f5-f939b129360b,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:59.775599 containerd[1476]: time="2025-09-12T17:14:59.775521255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bfcb5d966-vxpgp,Uid:7997260a-d514-4a5c-b1af-8a79861928e5,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:59.833174 containerd[1476]: time="2025-09-12T17:14:59.833076233Z" level=error msg="Failed to destroy network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.834131 containerd[1476]: time="2025-09-12T17:14:59.834058691Z" level=error msg="encountered an error cleaning up failed sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.834482 containerd[1476]: time="2025-09-12T17:14:59.834453379Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tcms6,Uid:01137908-9b3f-43f3-895b-bb6ad5c520d9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.835878 kubelet[2581]: E0912 17:14:59.835105 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 
17:14:59.835878 kubelet[2581]: E0912 17:14:59.835237 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tcms6" Sep 12 17:14:59.835878 kubelet[2581]: E0912 17:14:59.835269 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tcms6" Sep 12 17:14:59.836488 kubelet[2581]: E0912 17:14:59.835352 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-tcms6_kube-system(01137908-9b3f-43f3-895b-bb6ad5c520d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-tcms6_kube-system(01137908-9b3f-43f3-895b-bb6ad5c520d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tcms6" podUID="01137908-9b3f-43f3-895b-bb6ad5c520d9" Sep 12 17:14:59.888711 kubelet[2581]: I0912 17:14:59.888266 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:14:59.893218 containerd[1476]: time="2025-09-12T17:14:59.892468606Z" level=info msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\"" Sep 12 17:14:59.898240 containerd[1476]: time="2025-09-12T17:14:59.897997471Z" level=info msg="Ensure that sandbox 1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e in task-service has been cleanup successfully" Sep 12 17:14:59.910646 containerd[1476]: time="2025-09-12T17:14:59.910461149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:14:59.934752 containerd[1476]: time="2025-09-12T17:14:59.934604449Z" level=error msg="Failed to destroy network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.941878 containerd[1476]: time="2025-09-12T17:14:59.938509844Z" level=error msg="encountered an error cleaning up failed sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.941878 containerd[1476]: time="2025-09-12T17:14:59.938667247Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-tgfmg,Uid:c0d7b056-4972-4c60-a388-fd78bc10058f,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.964452 kubelet[2581]: E0912 17:14:59.964373 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.965384 kubelet[2581]: E0912 17:14:59.965344 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tgfmg" Sep 12 17:14:59.965932 kubelet[2581]: E0912 17:14:59.965844 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tgfmg" Sep 12 17:14:59.966048 kubelet[2581]: E0912 17:14:59.965969 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-tgfmg_kube-system(c0d7b056-4972-4c60-a388-fd78bc10058f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-tgfmg_kube-system(c0d7b056-4972-4c60-a388-fd78bc10058f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tgfmg" podUID="c0d7b056-4972-4c60-a388-fd78bc10058f" Sep 12 17:15:00.014286 containerd[1476]: time="2025-09-12T17:15:00.014015317Z" level=error msg="Failed to destroy network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.017382 containerd[1476]: time="2025-09-12T17:15:00.017070893Z" level=error msg="encountered an error cleaning up failed sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.017756 containerd[1476]: 
time="2025-09-12T17:15:00.017717545Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-kld6v,Uid:c181bfd7-3816-4413-900a-22a73e4de4f5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.020156 kubelet[2581]: E0912 17:15:00.019952 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.020156 kubelet[2581]: E0912 17:15:00.020070 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" Sep 12 17:15:00.020156 kubelet[2581]: E0912 17:15:00.020103 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" Sep 12 17:15:00.021755 kubelet[2581]: E0912 17:15:00.020193 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dcc7b598-kld6v_calico-apiserver(c181bfd7-3816-4413-900a-22a73e4de4f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dcc7b598-kld6v_calico-apiserver(c181bfd7-3816-4413-900a-22a73e4de4f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" podUID="c181bfd7-3816-4413-900a-22a73e4de4f5" Sep 12 17:15:00.056969 containerd[1476]: time="2025-09-12T17:15:00.056109295Z" level=error msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" failed" error="failed to destroy network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.057227 kubelet[2581]: E0912 17:15:00.056557 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:00.057227 kubelet[2581]: E0912 17:15:00.056657 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e"} Sep 12 17:15:00.057227 kubelet[2581]: E0912 17:15:00.056766 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"01137908-9b3f-43f3-895b-bb6ad5c520d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:00.058640 kubelet[2581]: E0912 17:15:00.056798 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"01137908-9b3f-43f3-895b-bb6ad5c520d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tcms6" podUID="01137908-9b3f-43f3-895b-bb6ad5c520d9" Sep 12 17:15:00.062179 containerd[1476]: time="2025-09-12T17:15:00.062098246Z" level=error msg="Failed to destroy network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.063093 containerd[1476]: time="2025-09-12T17:15:00.063037023Z" level=error msg="encountered an error cleaning up failed sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.063633 containerd[1476]: time="2025-09-12T17:15:00.063595394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54d6bb5877-49g7j,Uid:75922f2c-6510-4ef8-a348-e60d70df3666,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.064520 kubelet[2581]: E0912 17:15:00.064465 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Sep 12 17:15:00.064941 kubelet[2581]: E0912 17:15:00.064910 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54d6bb5877-49g7j" Sep 12 17:15:00.065063 kubelet[2581]: E0912 17:15:00.065048 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54d6bb5877-49g7j" Sep 12 17:15:00.065785 kubelet[2581]: E0912 17:15:00.065323 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-54d6bb5877-49g7j_calico-system(75922f2c-6510-4ef8-a348-e60d70df3666)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-54d6bb5877-49g7j_calico-system(75922f2c-6510-4ef8-a348-e60d70df3666)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54d6bb5877-49g7j" podUID="75922f2c-6510-4ef8-a348-e60d70df3666" Sep 12 17:15:00.067905 containerd[1476]: time="2025-09-12T17:15:00.067741550Z" level=error msg="Failed to destroy network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.068853 containerd[1476]: time="2025-09-12T17:15:00.068673048Z" level=error msg="encountered an error cleaning up failed sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.068853 containerd[1476]: time="2025-09-12T17:15:00.068785090Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-zt2zj,Uid:5cfa2648-c93e-445e-9e1f-6da367fb2890,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.073180 kubelet[2581]: E0912 17:15:00.072980 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.073180 kubelet[2581]: E0912 17:15:00.073122 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" Sep 12 17:15:00.074721 kubelet[2581]: E0912 17:15:00.073150 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" Sep 12 17:15:00.074721 kubelet[2581]: E0912 17:15:00.073610 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dcc7b598-zt2zj_calico-apiserver(5cfa2648-c93e-445e-9e1f-6da367fb2890)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dcc7b598-zt2zj_calico-apiserver(5cfa2648-c93e-445e-9e1f-6da367fb2890)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" podUID="5cfa2648-c93e-445e-9e1f-6da367fb2890" Sep 12 17:15:00.082623 containerd[1476]: time="2025-09-12T17:15:00.082547504Z" level=error msg="Failed to destroy network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.087919 containerd[1476]: time="2025-09-12T17:15:00.087074908Z" level=error msg="encountered an error cleaning up failed sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.091560 containerd[1476]: time="2025-09-12T17:15:00.091014141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-96m69,Uid:829c1f4c-85d8-405a-a4f5-f939b129360b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.092861 kubelet[2581]: E0912 17:15:00.092144 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.092861 kubelet[2581]: E0912 17:15:00.092233 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:15:00.092861 kubelet[2581]: E0912 17:15:00.092259 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-96m69" Sep 12 17:15:00.093923 kubelet[2581]: E0912 17:15:00.092339 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-96m69_calico-system(829c1f4c-85d8-405a-a4f5-f939b129360b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-96m69_calico-system(829c1f4c-85d8-405a-a4f5-f939b129360b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-96m69" podUID="829c1f4c-85d8-405a-a4f5-f939b129360b" Sep 12 17:15:00.124129 containerd[1476]: time="2025-09-12T17:15:00.124013311Z" level=error msg="Failed to destroy network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.124639 containerd[1476]: time="2025-09-12T17:15:00.124568322Z" level=error msg="encountered an error cleaning up failed sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.124719 containerd[1476]: time="2025-09-12T17:15:00.124651803Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bfcb5d966-vxpgp,Uid:7997260a-d514-4a5c-b1af-8a79861928e5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.125935 kubelet[2581]: E0912 17:15:00.125134 2581 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.125935 kubelet[2581]: E0912 17:15:00.125249 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" Sep 12 17:15:00.125935 kubelet[2581]: E0912 17:15:00.125291 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" Sep 12 17:15:00.126193 kubelet[2581]: E0912 17:15:00.125374 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-bfcb5d966-vxpgp_calico-system(7997260a-d514-4a5c-b1af-8a79861928e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-bfcb5d966-vxpgp_calico-system(7997260a-d514-4a5c-b1af-8a79861928e5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" podUID="7997260a-d514-4a5c-b1af-8a79861928e5" Sep 12 17:15:00.664807 systemd[1]: Created slice kubepods-besteffort-poda5670fa1_14ff_4b4b_93e9_778888c14647.slice - libcontainer container kubepods-besteffort-poda5670fa1_14ff_4b4b_93e9_778888c14647.slice. 
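Every failure above, whether a CNI add (RunPodSandbox) or a CNI delete (StopPodSandbox), dies at the same stat: the Calico CNI plugin reads this node's name from /var/lib/calico/nodename, a file the calico/node container writes after it starts and mounts /var/lib/calico/ from the host. Until calico-node is up, every Calico-networked pod on the node fails sandbox setup the same way — the calico-apiserver, coredns, whisker, goldmane, and calico-kube-controllers pods above, and the csi-node-driver pod that follows — and kubelet simply requeues them. Below is a minimal Go sketch (not Calico's code) of that precondition; the file path and advice text are taken from the log itself, and readNodename is an illustrative name, not a Calico function.

package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename" // written by calico/node once it is running

func readNodename() (string, error) {
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		// The exact condition surfaced in every sandbox error above.
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1) // the plugin fails fast like this; kubelet keeps retrying
	}
	fmt.Println("node name:", name)
}

Nothing is wrong with the pods themselves; they recover as soon as the file exists, which is what the rest of this log shows.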
Sep 12 17:15:00.670473 containerd[1476]: time="2025-09-12T17:15:00.670379617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cmgr5,Uid:a5670fa1-14ff-4b4b-93e9-778888c14647,Namespace:calico-system,Attempt:0,}" Sep 12 17:15:00.775751 containerd[1476]: time="2025-09-12T17:15:00.775493681Z" level=error msg="Failed to destroy network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.778917 containerd[1476]: time="2025-09-12T17:15:00.778652460Z" level=error msg="encountered an error cleaning up failed sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.779912 containerd[1476]: time="2025-09-12T17:15:00.778802342Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cmgr5,Uid:a5670fa1-14ff-4b4b-93e9-778888c14647,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.780471 kubelet[2581]: E0912 17:15:00.780223 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.780471 kubelet[2581]: E0912 17:15:00.780447 2581 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cmgr5" Sep 12 17:15:00.780614 kubelet[2581]: E0912 17:15:00.780485 2581 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cmgr5" Sep 12 17:15:00.780687 kubelet[2581]: E0912 17:15:00.780627 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cmgr5_calico-system(a5670fa1-14ff-4b4b-93e9-778888c14647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cmgr5_calico-system(a5670fa1-14ff-4b4b-93e9-778888c14647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:15:00.781701 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982-shm.mount: Deactivated successfully. Sep 12 17:15:00.908378 kubelet[2581]: I0912 17:15:00.907683 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:00.914084 containerd[1476]: time="2025-09-12T17:15:00.912233650Z" level=info msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\"" Sep 12 17:15:00.914084 containerd[1476]: time="2025-09-12T17:15:00.912554296Z" level=info msg="Ensure that sandbox 29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b in task-service has been cleanup successfully" Sep 12 17:15:00.914805 kubelet[2581]: I0912 17:15:00.913437 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:00.916998 containerd[1476]: time="2025-09-12T17:15:00.915800636Z" level=info msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" Sep 12 17:15:00.916998 containerd[1476]: time="2025-09-12T17:15:00.916559650Z" level=info msg="Ensure that sandbox 6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75 in task-service has been cleanup successfully" Sep 12 17:15:00.929333 kubelet[2581]: I0912 17:15:00.929288 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:00.933013 containerd[1476]: time="2025-09-12T17:15:00.932753990Z" level=info msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\"" Sep 12 17:15:00.934367 containerd[1476]: time="2025-09-12T17:15:00.933995533Z" level=info msg="Ensure that sandbox f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4 in task-service has been cleanup successfully" Sep 12 17:15:00.938403 kubelet[2581]: I0912 17:15:00.937059 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:00.941366 containerd[1476]: time="2025-09-12T17:15:00.940788899Z" level=info msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\"" Sep 12 17:15:00.944313 containerd[1476]: time="2025-09-12T17:15:00.942890337Z" level=info msg="Ensure that sandbox 1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96 in task-service has been cleanup successfully" Sep 12 17:15:00.949011 kubelet[2581]: I0912 17:15:00.948212 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:00.960316 containerd[1476]: time="2025-09-12T17:15:00.959345722Z" level=info msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\"" Sep 12 17:15:00.960316 containerd[1476]: time="2025-09-12T17:15:00.959766050Z" level=info msg="Ensure that sandbox 
b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982 in task-service has been cleanup successfully" Sep 12 17:15:00.964522 kubelet[2581]: I0912 17:15:00.964322 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:00.971192 containerd[1476]: time="2025-09-12T17:15:00.971123420Z" level=info msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" Sep 12 17:15:00.973481 containerd[1476]: time="2025-09-12T17:15:00.972921333Z" level=info msg="Ensure that sandbox fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d in task-service has been cleanup successfully" Sep 12 17:15:00.983606 kubelet[2581]: I0912 17:15:00.983550 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:00.986081 containerd[1476]: time="2025-09-12T17:15:00.985762970Z" level=info msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" Sep 12 17:15:00.992352 containerd[1476]: time="2025-09-12T17:15:00.991961685Z" level=info msg="Ensure that sandbox 4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22 in task-service has been cleanup successfully" Sep 12 17:15:01.108857 containerd[1476]: time="2025-09-12T17:15:01.108130975Z" level=error msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" failed" error="failed to destroy network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.109116 kubelet[2581]: E0912 17:15:01.108732 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:01.110007 kubelet[2581]: E0912 17:15:01.108812 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75"} Sep 12 17:15:01.110007 kubelet[2581]: E0912 17:15:01.109887 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c181bfd7-3816-4413-900a-22a73e4de4f5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.110007 kubelet[2581]: E0912 17:15:01.109934 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c181bfd7-3816-4413-900a-22a73e4de4f5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" podUID="c181bfd7-3816-4413-900a-22a73e4de4f5" Sep 12 17:15:01.135699 containerd[1476]: time="2025-09-12T17:15:01.135282623Z" level=error msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" failed" error="failed to destroy network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.139426 kubelet[2581]: E0912 17:15:01.139106 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:01.139426 kubelet[2581]: E0912 17:15:01.139235 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96"} Sep 12 17:15:01.139426 kubelet[2581]: E0912 17:15:01.139296 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"5cfa2648-c93e-445e-9e1f-6da367fb2890\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.139426 kubelet[2581]: E0912 17:15:01.139329 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"5cfa2648-c93e-445e-9e1f-6da367fb2890\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" podUID="5cfa2648-c93e-445e-9e1f-6da367fb2890" Sep 12 17:15:01.140881 containerd[1476]: time="2025-09-12T17:15:01.140397355Z" level=error msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" failed" error="failed to destroy network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.142455 containerd[1476]: time="2025-09-12T17:15:01.142284668Z" level=error msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" failed" error="failed to destroy network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.142645 kubelet[2581]: E0912 17:15:01.142455 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:01.142645 kubelet[2581]: E0912 17:15:01.142586 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982"} Sep 12 17:15:01.142804 kubelet[2581]: E0912 17:15:01.142650 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a5670fa1-14ff-4b4b-93e9-778888c14647\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.142804 kubelet[2581]: E0912 17:15:01.142700 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a5670fa1-14ff-4b4b-93e9-778888c14647\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cmgr5" podUID="a5670fa1-14ff-4b4b-93e9-778888c14647" Sep 12 17:15:01.144983 kubelet[2581]: E0912 17:15:01.143126 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:01.144983 kubelet[2581]: E0912 17:15:01.143181 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d"} Sep 12 17:15:01.144983 kubelet[2581]: E0912 17:15:01.143217 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"75922f2c-6510-4ef8-a348-e60d70df3666\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.144983 kubelet[2581]: E0912 17:15:01.143237 2581 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"KillPodSandbox\" for \"75922f2c-6510-4ef8-a348-e60d70df3666\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54d6bb5877-49g7j" podUID="75922f2c-6510-4ef8-a348-e60d70df3666" Sep 12 17:15:01.150201 containerd[1476]: time="2025-09-12T17:15:01.150093449Z" level=error msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" failed" error="failed to destroy network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.151100 kubelet[2581]: E0912 17:15:01.150754 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:01.151100 kubelet[2581]: E0912 17:15:01.150876 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4"} Sep 12 17:15:01.151100 kubelet[2581]: E0912 17:15:01.150988 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7997260a-d514-4a5c-b1af-8a79861928e5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.151100 kubelet[2581]: E0912 17:15:01.151021 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7997260a-d514-4a5c-b1af-8a79861928e5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" podUID="7997260a-d514-4a5c-b1af-8a79861928e5" Sep 12 17:15:01.155146 containerd[1476]: time="2025-09-12T17:15:01.155021857Z" level=error msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" failed" error="failed to destroy network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.155605 kubelet[2581]: E0912 17:15:01.155484 2581 
log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:01.155605 kubelet[2581]: E0912 17:15:01.155577 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b"} Sep 12 17:15:01.155783 kubelet[2581]: E0912 17:15:01.155624 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"829c1f4c-85d8-405a-a4f5-f939b129360b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.155783 kubelet[2581]: E0912 17:15:01.155653 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"829c1f4c-85d8-405a-a4f5-f939b129360b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-96m69" podUID="829c1f4c-85d8-405a-a4f5-f939b129360b" Sep 12 17:15:01.171619 containerd[1476]: time="2025-09-12T17:15:01.171323550Z" level=error msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" failed" error="failed to destroy network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:01.172894 kubelet[2581]: E0912 17:15:01.172281 2581 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:01.172894 kubelet[2581]: E0912 17:15:01.172404 2581 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22"} Sep 12 17:15:01.173123 kubelet[2581]: E0912 17:15:01.172472 2581 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c0d7b056-4972-4c60-a388-fd78bc10058f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:01.173123 kubelet[2581]: E0912 17:15:01.173006 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c0d7b056-4972-4c60-a388-fd78bc10058f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tgfmg" podUID="c0d7b056-4972-4c60-a388-fd78bc10058f" Sep 12 17:15:07.391162 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3811584746.mount: Deactivated successfully. Sep 12 17:15:07.430171 containerd[1476]: time="2025-09-12T17:15:07.430067044Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:07.432207 containerd[1476]: time="2025-09-12T17:15:07.432118036Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 17:15:07.434366 containerd[1476]: time="2025-09-12T17:15:07.433970344Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:07.438277 containerd[1476]: time="2025-09-12T17:15:07.438172168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:07.440189 containerd[1476]: time="2025-09-12T17:15:07.439893035Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 7.528875955s" Sep 12 17:15:07.440189 containerd[1476]: time="2025-09-12T17:15:07.439968196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 17:15:07.468747 containerd[1476]: time="2025-09-12T17:15:07.468604235Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:15:07.495516 containerd[1476]: time="2025-09-12T17:15:07.495233323Z" level=info msg="CreateContainer within sandbox \"07baf1cd5226856ca53617dbd518fe5b537d0553fec6e4cd573784aeabea3c47\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd\"" Sep 12 17:15:07.498137 containerd[1476]: time="2025-09-12T17:15:07.496844788Z" level=info msg="StartContainer for \"0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd\"" Sep 12 17:15:07.537144 systemd[1]: Started cri-containerd-0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd.scope - libcontainer container 
0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd. Sep 12 17:15:07.588033 containerd[1476]: time="2025-09-12T17:15:07.587940064Z" level=info msg="StartContainer for \"0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd\" returns successfully" Sep 12 17:15:07.777854 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:15:07.778015 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Sep 12 17:15:07.997616 containerd[1476]: time="2025-09-12T17:15:07.997545582Z" level=info msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" Sep 12 17:15:08.143018 kubelet[2581]: I0912 17:15:08.142631 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nxdct" podStartSLOduration=1.778763683 podStartE2EDuration="20.142588316s" podCreationTimestamp="2025-09-12 17:14:48 +0000 UTC" firstStartedPulling="2025-09-12 17:14:49.078078113 +0000 UTC m=+30.633901904" lastFinishedPulling="2025-09-12 17:15:07.441902706 +0000 UTC m=+48.997726537" observedRunningTime="2025-09-12 17:15:08.140093079 +0000 UTC m=+49.695916870" watchObservedRunningTime="2025-09-12 17:15:08.142588316 +0000 UTC m=+49.698412107" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.210 [INFO][3834] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.211 [INFO][3834] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" iface="eth0" netns="/var/run/netns/cni-d6694899-8a33-c070-ca9f-d0162c6e1356" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.211 [INFO][3834] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" iface="eth0" netns="/var/run/netns/cni-d6694899-8a33-c070-ca9f-d0162c6e1356" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.211 [INFO][3834] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" iface="eth0" netns="/var/run/netns/cni-d6694899-8a33-c070-ca9f-d0162c6e1356" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.211 [INFO][3834] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.211 [INFO][3834] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.277 [INFO][3863] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.280 [INFO][3863] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.280 [INFO][3863] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
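Recovery starts above: the calico/node image (about 151 MB) finished pulling after roughly 7.5 s, the calico-node container started, and the kernel loaded the WireGuard module, which Calico can use for node-to-node encryption when that option is enabled. With calico-node running, /var/lib/calico/nodename exists, so the StopPodSandbox that kubelet retries for the old whisker sandbox fd1ff0fc… finally reaches the plugin's real teardown path; the trace continues below and completes successfully. For orientation, here is a hedged sketch of the CRI call kubelet is issuing, written against the CRI v1 API (k8s.io/cri-api); the containerd socket path is the usual default and an assumption about this host, and the sandbox ID is copied from the log.

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Dial containerd's CRI endpoint (default socket path; may differ per host).
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// Sandbox ID taken from the log; the CNI DEL now succeeds because
	// /var/lib/calico/nodename is present.
	_, err = client.StopPodSandbox(ctx, &runtimeapi.StopPodSandboxRequest{
		PodSandboxId: "fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d",
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("sandbox stopped; network torn down")
}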
Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.295 [WARNING][3863] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.295 [INFO][3863] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.298 [INFO][3863] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:08.304959 containerd[1476]: 2025-09-12 17:15:08.301 [INFO][3834] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:08.305654 containerd[1476]: time="2025-09-12T17:15:08.305604118Z" level=info msg="TearDown network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" successfully" Sep 12 17:15:08.305707 containerd[1476]: time="2025-09-12T17:15:08.305655679Z" level=info msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" returns successfully" Sep 12 17:15:08.394438 systemd[1]: run-netns-cni\x2dd6694899\x2d8a33\x2dc070\x2dca9f\x2dd0162c6e1356.mount: Deactivated successfully. Sep 12 17:15:08.427514 kubelet[2581]: I0912 17:15:08.426827 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-ca-bundle\") pod \"75922f2c-6510-4ef8-a348-e60d70df3666\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " Sep 12 17:15:08.427514 kubelet[2581]: I0912 17:15:08.426909 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-backend-key-pair\") pod \"75922f2c-6510-4ef8-a348-e60d70df3666\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " Sep 12 17:15:08.427514 kubelet[2581]: I0912 17:15:08.426945 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmvgm\" (UniqueName: \"kubernetes.io/projected/75922f2c-6510-4ef8-a348-e60d70df3666-kube-api-access-lmvgm\") pod \"75922f2c-6510-4ef8-a348-e60d70df3666\" (UID: \"75922f2c-6510-4ef8-a348-e60d70df3666\") " Sep 12 17:15:08.427514 kubelet[2581]: I0912 17:15:08.427435 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "75922f2c-6510-4ef8-a348-e60d70df3666" (UID: "75922f2c-6510-4ef8-a348-e60d70df3666"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 17:15:08.437728 kubelet[2581]: I0912 17:15:08.437637 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "75922f2c-6510-4ef8-a348-e60d70df3666" (UID: "75922f2c-6510-4ef8-a348-e60d70df3666"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 17:15:08.439133 systemd[1]: var-lib-kubelet-pods-75922f2c\x2d6510\x2d4ef8\x2da348\x2de60d70df3666-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dlmvgm.mount: Deactivated successfully. Sep 12 17:15:08.439285 systemd[1]: var-lib-kubelet-pods-75922f2c\x2d6510\x2d4ef8\x2da348\x2de60d70df3666-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:15:08.442148 kubelet[2581]: I0912 17:15:08.441988 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75922f2c-6510-4ef8-a348-e60d70df3666-kube-api-access-lmvgm" (OuterVolumeSpecName: "kube-api-access-lmvgm") pod "75922f2c-6510-4ef8-a348-e60d70df3666" (UID: "75922f2c-6510-4ef8-a348-e60d70df3666"). InnerVolumeSpecName "kube-api-access-lmvgm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 17:15:08.528559 kubelet[2581]: I0912 17:15:08.528386 2581 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-ca-bundle\") on node \"ci-4081-3-6-e-c5bf4513f4\" DevicePath \"\"" Sep 12 17:15:08.528559 kubelet[2581]: I0912 17:15:08.528472 2581 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/75922f2c-6510-4ef8-a348-e60d70df3666-whisker-backend-key-pair\") on node \"ci-4081-3-6-e-c5bf4513f4\" DevicePath \"\"" Sep 12 17:15:08.528559 kubelet[2581]: I0912 17:15:08.528502 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmvgm\" (UniqueName: \"kubernetes.io/projected/75922f2c-6510-4ef8-a348-e60d70df3666-kube-api-access-lmvgm\") on node \"ci-4081-3-6-e-c5bf4513f4\" DevicePath \"\"" Sep 12 17:15:08.679589 systemd[1]: Removed slice kubepods-besteffort-pod75922f2c_6510_4ef8_a348_e60d70df3666.slice - libcontainer container kubepods-besteffort-pod75922f2c_6510_4ef8_a348_e60d70df3666.slice. Sep 12 17:15:09.173417 systemd[1]: Created slice kubepods-besteffort-pod5855899d_e66c_447d_85fb_a67eba074898.slice - libcontainer container kubepods-besteffort-pod5855899d_e66c_447d_85fb_a67eba074898.slice. 
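With the delete done, the old whisker pod (UID 75922f2c-6510-4ef8-a348-e60d70df3666) is fully dismantled: its ca-bundle ConfigMap, backend key-pair Secret, and projected service-account token are unmounted and marked detached, its cgroup slice is removed, and the Deployment's replacement pod whisker-579586b5b-88x9v (UID 5855899d-…) gets a fresh slice next. The \x2d and \x7e runs in the mount-unit names above are systemd's unit-name escaping of filesystem paths; the following small Go approximation of what systemd-escape --path does is illustrative only (real systemd also special-cases details such as a leading dot, which this sketch ignores).

package main

import (
	"fmt"
	"strings"
)

// systemdEscapePath approximates systemd's path escaping: '/' becomes '-',
// and bytes outside [a-zA-Z0-9:_.] are written as \xNN hex escapes.
func systemdEscapePath(p string) string {
	p = strings.Trim(p, "/")
	var b strings.Builder
	for i := 0; i < len(p); i++ {
		c := p[i]
		switch {
		case c == '/':
			b.WriteByte('-')
		case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z',
			c >= '0' && c <= '9', c == ':', c == '_', c == '.':
			b.WriteByte(c)
		default:
			fmt.Fprintf(&b, "\\x%02x", c)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(systemdEscapePath(
		"/var/lib/kubelet/pods/75922f2c-6510-4ef8-a348-e60d70df3666/volumes/kubernetes.io~projected/kube-api-access-lmvgm"))
}

Run on the projected-token volume path, this prints the prefix of the kube-api-access mount unit seen above: '/' becomes '-', while literal '-' and '~' become \x2d and \x7e.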
Sep 12 17:15:09.233882 kubelet[2581]: I0912 17:15:09.233631 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5855899d-e66c-447d-85fb-a67eba074898-whisker-backend-key-pair\") pod \"whisker-579586b5b-88x9v\" (UID: \"5855899d-e66c-447d-85fb-a67eba074898\") " pod="calico-system/whisker-579586b5b-88x9v" Sep 12 17:15:09.233882 kubelet[2581]: I0912 17:15:09.233757 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5855899d-e66c-447d-85fb-a67eba074898-whisker-ca-bundle\") pod \"whisker-579586b5b-88x9v\" (UID: \"5855899d-e66c-447d-85fb-a67eba074898\") " pod="calico-system/whisker-579586b5b-88x9v" Sep 12 17:15:09.233882 kubelet[2581]: I0912 17:15:09.233788 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svz4w\" (UniqueName: \"kubernetes.io/projected/5855899d-e66c-447d-85fb-a67eba074898-kube-api-access-svz4w\") pod \"whisker-579586b5b-88x9v\" (UID: \"5855899d-e66c-447d-85fb-a67eba074898\") " pod="calico-system/whisker-579586b5b-88x9v" Sep 12 17:15:09.491803 containerd[1476]: time="2025-09-12T17:15:09.491698527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-579586b5b-88x9v,Uid:5855899d-e66c-447d-85fb-a67eba074898,Namespace:calico-system,Attempt:0,}" Sep 12 17:15:09.793042 systemd-networkd[1376]: califc4387ce29e: Link UP Sep 12 17:15:09.799182 systemd-networkd[1376]: califc4387ce29e: Gained carrier Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.572 [INFO][3907] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.603 [INFO][3907] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0 whisker-579586b5b- calico-system 5855899d-e66c-447d-85fb-a67eba074898 914 0 2025-09-12 17:15:09 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:579586b5b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 whisker-579586b5b-88x9v eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] califc4387ce29e [] [] }} ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-" Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.604 [INFO][3907] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0" Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.677 [INFO][3956] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" HandleID="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0" Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.678 [INFO][3956] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" HandleID="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034eed0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"whisker-579586b5b-88x9v", "timestamp":"2025-09-12 17:15:09.677889976 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.678 [INFO][3956] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.678 [INFO][3956] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.678 [INFO][3956] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4'
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.695 [INFO][3956] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.706 [INFO][3956] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.719 [INFO][3956] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.723 [INFO][3956] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.730 [INFO][3956] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.730 [INFO][3956] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.733 [INFO][3956] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.748 [INFO][3956] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.766 [INFO][3956] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.193/26] block=192.168.7.192/26 handle="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.766 [INFO][3956] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.193/26] handle="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.766 [INFO][3956] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:09.842695 containerd[1476]: 2025-09-12 17:15:09.766 [INFO][3956] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.193/26] IPv6=[] ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" HandleID="k8s-pod-network.eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0"
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.773 [INFO][3907] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0", GenerateName:"whisker-579586b5b-", Namespace:"calico-system", SelfLink:"", UID:"5855899d-e66c-447d-85fb-a67eba074898", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 15, 9, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"579586b5b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"whisker-579586b5b-88x9v", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.7.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califc4387ce29e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.773 [INFO][3907] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.193/32] ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0"
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.774 [INFO][3907] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc4387ce29e ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0"
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.807 [INFO][3907] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0"
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.807 [INFO][3907] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0", GenerateName:"whisker-579586b5b-", Namespace:"calico-system", SelfLink:"", UID:"5855899d-e66c-447d-85fb-a67eba074898", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 15, 9, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"579586b5b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32", Pod:"whisker-579586b5b-88x9v", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.7.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califc4387ce29e", MAC:"ce:cf:3a:ff:a8:58", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:09.845360 containerd[1476]: 2025-09-12 17:15:09.835 [INFO][3907] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32" Namespace="calico-system" Pod="whisker-579586b5b-88x9v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--579586b5b--88x9v-eth0"
Sep 12 17:15:09.879751 containerd[1476]: time="2025-09-12T17:15:09.879095365Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:15:09.879751 containerd[1476]: time="2025-09-12T17:15:09.879649933Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:15:09.879751 containerd[1476]: time="2025-09-12T17:15:09.879668933Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:09.881208 containerd[1476]: time="2025-09-12T17:15:09.880466465Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:09.934284 systemd[1]: Started cri-containerd-eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32.scope - libcontainer container eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32.
Sep 12 17:15:10.052742 containerd[1476]: time="2025-09-12T17:15:10.050686224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-579586b5b-88x9v,Uid:5855899d-e66c-447d-85fb-a67eba074898,Namespace:calico-system,Attempt:0,} returns sandbox id \"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32\""
Sep 12 17:15:10.076135 containerd[1476]: time="2025-09-12T17:15:10.074316163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\""
Sep 12 17:15:10.666728 kubelet[2581]: I0912 17:15:10.666650 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75922f2c-6510-4ef8-a348-e60d70df3666" path="/var/lib/kubelet/pods/75922f2c-6510-4ef8-a348-e60d70df3666/volumes"
Sep 12 17:15:11.174255 systemd-networkd[1376]: califc4387ce29e: Gained IPv6LL
Sep 12 17:15:11.621396 containerd[1476]: time="2025-09-12T17:15:11.621311387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:11.623268 containerd[1476]: time="2025-09-12T17:15:11.623200093Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606"
Sep 12 17:15:11.624618 containerd[1476]: time="2025-09-12T17:15:11.624233108Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:11.631463 containerd[1476]: time="2025-09-12T17:15:11.631390888Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:11.632312 containerd[1476]: time="2025-09-12T17:15:11.632255501Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.556447955s"
Sep 12 17:15:11.632312 containerd[1476]: time="2025-09-12T17:15:11.632308581Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\""
Sep 12 17:15:11.643145 containerd[1476]: time="2025-09-12T17:15:11.642899730Z" level=info msg="CreateContainer within sandbox \"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Sep 12 17:15:11.658262 containerd[1476]: time="2025-09-12T17:15:11.657730379Z" level=info msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\""
Sep 12 17:15:11.676674 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2688643167.mount: Deactivated successfully.
Sep 12 17:15:11.686005 containerd[1476]: time="2025-09-12T17:15:11.684221432Z" level=info msg="CreateContainer within sandbox \"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"61a3ed1e0adb6e82a9f84db51b371d20e207bb8249c4813211ad7443a7db82f8\""
Sep 12 17:15:11.686859 containerd[1476]: time="2025-09-12T17:15:11.686508784Z" level=info msg="StartContainer for \"61a3ed1e0adb6e82a9f84db51b371d20e207bb8249c4813211ad7443a7db82f8\""
Sep 12 17:15:11.759403 systemd[1]: Started cri-containerd-61a3ed1e0adb6e82a9f84db51b371d20e207bb8249c4813211ad7443a7db82f8.scope - libcontainer container 61a3ed1e0adb6e82a9f84db51b371d20e207bb8249c4813211ad7443a7db82f8.
Sep 12 17:15:11.835736 containerd[1476]: time="2025-09-12T17:15:11.835397478Z" level=info msg="StartContainer for \"61a3ed1e0adb6e82a9f84db51b371d20e207bb8249c4813211ad7443a7db82f8\" returns successfully"
Sep 12 17:15:11.840349 containerd[1476]: time="2025-09-12T17:15:11.840145785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.772 [INFO][4103] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.773 [INFO][4103] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" iface="eth0" netns="/var/run/netns/cni-1705db14-5bfa-9953-7feb-17ab6bae780b"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.774 [INFO][4103] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" iface="eth0" netns="/var/run/netns/cni-1705db14-5bfa-9953-7feb-17ab6bae780b"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.774 [INFO][4103] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" iface="eth0" netns="/var/run/netns/cni-1705db14-5bfa-9953-7feb-17ab6bae780b"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.774 [INFO][4103] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.774 [INFO][4103] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.811 [INFO][4135] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.812 [INFO][4135] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.812 [INFO][4135] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.830 [WARNING][4135] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.830 [INFO][4135] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.833 [INFO][4135] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:11.843535 containerd[1476]: 2025-09-12 17:15:11.839 [INFO][4103] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982"
Sep 12 17:15:11.847059 containerd[1476]: time="2025-09-12T17:15:11.845020253Z" level=info msg="TearDown network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" successfully"
Sep 12 17:15:11.847059 containerd[1476]: time="2025-09-12T17:15:11.845072774Z" level=info msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" returns successfully"
Sep 12 17:15:11.848501 containerd[1476]: time="2025-09-12T17:15:11.848433061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cmgr5,Uid:a5670fa1-14ff-4b4b-93e9-778888c14647,Namespace:calico-system,Attempt:1,}"
Sep 12 17:15:11.852385 systemd[1]: run-netns-cni\x2d1705db14\x2d5bfa\x2d9953\x2d7feb\x2d17ab6bae780b.mount: Deactivated successfully.
Sep 12 17:15:12.045020 systemd-networkd[1376]: calidb37d61c2c4: Link UP
Sep 12 17:15:12.047925 systemd-networkd[1376]: calidb37d61c2c4: Gained carrier
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.912 [INFO][4154] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.931 [INFO][4154] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0 csi-node-driver- calico-system a5670fa1-14ff-4b4b-93e9-778888c14647 929 0 2025-09-12 17:14:48 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 csi-node-driver-cmgr5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidb37d61c2c4 [] [] <nil>}} ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.932 [INFO][4154] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.974 [INFO][4168] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" HandleID="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.974 [INFO][4168] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" HandleID="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"csi-node-driver-cmgr5", "timestamp":"2025-09-12 17:15:11.974546435 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.974 [INFO][4168] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.974 [INFO][4168] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.975 [INFO][4168] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4'
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.989 [INFO][4168] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:11.997 [INFO][4168] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.004 [INFO][4168] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.008 [INFO][4168] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.012 [INFO][4168] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.012 [INFO][4168] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.015 [INFO][4168] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.026 [INFO][4168] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.036 [INFO][4168] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.194/26] block=192.168.7.192/26 handle="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.036 [INFO][4168] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.194/26] handle="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.036 [INFO][4168] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:12.084087 containerd[1476]: 2025-09-12 17:15:12.036 [INFO][4168] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.194/26] IPv6=[] ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" HandleID="k8s-pod-network.23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.041 [INFO][4154] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5670fa1-14ff-4b4b-93e9-778888c14647", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"csi-node-driver-cmgr5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb37d61c2c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.041 [INFO][4154] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.194/32] ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.041 [INFO][4154] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb37d61c2c4 ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.046 [INFO][4154] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.049 [INFO][4154] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5670fa1-14ff-4b4b-93e9-778888c14647", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666", Pod:"csi-node-driver-cmgr5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb37d61c2c4", MAC:"c6:3e:99:b1:94:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:12.085463 containerd[1476]: 2025-09-12 17:15:12.076 [INFO][4154] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666" Namespace="calico-system" Pod="csi-node-driver-cmgr5" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0"
Sep 12 17:15:12.118439 containerd[1476]: time="2025-09-12T17:15:12.117085449Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:15:12.118439 containerd[1476]: time="2025-09-12T17:15:12.117193690Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:15:12.118439 containerd[1476]: time="2025-09-12T17:15:12.117209290Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:12.118439 containerd[1476]: time="2025-09-12T17:15:12.117340732Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:12.145248 systemd[1]: Started cri-containerd-23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666.scope - libcontainer container 23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666.
Sep 12 17:15:12.184879 containerd[1476]: time="2025-09-12T17:15:12.184770222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cmgr5,Uid:a5670fa1-14ff-4b4b-93e9-778888c14647,Namespace:calico-system,Attempt:1,} returns sandbox id \"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666\""
Sep 12 17:15:12.657055 containerd[1476]: time="2025-09-12T17:15:12.656071725Z" level=info msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\""
Sep 12 17:15:12.657055 containerd[1476]: time="2025-09-12T17:15:12.656703694Z" level=info msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\""
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.790 [INFO][4263] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.791 [INFO][4263] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" iface="eth0" netns="/var/run/netns/cni-2bbe1a34-01a6-e0c8-e43d-de9b8284fc18"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.791 [INFO][4263] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" iface="eth0" netns="/var/run/netns/cni-2bbe1a34-01a6-e0c8-e43d-de9b8284fc18"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.795 [INFO][4263] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" iface="eth0" netns="/var/run/netns/cni-2bbe1a34-01a6-e0c8-e43d-de9b8284fc18"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.796 [INFO][4263] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.796 [INFO][4263] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.833 [INFO][4273] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.834 [INFO][4273] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.834 [INFO][4273] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.850 [WARNING][4273] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.850 [INFO][4273] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.854 [INFO][4273] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:12.861263 containerd[1476]: 2025-09-12 17:15:12.858 [INFO][4263] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b"
Sep 12 17:15:12.866087 containerd[1476]: time="2025-09-12T17:15:12.863910953Z" level=info msg="TearDown network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" successfully"
Sep 12 17:15:12.866087 containerd[1476]: time="2025-09-12T17:15:12.863996354Z" level=info msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" returns successfully"
Sep 12 17:15:12.866505 systemd[1]: run-netns-cni\x2d2bbe1a34\x2d01a6\x2de0c8\x2de43d\x2dde9b8284fc18.mount: Deactivated successfully.
Sep 12 17:15:12.870742 containerd[1476]: time="2025-09-12T17:15:12.869453869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-96m69,Uid:829c1f4c-85d8-405a-a4f5-f939b129360b,Namespace:calico-system,Attempt:1,}"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.787 [INFO][4259] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.792 [INFO][4259] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" iface="eth0" netns="/var/run/netns/cni-265ef70c-29e0-897e-d7e4-a7586a354303"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.794 [INFO][4259] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" iface="eth0" netns="/var/run/netns/cni-265ef70c-29e0-897e-d7e4-a7586a354303"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.801 [INFO][4259] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" iface="eth0" netns="/var/run/netns/cni-265ef70c-29e0-897e-d7e4-a7586a354303"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.801 [INFO][4259] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.802 [INFO][4259] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.836 [INFO][4278] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.837 [INFO][4278] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.854 [INFO][4278] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.880 [WARNING][4278] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.880 [INFO][4278] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.888 [INFO][4278] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:12.902986 containerd[1476]: 2025-09-12 17:15:12.893 [INFO][4259] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4"
Sep 12 17:15:12.905121 containerd[1476]: time="2025-09-12T17:15:12.904966799Z" level=info msg="TearDown network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" successfully"
Sep 12 17:15:12.906608 systemd[1]: run-netns-cni\x2d265ef70c\x2d29e0\x2d897e\x2dd7e4\x2da7586a354303.mount: Deactivated successfully.
Sep 12 17:15:12.909640 containerd[1476]: time="2025-09-12T17:15:12.909317979Z" level=info msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" returns successfully"
Sep 12 17:15:12.912350 containerd[1476]: time="2025-09-12T17:15:12.911336967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bfcb5d966-vxpgp,Uid:7997260a-d514-4a5c-b1af-8a79861928e5,Namespace:calico-system,Attempt:1,}"
Sep 12 17:15:13.203764 systemd-networkd[1376]: cali2dad3e17c21: Link UP
Sep 12 17:15:13.207221 systemd-networkd[1376]: cali2dad3e17c21: Gained carrier
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:12.982 [INFO][4287] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.014 [INFO][4287] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0 goldmane-54d579b49d- calico-system 829c1f4c-85d8-405a-a4f5-f939b129360b 941 0 2025-09-12 17:14:48 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 goldmane-54d579b49d-96m69 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2dad3e17c21 [] [] <nil>}} ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.014 [INFO][4287] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.086 [INFO][4309] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" HandleID="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.087 [INFO][4309] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" HandleID="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"goldmane-54d579b49d-96m69", "timestamp":"2025-09-12 17:15:13.086773606 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.087 [INFO][4309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.087 [INFO][4309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.087 [INFO][4309] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4'
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.117 [INFO][4309] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.130 [INFO][4309] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.143 [INFO][4309] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.148 [INFO][4309] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.153 [INFO][4309] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.153 [INFO][4309] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.156 [INFO][4309] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.168 [INFO][4309] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.183 [INFO][4309] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.195/26] block=192.168.7.192/26 handle="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.183 [INFO][4309] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.195/26] handle="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.183 [INFO][4309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:13.251116 containerd[1476]: 2025-09-12 17:15:13.184 [INFO][4309] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.195/26] IPv6=[] ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" HandleID="k8s-pod-network.aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.187 [INFO][4287] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"829c1f4c-85d8-405a-a4f5-f939b129360b", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"goldmane-54d579b49d-96m69", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.7.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2dad3e17c21", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.188 [INFO][4287] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.195/32] ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.188 [INFO][4287] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2dad3e17c21 ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.220 [INFO][4287] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.223 [INFO][4287] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"829c1f4c-85d8-405a-a4f5-f939b129360b", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a", Pod:"goldmane-54d579b49d-96m69", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.7.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2dad3e17c21", MAC:"f6:7e:32:4f:cd:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:13.253201 containerd[1476]: 2025-09-12 17:15:13.247 [INFO][4287] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a" Namespace="calico-system" Pod="goldmane-54d579b49d-96m69" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0"
Sep 12 17:15:13.286129 systemd-networkd[1376]: calidb37d61c2c4: Gained IPv6LL
Sep 12 17:15:13.295523 containerd[1476]: time="2025-09-12T17:15:13.294117375Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:15:13.295523 containerd[1476]: time="2025-09-12T17:15:13.294239656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:15:13.295523 containerd[1476]: time="2025-09-12T17:15:13.294253416Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:13.295523 containerd[1476]: time="2025-09-12T17:15:13.294406539Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:13.325348 systemd[1]: Started cri-containerd-aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a.scope - libcontainer container aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a.
Sep 12 17:15:13.406587 systemd-networkd[1376]: cali0cbdf3dae2c: Link UP
Sep 12 17:15:13.414336 systemd-networkd[1376]: cali0cbdf3dae2c: Gained carrier
Sep 12 17:15:13.418672 containerd[1476]: time="2025-09-12T17:15:13.418617221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-96m69,Uid:829c1f4c-85d8-405a-a4f5-f939b129360b,Namespace:calico-system,Attempt:1,} returns sandbox id \"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a\""
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.010 [INFO][4295] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.051 [INFO][4295] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0 calico-kube-controllers-bfcb5d966- calico-system 7997260a-d514-4a5c-b1af-8a79861928e5 940 0 2025-09-12 17:14:49 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:bfcb5d966 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 calico-kube-controllers-bfcb5d966-vxpgp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0cbdf3dae2c [] [] <nil>}} ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.051 [INFO][4295] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.133 [INFO][4314] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" HandleID="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.134 [INFO][4314] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" HandleID="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d740), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"calico-kube-controllers-bfcb5d966-vxpgp", "timestamp":"2025-09-12 17:15:13.13357792 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.134 [INFO][4314] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.184 [INFO][4314] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.184 [INFO][4314] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4'
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.225 [INFO][4314] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.333 [INFO][4314] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.345 [INFO][4314] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.352 [INFO][4314] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.357 [INFO][4314] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.357 [INFO][4314] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.361 [INFO][4314] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.372 [INFO][4314] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.386 [INFO][4314] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.196/26] block=192.168.7.192/26 handle="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.386 [INFO][4314] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.196/26] handle="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" host="ci-4081-3-6-e-c5bf4513f4"
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.387 [INFO][4314] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:15:13.440260 containerd[1476]: 2025-09-12 17:15:13.387 [INFO][4314] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.196/26] IPv6=[] ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" HandleID="k8s-pod-network.c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.390 [INFO][4295] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0", GenerateName:"calico-kube-controllers-bfcb5d966-", Namespace:"calico-system", SelfLink:"", UID:"7997260a-d514-4a5c-b1af-8a79861928e5", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 49, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bfcb5d966", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"calico-kube-controllers-bfcb5d966-vxpgp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0cbdf3dae2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.391 [INFO][4295] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.196/32] ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.391 [INFO][4295] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0cbdf3dae2c ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.417 [INFO][4295] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.417 [INFO][4295] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0", GenerateName:"calico-kube-controllers-bfcb5d966-", Namespace:"calico-system", SelfLink:"", UID:"7997260a-d514-4a5c-b1af-8a79861928e5", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 49, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bfcb5d966", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882", Pod:"calico-kube-controllers-bfcb5d966-vxpgp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0cbdf3dae2c", MAC:"62:ba:ae:46:fe:1e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:15:13.441217 containerd[1476]: 2025-09-12 17:15:13.434 [INFO][4295] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882" Namespace="calico-system" Pod="calico-kube-controllers-bfcb5d966-vxpgp" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0"
Sep 12 17:15:13.469916 containerd[1476]: time="2025-09-12T17:15:13.467643645Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:15:13.469916 containerd[1476]: time="2025-09-12T17:15:13.467749807Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:15:13.469916 containerd[1476]: time="2025-09-12T17:15:13.467774487Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:13.469916 containerd[1476]: time="2025-09-12T17:15:13.468012850Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:15:13.491446 systemd[1]: Started cri-containerd-c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882.scope - libcontainer container c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882.
Sep 12 17:15:13.543164 containerd[1476]: time="2025-09-12T17:15:13.543093067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bfcb5d966-vxpgp,Uid:7997260a-d514-4a5c-b1af-8a79861928e5,Namespace:calico-system,Attempt:1,} returns sandbox id \"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882\""
Sep 12 17:15:13.656729 containerd[1476]: time="2025-09-12T17:15:13.656230640Z" level=info msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\""
Sep 12 17:15:13.670133 containerd[1476]: time="2025-09-12T17:15:13.670065147Z" level=info msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\""
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.860 [INFO][4440] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.861 [INFO][4440] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" iface="eth0" netns="/var/run/netns/cni-f0c6712d-1414-1b1e-575d-b6d82cf1a0d5"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.862 [INFO][4440] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" iface="eth0" netns="/var/run/netns/cni-f0c6712d-1414-1b1e-575d-b6d82cf1a0d5"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.863 [INFO][4440] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" iface="eth0" netns="/var/run/netns/cni-f0c6712d-1414-1b1e-575d-b6d82cf1a0d5"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.863 [INFO][4440] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.863 [INFO][4440] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.918 [INFO][4465] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0"
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.918 [INFO][4465] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.919 [INFO][4465] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.934 [WARNING][4465] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist.
Ignoring ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.935 [INFO][4465] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.941 [INFO][4465] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:13.952728 containerd[1476]: 2025-09-12 17:15:13.946 [INFO][4440] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:13.957021 containerd[1476]: time="2025-09-12T17:15:13.955186369Z" level=info msg="TearDown network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" successfully" Sep 12 17:15:13.957021 containerd[1476]: time="2025-09-12T17:15:13.955257250Z" level=info msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" returns successfully" Sep 12 17:15:13.960564 systemd[1]: run-netns-cni\x2df0c6712d\x2d1414\x2d1b1e\x2d575d\x2db6d82cf1a0d5.mount: Deactivated successfully. Sep 12 17:15:13.965456 containerd[1476]: time="2025-09-12T17:15:13.964730139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-zt2zj,Uid:5cfa2648-c93e-445e-9e1f-6da367fb2890,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.879 [INFO][4454] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.881 [INFO][4454] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" iface="eth0" netns="/var/run/netns/cni-129dd123-a500-d418-008c-c0c399ed9d39" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.882 [INFO][4454] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" iface="eth0" netns="/var/run/netns/cni-129dd123-a500-d418-008c-c0c399ed9d39" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.884 [INFO][4454] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" iface="eth0" netns="/var/run/netns/cni-129dd123-a500-d418-008c-c0c399ed9d39" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.884 [INFO][4454] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.884 [INFO][4454] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.966 [INFO][4470] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.967 [INFO][4470] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.967 [INFO][4470] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.994 [WARNING][4470] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.994 [INFO][4470] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:13.999 [INFO][4470] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:14.024741 containerd[1476]: 2025-09-12 17:15:14.010 [INFO][4454] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:14.031282 kubelet[2581]: I0912 17:15:14.030725 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:15:14.036192 containerd[1476]: time="2025-09-12T17:15:14.033011776Z" level=info msg="TearDown network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" successfully" Sep 12 17:15:14.036192 containerd[1476]: time="2025-09-12T17:15:14.033080577Z" level=info msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" returns successfully" Sep 12 17:15:14.034643 systemd[1]: run-netns-cni\x2d129dd123\x2da500\x2dd418\x2d008c\x2dc0c399ed9d39.mount: Deactivated successfully. 
Sep 12 17:15:14.046112 containerd[1476]: time="2025-09-12T17:15:14.044406208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tcms6,Uid:01137908-9b3f-43f3-895b-bb6ad5c520d9,Namespace:kube-system,Attempt:1,}" Sep 12 17:15:14.301598 systemd-networkd[1376]: calia38d68c3e10: Link UP Sep 12 17:15:14.302600 systemd-networkd[1376]: calia38d68c3e10: Gained carrier Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.106 [INFO][4479] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.157 [INFO][4479] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0 calico-apiserver-6dcc7b598- calico-apiserver 5cfa2648-c93e-445e-9e1f-6da367fb2890 953 0 2025-09-12 17:14:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dcc7b598 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 calico-apiserver-6dcc7b598-zt2zj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia38d68c3e10 [] [] }} ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.157 [INFO][4479] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.210 [INFO][4505] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" HandleID="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.210 [INFO][4505] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" HandleID="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"calico-apiserver-6dcc7b598-zt2zj", "timestamp":"2025-09-12 17:15:14.210505859 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.210 [INFO][4505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.211 [INFO][4505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.211 [INFO][4505] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4' Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.229 [INFO][4505] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.237 [INFO][4505] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.250 [INFO][4505] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.254 [INFO][4505] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.258 [INFO][4505] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.258 [INFO][4505] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.261 [INFO][4505] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.269 [INFO][4505] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.283 [INFO][4505] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.197/26] block=192.168.7.192/26 handle="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.283 [INFO][4505] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.197/26] handle="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.284 [INFO][4505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:14.337130 containerd[1476]: 2025-09-12 17:15:14.284 [INFO][4505] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.197/26] IPv6=[] ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" HandleID="k8s-pod-network.f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.294 [INFO][4479] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cfa2648-c93e-445e-9e1f-6da367fb2890", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"calico-apiserver-6dcc7b598-zt2zj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia38d68c3e10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.295 [INFO][4479] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.197/32] ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.295 [INFO][4479] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia38d68c3e10 ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.300 [INFO][4479] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.312 
[INFO][4479] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cfa2648-c93e-445e-9e1f-6da367fb2890", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae", Pod:"calico-apiserver-6dcc7b598-zt2zj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia38d68c3e10", MAC:"82:cb:1e:dd:a1:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.340216 containerd[1476]: 2025-09-12 17:15:14.331 [INFO][4479] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-zt2zj" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:14.398924 containerd[1476]: time="2025-09-12T17:15:14.397515908Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:14.398924 containerd[1476]: time="2025-09-12T17:15:14.397621229Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:14.398924 containerd[1476]: time="2025-09-12T17:15:14.397649310Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.398924 containerd[1476]: time="2025-09-12T17:15:14.397793712Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.437200 systemd[1]: Started cri-containerd-f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae.scope - libcontainer container f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae. 
Sep 12 17:15:14.441931 systemd-networkd[1376]: calie4626d7d07a: Link UP Sep 12 17:15:14.445068 systemd-networkd[1376]: calie4626d7d07a: Gained carrier Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.169 [INFO][4490] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.189 [INFO][4490] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0 coredns-674b8bbfcf- kube-system 01137908-9b3f-43f3-895b-bb6ad5c520d9 954 0 2025-09-12 17:14:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 coredns-674b8bbfcf-tcms6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie4626d7d07a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.189 [INFO][4490] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.248 [INFO][4512] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" HandleID="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.249 [INFO][4512] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" HandleID="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d36e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"coredns-674b8bbfcf-tcms6", "timestamp":"2025-09-12 17:15:14.24892421 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.249 [INFO][4512] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.283 [INFO][4512] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.284 [INFO][4512] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4' Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.336 [INFO][4512] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.359 [INFO][4512] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.374 [INFO][4512] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.381 [INFO][4512] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.388 [INFO][4512] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.388 [INFO][4512] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.393 [INFO][4512] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423 Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.402 [INFO][4512] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.417 [INFO][4512] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.198/26] block=192.168.7.192/26 handle="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.418 [INFO][4512] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.198/26] handle="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.418 [INFO][4512] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:14.484382 containerd[1476]: 2025-09-12 17:15:14.419 [INFO][4512] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.198/26] IPv6=[] ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" HandleID="k8s-pod-network.f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.427 [INFO][4490] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01137908-9b3f-43f3-895b-bb6ad5c520d9", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"coredns-674b8bbfcf-tcms6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4626d7d07a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.427 [INFO][4490] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.198/32] ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.427 [INFO][4490] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie4626d7d07a ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.446 [INFO][4490] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.452 [INFO][4490] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01137908-9b3f-43f3-895b-bb6ad5c520d9", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423", Pod:"coredns-674b8bbfcf-tcms6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4626d7d07a", MAC:"ea:f5:0a:57:61:9a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.485550 containerd[1476]: 2025-09-12 17:15:14.479 [INFO][4490] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423" Namespace="kube-system" Pod="coredns-674b8bbfcf-tcms6" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:14.502036 systemd-networkd[1376]: cali2dad3e17c21: Gained IPv6LL Sep 12 17:15:14.540872 containerd[1476]: time="2025-09-12T17:15:14.540316169Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:14.540872 containerd[1476]: time="2025-09-12T17:15:14.540783175Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:14.541394 containerd[1476]: time="2025-09-12T17:15:14.541142700Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.543532 containerd[1476]: time="2025-09-12T17:15:14.542867243Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.582132 systemd[1]: Started cri-containerd-f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423.scope - libcontainer container f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423. Sep 12 17:15:14.645060 containerd[1476]: time="2025-09-12T17:15:14.644905041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-zt2zj,Uid:5cfa2648-c93e-445e-9e1f-6da367fb2890,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae\"" Sep 12 17:15:14.658374 containerd[1476]: time="2025-09-12T17:15:14.657571209Z" level=info msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" Sep 12 17:15:14.685441 containerd[1476]: time="2025-09-12T17:15:14.684784972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tcms6,Uid:01137908-9b3f-43f3-895b-bb6ad5c520d9,Namespace:kube-system,Attempt:1,} returns sandbox id \"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423\"" Sep 12 17:15:14.713731 containerd[1476]: time="2025-09-12T17:15:14.713168749Z" level=info msg="CreateContainer within sandbox \"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:15:14.760731 containerd[1476]: time="2025-09-12T17:15:14.759636408Z" level=info msg="CreateContainer within sandbox \"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8350d5f728a06eb533f24d9413d52850cc5517d71bfa579d068d17034d5b600c\"" Sep 12 17:15:14.762920 containerd[1476]: time="2025-09-12T17:15:14.762247283Z" level=info msg="StartContainer for \"8350d5f728a06eb533f24d9413d52850cc5517d71bfa579d068d17034d5b600c\"" Sep 12 17:15:14.858682 systemd[1]: Started cri-containerd-8350d5f728a06eb533f24d9413d52850cc5517d71bfa579d068d17034d5b600c.scope - libcontainer container 8350d5f728a06eb533f24d9413d52850cc5517d71bfa579d068d17034d5b600c. Sep 12 17:15:14.887048 systemd-networkd[1376]: cali0cbdf3dae2c: Gained IPv6LL Sep 12 17:15:14.986922 containerd[1476]: time="2025-09-12T17:15:14.986671070Z" level=info msg="StartContainer for \"8350d5f728a06eb533f24d9413d52850cc5517d71bfa579d068d17034d5b600c\" returns successfully" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.875 [INFO][4627] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.878 [INFO][4627] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" iface="eth0" netns="/var/run/netns/cni-d53e55dc-7943-3e30-8013-562f98ed6ec5" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.878 [INFO][4627] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" iface="eth0" netns="/var/run/netns/cni-d53e55dc-7943-3e30-8013-562f98ed6ec5" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.880 [INFO][4627] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" iface="eth0" netns="/var/run/netns/cni-d53e55dc-7943-3e30-8013-562f98ed6ec5" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.880 [INFO][4627] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.880 [INFO][4627] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.984 [INFO][4664] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.984 [INFO][4664] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:14.984 [INFO][4664] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:15.019 [WARNING][4664] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:15.019 [INFO][4664] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:15.028 [INFO][4664] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:15.044963 containerd[1476]: 2025-09-12 17:15:15.033 [INFO][4627] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:15.064381 containerd[1476]: time="2025-09-12T17:15:15.064306089Z" level=info msg="TearDown network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" successfully" Sep 12 17:15:15.065414 containerd[1476]: time="2025-09-12T17:15:15.065370943Z" level=info msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" returns successfully" Sep 12 17:15:15.067740 containerd[1476]: time="2025-09-12T17:15:15.067632253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tgfmg,Uid:c0d7b056-4972-4c60-a388-fd78bc10058f,Namespace:kube-system,Attempt:1,}" Sep 12 17:15:15.207509 kubelet[2581]: I0912 17:15:15.206791 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-tcms6" podStartSLOduration=50.206763674 podStartE2EDuration="50.206763674s" podCreationTimestamp="2025-09-12 17:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:15:15.205247934 +0000 UTC m=+56.761071725" watchObservedRunningTime="2025-09-12 17:15:15.206763674 +0000 UTC m=+56.762587465" Sep 12 17:15:15.642610 systemd-networkd[1376]: calib085a15dc0a: Link UP Sep 12 17:15:15.645652 systemd-networkd[1376]: calib085a15dc0a: Gained carrier Sep 12 17:15:15.656449 containerd[1476]: time="2025-09-12T17:15:15.653553162Z" level=info msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" Sep 12 17:15:15.671575 systemd[1]: run-netns-cni\x2dd53e55dc\x2d7943\x2d3e30\x2d8013\x2d562f98ed6ec5.mount: Deactivated successfully. Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.274 [INFO][4694] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.362 [INFO][4694] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0 coredns-674b8bbfcf- kube-system c0d7b056-4972-4c60-a388-fd78bc10058f 974 0 2025-09-12 17:14:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 coredns-674b8bbfcf-tgfmg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib085a15dc0a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.363 [INFO][4694] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.448 [INFO][4709] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" HandleID="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" 
Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.449 [INFO][4709] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" HandleID="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3650), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"coredns-674b8bbfcf-tgfmg", "timestamp":"2025-09-12 17:15:15.447996792 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.450 [INFO][4709] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.450 [INFO][4709] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.450 [INFO][4709] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4' Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.480 [INFO][4709] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.505 [INFO][4709] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.535 [INFO][4709] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.564 [INFO][4709] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.574 [INFO][4709] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.574 [INFO][4709] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.583 [INFO][4709] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6 Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.598 [INFO][4709] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.617 [INFO][4709] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.199/26] block=192.168.7.192/26 handle="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.617 [INFO][4709] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.199/26] 
handle="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.617 [INFO][4709] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:15.708944 containerd[1476]: 2025-09-12 17:15:15.617 [INFO][4709] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.199/26] IPv6=[] ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" HandleID="k8s-pod-network.a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.626 [INFO][4694] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0d7b056-4972-4c60-a388-fd78bc10058f", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"coredns-674b8bbfcf-tgfmg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib085a15dc0a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.626 [INFO][4694] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.199/32] ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.626 [INFO][4694] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib085a15dc0a ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" 
WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.645 [INFO][4694] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.648 [INFO][4694] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0d7b056-4972-4c60-a388-fd78bc10058f", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6", Pod:"coredns-674b8bbfcf-tgfmg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib085a15dc0a", MAC:"46:93:56:eb:db:99", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:15.711280 containerd[1476]: 2025-09-12 17:15:15.698 [INFO][4694] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6" Namespace="kube-system" Pod="coredns-674b8bbfcf-tgfmg" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:15.784067 systemd-networkd[1376]: calie4626d7d07a: Gained IPv6LL Sep 12 17:15:15.833016 containerd[1476]: time="2025-09-12T17:15:15.832544025Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:15.833016 containerd[1476]: time="2025-09-12T17:15:15.832644707Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:15.833016 containerd[1476]: time="2025-09-12T17:15:15.832661867Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:15.836943 containerd[1476]: time="2025-09-12T17:15:15.835853629Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:15.942887 systemd[1]: Started cri-containerd-a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6.scope - libcontainer container a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6. Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.848 [INFO][4737] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.852 [INFO][4737] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" iface="eth0" netns="/var/run/netns/cni-d97d4b49-3a15-242c-2627-558e6227badf" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.852 [INFO][4737] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" iface="eth0" netns="/var/run/netns/cni-d97d4b49-3a15-242c-2627-558e6227badf" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.853 [INFO][4737] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" iface="eth0" netns="/var/run/netns/cni-d97d4b49-3a15-242c-2627-558e6227badf" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.853 [INFO][4737] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.853 [INFO][4737] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.907 [INFO][4770] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.908 [INFO][4770] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.908 [INFO][4770] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.967 [WARNING][4770] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.967 [INFO][4770] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.978 [INFO][4770] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:15.988274 containerd[1476]: 2025-09-12 17:15:15.982 [INFO][4737] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:15.996858 containerd[1476]: time="2025-09-12T17:15:15.996594373Z" level=info msg="TearDown network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" successfully" Sep 12 17:15:15.996858 containerd[1476]: time="2025-09-12T17:15:15.996651013Z" level=info msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" returns successfully" Sep 12 17:15:15.998644 systemd[1]: run-netns-cni\x2dd97d4b49\x2d3a15\x2d242c\x2d2627\x2d558e6227badf.mount: Deactivated successfully. Sep 12 17:15:15.999084 containerd[1476]: time="2025-09-12T17:15:15.998705960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-kld6v,Uid:c181bfd7-3816-4413-900a-22a73e4de4f5,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:15:16.166987 systemd-networkd[1376]: calia38d68c3e10: Gained IPv6LL Sep 12 17:15:16.192663 containerd[1476]: time="2025-09-12T17:15:16.192475817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tgfmg,Uid:c0d7b056-4972-4c60-a388-fd78bc10058f,Namespace:kube-system,Attempt:1,} returns sandbox id \"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6\"" Sep 12 17:15:16.214132 containerd[1476]: time="2025-09-12T17:15:16.212590996Z" level=info msg="CreateContainer within sandbox \"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:15:16.263076 containerd[1476]: time="2025-09-12T17:15:16.262932165Z" level=info msg="CreateContainer within sandbox \"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f86c746d077f70f8ba88d054445ff0ae433dca5a7734926187e644689aff1450\"" Sep 12 17:15:16.266585 containerd[1476]: time="2025-09-12T17:15:16.266001524Z" level=info msg="StartContainer for \"f86c746d077f70f8ba88d054445ff0ae433dca5a7734926187e644689aff1450\"" Sep 12 17:15:16.333090 kernel: bpftool[4837]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 12 17:15:16.395207 systemd[1]: Started cri-containerd-f86c746d077f70f8ba88d054445ff0ae433dca5a7734926187e644689aff1450.scope - libcontainer container f86c746d077f70f8ba88d054445ff0ae433dca5a7734926187e644689aff1450. 
Sep 12 17:15:16.495215 containerd[1476]: time="2025-09-12T17:15:16.492853047Z" level=info msg="StartContainer for \"f86c746d077f70f8ba88d054445ff0ae433dca5a7734926187e644689aff1450\" returns successfully" Sep 12 17:15:16.650769 containerd[1476]: time="2025-09-12T17:15:16.650678440Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:16.659013 containerd[1476]: time="2025-09-12T17:15:16.657986934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 17:15:16.662177 containerd[1476]: time="2025-09-12T17:15:16.662088627Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:16.680228 containerd[1476]: time="2025-09-12T17:15:16.676678255Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:16.680228 containerd[1476]: time="2025-09-12T17:15:16.677319943Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 4.837096237s" Sep 12 17:15:16.680228 containerd[1476]: time="2025-09-12T17:15:16.677364864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 17:15:16.678481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4248729256.mount: Deactivated successfully. Sep 12 17:15:16.687856 containerd[1476]: time="2025-09-12T17:15:16.687305392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:15:16.696504 containerd[1476]: time="2025-09-12T17:15:16.696281588Z" level=info msg="CreateContainer within sandbox \"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:15:16.730316 containerd[1476]: time="2025-09-12T17:15:16.730224465Z" level=info msg="CreateContainer within sandbox \"eab7723b7c5513258e05e58bc91822348996473c70c537dce195ecd8a0a56a32\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"6d1959ff5a3c927788e665e09a51bb4e59cec93e538dbbd5587684975769ffba\"" Sep 12 17:15:16.732503 containerd[1476]: time="2025-09-12T17:15:16.731336959Z" level=info msg="StartContainer for \"6d1959ff5a3c927788e665e09a51bb4e59cec93e538dbbd5587684975769ffba\"" Sep 12 17:15:16.766386 systemd-networkd[1376]: cali773af562ac8: Link UP Sep 12 17:15:16.766705 systemd-networkd[1376]: cali773af562ac8: Gained carrier Sep 12 17:15:16.841666 systemd[1]: Started cri-containerd-6d1959ff5a3c927788e665e09a51bb4e59cec93e538dbbd5587684975769ffba.scope - libcontainer container 6d1959ff5a3c927788e665e09a51bb4e59cec93e538dbbd5587684975769ffba. 
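The pull records above pair the repo tag with both the local image id (sha256:e210…) and the repo digest (sha256:29be…), plus the wall-clock pull time. The same pull can be reproduced against this node's containerd with the standard Go client; the socket path and the "k8s.io" namespace (the one the kubelet's CRI integration uses) are assumptions, not read from the log:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	img, err := client.Pull(ctx,
		"ghcr.io/flatcar/calico/whisker-backend:v3.30.3",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	// Prints the repo digest seen in the "Pulled image" record.
	fmt.Println(img.Name(), img.Target().Digest)
}
```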
Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.296 [INFO][4797] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0 calico-apiserver-6dcc7b598- calico-apiserver c181bfd7-3816-4413-900a-22a73e4de4f5 985 0 2025-09-12 17:14:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dcc7b598 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-e-c5bf4513f4 calico-apiserver-6dcc7b598-kld6v eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali773af562ac8 [] [] }} ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.296 [INFO][4797] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.510 [INFO][4833] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" HandleID="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.510 [INFO][4833] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" HandleID="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400062a060), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-e-c5bf4513f4", "pod":"calico-apiserver-6dcc7b598-kld6v", "timestamp":"2025-09-12 17:15:16.510368832 +0000 UTC"}, Hostname:"ci-4081-3-6-e-c5bf4513f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.510 [INFO][4833] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.510 [INFO][4833] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
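The ipam_plugin record above dumps the exact AutoAssignArgs the CNI plugin hands to Calico IPAM: one IPv4 address, no IPv6, a per-container handle ID, and node/pod attributes. A trimmed reconstruction of that request, assuming libcalico-go's ipam package layout; the client call site is only sketched in a comment and has not been verified against this Calico release:

```go
package main

import (
	"fmt"

	"github.com/projectcalico/calico/libcalico-go/lib/ipam"
)

func main() {
	handle := "k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e"
	args := ipam.AutoAssignArgs{
		Num4:     1, // "Auto assigning IP ... Num4:1, Num6:0"
		Num6:     0,
		HandleID: &handle,
		Attrs: map[string]string{ // trimmed; the log also records a timestamp attr
			"namespace": "calico-apiserver",
			"node":      "ci-4081-3-6-e-c5bf4513f4",
			"pod":       "calico-apiserver-6dcc7b598-kld6v",
		},
		Hostname:    "ci-4081-3-6-e-c5bf4513f4",
		IntendedUse: "Workload",
	}
	// The plugin then issues this request under the host-wide IPAM lock,
	// roughly client.IPAM().AutoAssign(ctx, args); the result logged
	// below is 192.168.7.200/26.
	fmt.Printf("%+v\n", args)
}
```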
Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.510 [INFO][4833] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-e-c5bf4513f4' Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.568 [INFO][4833] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.607 [INFO][4833] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.652 [INFO][4833] ipam/ipam.go 511: Trying affinity for 192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.667 [INFO][4833] ipam/ipam.go 158: Attempting to load block cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.697 [INFO][4833] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.7.192/26 host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.700 [INFO][4833] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.7.192/26 handle="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.703 [INFO][4833] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.724 [INFO][4833] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.7.192/26 handle="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.749 [INFO][4833] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.7.200/26] block=192.168.7.192/26 handle="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.749 [INFO][4833] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.7.200/26] handle="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" host="ci-4081-3-6-e-c5bf4513f4" Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.750 [INFO][4833] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
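The assignment walk above is the happy path: the node already holds an affinity for block 192.168.7.192/26, the block loads cleanly, and the next free address is claimed from it. That is why this pod (192.168.7.200) and the coredns pod assigned earlier (192.168.7.199) land in the same /26. A quick containment check over the values from the log:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// The node's affine block from the records above.
	block := netip.MustParsePrefix("192.168.7.192/26")
	// Addresses assigned on this node in this log.
	for _, s := range []string{"192.168.7.199", "192.168.7.200"} {
		fmt.Println(s, "in", block, "->", block.Contains(netip.MustParseAddr(s)))
	}
}
```

Both print true: a /26 spans .192 through .255, so every workload on this node draws from that 64-address block until it fills.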
Sep 12 17:15:16.843102 containerd[1476]: 2025-09-12 17:15:16.750 [INFO][4833] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.200/26] IPv6=[] ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" HandleID="k8s-pod-network.bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.760 [INFO][4797] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"c181bfd7-3816-4413-900a-22a73e4de4f5", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"", Pod:"calico-apiserver-6dcc7b598-kld6v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali773af562ac8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.760 [INFO][4797] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.7.200/32] ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.760 [INFO][4797] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali773af562ac8 ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.768 [INFO][4797] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.770 
[INFO][4797] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"c181bfd7-3816-4413-900a-22a73e4de4f5", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e", Pod:"calico-apiserver-6dcc7b598-kld6v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali773af562ac8", MAC:"7e:d9:6c:7d:85:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:16.845745 containerd[1476]: 2025-09-12 17:15:16.828 [INFO][4797] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e" Namespace="calico-apiserver" Pod="calico-apiserver-6dcc7b598-kld6v" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:16.870655 systemd-networkd[1376]: calib085a15dc0a: Gained IPv6LL Sep 12 17:15:16.889765 containerd[1476]: time="2025-09-12T17:15:16.888750707Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:16.889765 containerd[1476]: time="2025-09-12T17:15:16.888889789Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:16.889765 containerd[1476]: time="2025-09-12T17:15:16.888905669Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:16.889765 containerd[1476]: time="2025-09-12T17:15:16.889034951Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:16.927158 systemd[1]: Started cri-containerd-bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e.scope - libcontainer container bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e. 
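The "Wrote updated endpoint to datastore" record closes the add flow: compared with the "Populated endpoint" dump earlier, the WorkloadEndpoint now also carries the container ID and the MAC of the host-side veth. A trimmed reconstruction of that object with values copied from the record; the import path is assumed to be the projectcalico v3 API module, and fields the log shows as empty or nil are omitted:

```go
package main

import (
	"fmt"

	v3 "github.com/projectcalico/api/pkg/apis/projectcalico/v3"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	wep := v3.WorkloadEndpoint{
		TypeMeta: metav1.TypeMeta{Kind: "WorkloadEndpoint", APIVersion: "projectcalico.org/v3"},
		ObjectMeta: metav1.ObjectMeta{
			Name:      "ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0",
			Namespace: "calico-apiserver",
		},
		Spec: v3.WorkloadEndpointSpec{
			Orchestrator:       "k8s",
			Node:               "ci-4081-3-6-e-c5bf4513f4",
			ContainerID:        "bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e",
			Pod:                "calico-apiserver-6dcc7b598-kld6v",
			Endpoint:           "eth0",
			ServiceAccountName: "calico-apiserver",
			InterfaceName:      "cali773af562ac8", // host-side veth set above
			MAC:                "7e:d9:6c:7d:85:37",
			IPNetworks:         []string{"192.168.7.200/32"},
			Profiles:           []string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"},
		},
	}
	fmt.Printf("%s/%s -> %v\n", wep.Namespace, wep.Name, wep.Spec.IPNetworks)
}
```

Note the per-pod /32 in IPNetworks: the pod owns a single address, while the /26 block affinity seen earlier belongs to the node.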
Sep 12 17:15:17.060960 containerd[1476]: time="2025-09-12T17:15:17.060108543Z" level=info msg="StartContainer for \"6d1959ff5a3c927788e665e09a51bb4e59cec93e538dbbd5587684975769ffba\" returns successfully" Sep 12 17:15:17.141130 containerd[1476]: time="2025-09-12T17:15:17.141035250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dcc7b598-kld6v,Uid:c181bfd7-3816-4413-900a-22a73e4de4f5,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e\"" Sep 12 17:15:17.192133 kubelet[2581]: I0912 17:15:17.190715 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-579586b5b-88x9v" podStartSLOduration=1.579737116 podStartE2EDuration="8.1906836s" podCreationTimestamp="2025-09-12 17:15:09 +0000 UTC" firstStartedPulling="2025-09-12 17:15:10.07338043 +0000 UTC m=+51.629204181" lastFinishedPulling="2025-09-12 17:15:16.684326874 +0000 UTC m=+58.240150665" observedRunningTime="2025-09-12 17:15:17.189221621 +0000 UTC m=+58.745045412" watchObservedRunningTime="2025-09-12 17:15:17.1906836 +0000 UTC m=+58.746507391" Sep 12 17:15:17.228703 kubelet[2581]: I0912 17:15:17.227948 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-tgfmg" podStartSLOduration=52.227920953 podStartE2EDuration="52.227920953s" podCreationTimestamp="2025-09-12 17:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:15:17.226460654 +0000 UTC m=+58.782284445" watchObservedRunningTime="2025-09-12 17:15:17.227920953 +0000 UTC m=+58.783744744" Sep 12 17:15:17.462907 systemd-networkd[1376]: vxlan.calico: Link UP Sep 12 17:15:17.462917 systemd-networkd[1376]: vxlan.calico: Gained carrier Sep 12 17:15:18.220631 systemd-networkd[1376]: cali773af562ac8: Gained IPv6LL Sep 12 17:15:18.303087 containerd[1476]: time="2025-09-12T17:15:18.302989859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.304339 containerd[1476]: time="2025-09-12T17:15:18.303978311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 17:15:18.304339 containerd[1476]: time="2025-09-12T17:15:18.303997232Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.322193 containerd[1476]: time="2025-09-12T17:15:18.322127898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.324533 containerd[1476]: time="2025-09-12T17:15:18.323917921Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.636535567s" Sep 12 17:15:18.324533 containerd[1476]: time="2025-09-12T17:15:18.323981242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference 
\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 17:15:18.327143 containerd[1476]: time="2025-09-12T17:15:18.326545274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:15:18.333679 containerd[1476]: time="2025-09-12T17:15:18.333288038Z" level=info msg="CreateContainer within sandbox \"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:15:18.358698 containerd[1476]: time="2025-09-12T17:15:18.358589194Z" level=info msg="CreateContainer within sandbox \"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"bfe17e107a1713a244a4a263eaa2546c42758989c691c296dca10f4d0f004315\"" Sep 12 17:15:18.360685 containerd[1476]: time="2025-09-12T17:15:18.359686648Z" level=info msg="StartContainer for \"bfe17e107a1713a244a4a263eaa2546c42758989c691c296dca10f4d0f004315\"" Sep 12 17:15:18.414159 systemd[1]: Started cri-containerd-bfe17e107a1713a244a4a263eaa2546c42758989c691c296dca10f4d0f004315.scope - libcontainer container bfe17e107a1713a244a4a263eaa2546c42758989c691c296dca10f4d0f004315. Sep 12 17:15:18.465541 containerd[1476]: time="2025-09-12T17:15:18.465480811Z" level=info msg="StartContainer for \"bfe17e107a1713a244a4a263eaa2546c42758989c691c296dca10f4d0f004315\" returns successfully" Sep 12 17:15:18.619090 containerd[1476]: time="2025-09-12T17:15:18.618673847Z" level=info msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\"" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.687 [WARNING][5110] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0", GenerateName:"calico-kube-controllers-bfcb5d966-", Namespace:"calico-system", SelfLink:"", UID:"7997260a-d514-4a5c-b1af-8a79861928e5", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bfcb5d966", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882", Pod:"calico-kube-controllers-bfcb5d966-vxpgp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0cbdf3dae2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.687 [INFO][5110] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.687 [INFO][5110] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" iface="eth0" netns="" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.687 [INFO][5110] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.687 [INFO][5110] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.713 [INFO][5119] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.714 [INFO][5119] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.714 [INFO][5119] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.756 [WARNING][5119] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.756 [INFO][5119] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.762 [INFO][5119] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:18.769894 containerd[1476]: 2025-09-12 17:15:18.766 [INFO][5110] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.772290 containerd[1476]: time="2025-09-12T17:15:18.770042020Z" level=info msg="TearDown network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" successfully" Sep 12 17:15:18.772290 containerd[1476]: time="2025-09-12T17:15:18.770105341Z" level=info msg="StopPodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" returns successfully" Sep 12 17:15:18.772290 containerd[1476]: time="2025-09-12T17:15:18.772013845Z" level=info msg="RemovePodSandbox for \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\"" Sep 12 17:15:18.772290 containerd[1476]: time="2025-09-12T17:15:18.772095846Z" level=info msg="Forcibly stopping sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\"" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.829 [WARNING][5134] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0", GenerateName:"calico-kube-controllers-bfcb5d966-", Namespace:"calico-system", SelfLink:"", UID:"7997260a-d514-4a5c-b1af-8a79861928e5", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bfcb5d966", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882", Pod:"calico-kube-controllers-bfcb5d966-vxpgp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0cbdf3dae2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.829 [INFO][5134] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.830 [INFO][5134] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" iface="eth0" netns="" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.830 [INFO][5134] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.830 [INFO][5134] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.864 [INFO][5141] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.864 [INFO][5141] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.864 [INFO][5141] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.879 [WARNING][5141] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.880 [INFO][5141] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" HandleID="k8s-pod-network.f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--kube--controllers--bfcb5d966--vxpgp-eth0" Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.883 [INFO][5141] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:18.894088 containerd[1476]: 2025-09-12 17:15:18.889 [INFO][5134] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4" Sep 12 17:15:18.894088 containerd[1476]: time="2025-09-12T17:15:18.893203201Z" level=info msg="TearDown network for sandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" successfully" Sep 12 17:15:18.910145 containerd[1476]: time="2025-09-12T17:15:18.910070852Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:18.910597 containerd[1476]: time="2025-09-12T17:15:18.910336255Z" level=info msg="RemovePodSandbox \"f5ca896a84f5c38d9b3e9a1065c52e3989358dd8d7d68d1cdb5619b97d2db8d4\" returns successfully" Sep 12 17:15:18.912953 containerd[1476]: time="2025-09-12T17:15:18.912893447Z" level=info msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\"" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:18.992 [WARNING][5158] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5670fa1-14ff-4b4b-93e9-778888c14647", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666", Pod:"csi-node-driver-cmgr5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb37d61c2c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:18.993 [INFO][5158] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:18.993 [INFO][5158] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" iface="eth0" netns="" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:18.993 [INFO][5158] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:18.993 [INFO][5158] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.039 [INFO][5166] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.039 [INFO][5166] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.039 [INFO][5166] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.053 [WARNING][5166] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.053 [INFO][5166] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.059 [INFO][5166] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.064286 containerd[1476]: 2025-09-12 17:15:19.061 [INFO][5158] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.065679 containerd[1476]: time="2025-09-12T17:15:19.064312930Z" level=info msg="TearDown network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" successfully" Sep 12 17:15:19.065679 containerd[1476]: time="2025-09-12T17:15:19.064349890Z" level=info msg="StopPodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" returns successfully" Sep 12 17:15:19.065679 containerd[1476]: time="2025-09-12T17:15:19.065211381Z" level=info msg="RemovePodSandbox for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\"" Sep 12 17:15:19.065679 containerd[1476]: time="2025-09-12T17:15:19.065259062Z" level=info msg="Forcibly stopping sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\"" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.123 [WARNING][5183] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5670fa1-14ff-4b4b-93e9-778888c14647", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666", Pod:"csi-node-driver-cmgr5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidb37d61c2c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.125 [INFO][5183] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.125 [INFO][5183] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" iface="eth0" netns="" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.125 [INFO][5183] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.125 [INFO][5183] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.154 [INFO][5190] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.154 [INFO][5190] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.154 [INFO][5190] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.167 [WARNING][5190] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.167 [INFO][5190] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" HandleID="k8s-pod-network.b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-csi--node--driver--cmgr5-eth0" Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.172 [INFO][5190] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.177591 containerd[1476]: 2025-09-12 17:15:19.175 [INFO][5183] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982" Sep 12 17:15:19.179047 containerd[1476]: time="2025-09-12T17:15:19.177921612Z" level=info msg="TearDown network for sandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" successfully" Sep 12 17:15:19.194807 containerd[1476]: time="2025-09-12T17:15:19.194711099Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.194807 containerd[1476]: time="2025-09-12T17:15:19.194872781Z" level=info msg="RemovePodSandbox \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" returns successfully" Sep 12 17:15:19.196010 containerd[1476]: time="2025-09-12T17:15:19.195956074Z" level=info msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" Sep 12 17:15:19.196145 containerd[1476]: time="2025-09-12T17:15:19.196029235Z" level=error msg="PodSandboxStatus for \"b6131ca5a06cd979d7fc44ebb4d47d6e6852730845aef20f0c707ceba2270982\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find sandbox: not found" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.253 [WARNING][5204] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"c181bfd7-3816-4413-900a-22a73e4de4f5", ResourceVersion:"1000", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e", Pod:"calico-apiserver-6dcc7b598-kld6v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali773af562ac8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.254 [INFO][5204] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.255 [INFO][5204] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" iface="eth0" netns="" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.255 [INFO][5204] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.255 [INFO][5204] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.291 [INFO][5212] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.292 [INFO][5212] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.292 [INFO][5212] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.306 [WARNING][5212] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.306 [INFO][5212] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.309 [INFO][5212] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.314416 containerd[1476]: 2025-09-12 17:15:19.312 [INFO][5204] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.314416 containerd[1476]: time="2025-09-12T17:15:19.314133212Z" level=info msg="TearDown network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" successfully" Sep 12 17:15:19.314416 containerd[1476]: time="2025-09-12T17:15:19.314175292Z" level=info msg="StopPodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" returns successfully" Sep 12 17:15:19.315367 containerd[1476]: time="2025-09-12T17:15:19.315175705Z" level=info msg="RemovePodSandbox for \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" Sep 12 17:15:19.315367 containerd[1476]: time="2025-09-12T17:15:19.315222865Z" level=info msg="Forcibly stopping sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\"" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.370 [WARNING][5226] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"c181bfd7-3816-4413-900a-22a73e4de4f5", ResourceVersion:"1000", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e", Pod:"calico-apiserver-6dcc7b598-kld6v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali773af562ac8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.370 [INFO][5226] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.370 [INFO][5226] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" iface="eth0" netns="" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.370 [INFO][5226] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.370 [INFO][5226] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.404 [INFO][5233] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.405 [INFO][5233] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.405 [INFO][5233] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.418 [WARNING][5233] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.418 [INFO][5233] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" HandleID="k8s-pod-network.6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--kld6v-eth0" Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.421 [INFO][5233] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.426596 containerd[1476]: 2025-09-12 17:15:19.423 [INFO][5226] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75" Sep 12 17:15:19.426596 containerd[1476]: time="2025-09-12T17:15:19.426375637Z" level=info msg="TearDown network for sandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" successfully" Sep 12 17:15:19.430036 systemd-networkd[1376]: vxlan.calico: Gained IPv6LL Sep 12 17:15:19.446471 containerd[1476]: time="2025-09-12T17:15:19.446161761Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.446471 containerd[1476]: time="2025-09-12T17:15:19.446284202Z" level=info msg="RemovePodSandbox \"6dca51ec3358dcae7c0894977e8eb49c0b40a331ff8152e26c5cc4d0c6a83c75\" returns successfully" Sep 12 17:15:19.447588 containerd[1476]: time="2025-09-12T17:15:19.446992731Z" level=info msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\"" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.502 [WARNING][5247] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cfa2648-c93e-445e-9e1f-6da367fb2890", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae", Pod:"calico-apiserver-6dcc7b598-zt2zj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia38d68c3e10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.502 [INFO][5247] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.502 [INFO][5247] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" iface="eth0" netns="" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.502 [INFO][5247] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.502 [INFO][5247] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.539 [INFO][5254] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.540 [INFO][5254] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.540 [INFO][5254] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.563 [WARNING][5254] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.564 [INFO][5254] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.567 [INFO][5254] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.573247 containerd[1476]: 2025-09-12 17:15:19.569 [INFO][5247] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.573247 containerd[1476]: time="2025-09-12T17:15:19.573087366Z" level=info msg="TearDown network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" successfully" Sep 12 17:15:19.573247 containerd[1476]: time="2025-09-12T17:15:19.573129647Z" level=info msg="StopPodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" returns successfully" Sep 12 17:15:19.575406 containerd[1476]: time="2025-09-12T17:15:19.574709707Z" level=info msg="RemovePodSandbox for \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\"" Sep 12 17:15:19.575406 containerd[1476]: time="2025-09-12T17:15:19.574785987Z" level=info msg="Forcibly stopping sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\"" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.629 [WARNING][5268] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0", GenerateName:"calico-apiserver-6dcc7b598-", Namespace:"calico-apiserver", SelfLink:"", UID:"5cfa2648-c93e-445e-9e1f-6da367fb2890", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dcc7b598", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae", Pod:"calico-apiserver-6dcc7b598-zt2zj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia38d68c3e10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.630 [INFO][5268] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.630 [INFO][5268] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" iface="eth0" netns="" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.630 [INFO][5268] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.630 [INFO][5268] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.659 [INFO][5276] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.659 [INFO][5276] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.659 [INFO][5276] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.674 [WARNING][5276] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.674 [INFO][5276] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" HandleID="k8s-pod-network.1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-calico--apiserver--6dcc7b598--zt2zj-eth0" Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.677 [INFO][5276] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.682954 containerd[1476]: 2025-09-12 17:15:19.680 [INFO][5268] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96" Sep 12 17:15:19.684794 containerd[1476]: time="2025-09-12T17:15:19.683431048Z" level=info msg="TearDown network for sandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" successfully" Sep 12 17:15:19.689748 containerd[1476]: time="2025-09-12T17:15:19.689348241Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.689748 containerd[1476]: time="2025-09-12T17:15:19.689465162Z" level=info msg="RemovePodSandbox \"1ac00cc2768c31062b36c9fce9a6b020a7a8b4b1af051eb0b0db2419a195ba96\" returns successfully" Sep 12 17:15:19.690225 containerd[1476]: time="2025-09-12T17:15:19.690182731Z" level=info msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\"" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.758 [WARNING][5290] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"829c1f4c-85d8-405a-a4f5-f939b129360b", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a", Pod:"goldmane-54d579b49d-96m69", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.7.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2dad3e17c21", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.758 [INFO][5290] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.758 [INFO][5290] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" iface="eth0" netns="" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.758 [INFO][5290] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.758 [INFO][5290] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.791 [INFO][5297] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.791 [INFO][5297] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.791 [INFO][5297] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.804 [WARNING][5297] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.804 [INFO][5297] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.807 [INFO][5297] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.811245 containerd[1476]: 2025-09-12 17:15:19.809 [INFO][5290] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.812078 containerd[1476]: time="2025-09-12T17:15:19.811321065Z" level=info msg="TearDown network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" successfully" Sep 12 17:15:19.812078 containerd[1476]: time="2025-09-12T17:15:19.811366706Z" level=info msg="StopPodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" returns successfully" Sep 12 17:15:19.812876 containerd[1476]: time="2025-09-12T17:15:19.812661642Z" level=info msg="RemovePodSandbox for \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\"" Sep 12 17:15:19.812876 containerd[1476]: time="2025-09-12T17:15:19.812716363Z" level=info msg="Forcibly stopping sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\"" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.860 [WARNING][5311] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"829c1f4c-85d8-405a-a4f5-f939b129360b", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a", Pod:"goldmane-54d579b49d-96m69", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.7.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2dad3e17c21", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.861 [INFO][5311] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.861 [INFO][5311] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" iface="eth0" netns="" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.861 [INFO][5311] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.861 [INFO][5311] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.923 [INFO][5318] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.924 [INFO][5318] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.925 [INFO][5318] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.943 [WARNING][5318] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.944 [INFO][5318] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" HandleID="k8s-pod-network.29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-goldmane--54d579b49d--96m69-eth0" Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.950 [INFO][5318] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.958983 containerd[1476]: 2025-09-12 17:15:19.952 [INFO][5311] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b" Sep 12 17:15:19.958983 containerd[1476]: time="2025-09-12T17:15:19.956085691Z" level=info msg="TearDown network for sandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" successfully" Sep 12 17:15:19.964598 containerd[1476]: time="2025-09-12T17:15:19.964535276Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.964933 containerd[1476]: time="2025-09-12T17:15:19.964909880Z" level=info msg="RemovePodSandbox \"29359224c5d20b1beba7a22ca02ba01fa13e72225570f60b7b71f20b03a0306b\" returns successfully" Sep 12 17:15:19.968861 containerd[1476]: time="2025-09-12T17:15:19.968764488Z" level=info msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\"" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.060 [WARNING][5332] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01137908-9b3f-43f3-895b-bb6ad5c520d9", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423", Pod:"coredns-674b8bbfcf-tcms6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4626d7d07a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.061 [INFO][5332] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.061 [INFO][5332] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" iface="eth0" netns="" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.061 [INFO][5332] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.061 [INFO][5332] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.120 [INFO][5339] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.123 [INFO][5339] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.124 [INFO][5339] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.156 [WARNING][5339] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.157 [INFO][5339] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.161 [INFO][5339] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.167307 containerd[1476]: 2025-09-12 17:15:20.164 [INFO][5332] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.168924 containerd[1476]: time="2025-09-12T17:15:20.167371591Z" level=info msg="TearDown network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" successfully" Sep 12 17:15:20.168924 containerd[1476]: time="2025-09-12T17:15:20.167407432Z" level=info msg="StopPodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" returns successfully" Sep 12 17:15:20.169977 containerd[1476]: time="2025-09-12T17:15:20.169484337Z" level=info msg="RemovePodSandbox for \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\"" Sep 12 17:15:20.169977 containerd[1476]: time="2025-09-12T17:15:20.169536658Z" level=info msg="Forcibly stopping sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\"" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.279 [WARNING][5354] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01137908-9b3f-43f3-895b-bb6ad5c520d9", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"f4b55a2d603cee81b04d10229ef8eea5c23a76bfc29ed90336da3ac8a2c66423", Pod:"coredns-674b8bbfcf-tcms6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4626d7d07a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.281 [INFO][5354] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.281 [INFO][5354] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" iface="eth0" netns="" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.281 [INFO][5354] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.281 [INFO][5354] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.318 [INFO][5361] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.319 [INFO][5361] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.319 [INFO][5361] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.336 [WARNING][5361] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.336 [INFO][5361] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" HandleID="k8s-pod-network.1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tcms6-eth0" Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.339 [INFO][5361] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.345124 containerd[1476]: 2025-09-12 17:15:20.341 [INFO][5354] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e" Sep 12 17:15:20.345124 containerd[1476]: time="2025-09-12T17:15:20.344025902Z" level=info msg="TearDown network for sandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" successfully" Sep 12 17:15:20.351681 containerd[1476]: time="2025-09-12T17:15:20.351602275Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:20.352095 containerd[1476]: time="2025-09-12T17:15:20.352067880Z" level=info msg="RemovePodSandbox \"1c3af7e12634eac645c731d2a4daebfe7b07f34a74d34f8ac3a97edb98c0703e\" returns successfully" Sep 12 17:15:20.353124 containerd[1476]: time="2025-09-12T17:15:20.353021372Z" level=info msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.414 [WARNING][5375] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0d7b056-4972-4c60-a388-fd78bc10058f", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6", Pod:"coredns-674b8bbfcf-tgfmg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib085a15dc0a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.414 [INFO][5375] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.415 [INFO][5375] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" iface="eth0" netns="" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.415 [INFO][5375] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.415 [INFO][5375] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.457 [INFO][5383] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.457 [INFO][5383] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.457 [INFO][5383] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.471 [WARNING][5383] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.471 [INFO][5383] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.474 [INFO][5383] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.478649 containerd[1476]: 2025-09-12 17:15:20.476 [INFO][5375] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.479311 containerd[1476]: time="2025-09-12T17:15:20.478718942Z" level=info msg="TearDown network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" successfully" Sep 12 17:15:20.479311 containerd[1476]: time="2025-09-12T17:15:20.478807143Z" level=info msg="StopPodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" returns successfully" Sep 12 17:15:20.480628 containerd[1476]: time="2025-09-12T17:15:20.480081959Z" level=info msg="RemovePodSandbox for \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" Sep 12 17:15:20.480628 containerd[1476]: time="2025-09-12T17:15:20.480130080Z" level=info msg="Forcibly stopping sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\"" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.532 [WARNING][5397] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0d7b056-4972-4c60-a388-fd78bc10058f", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-e-c5bf4513f4", ContainerID:"a3004e761324594fc100db700ffa3a9a0797345a39ad9f12dff78f48cb28cab6", Pod:"coredns-674b8bbfcf-tgfmg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib085a15dc0a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.533 [INFO][5397] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.533 [INFO][5397] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" iface="eth0" netns="" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.533 [INFO][5397] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.533 [INFO][5397] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.571 [INFO][5404] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.571 [INFO][5404] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.571 [INFO][5404] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.586 [WARNING][5404] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.586 [INFO][5404] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" HandleID="k8s-pod-network.4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-coredns--674b8bbfcf--tgfmg-eth0" Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.588 [INFO][5404] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.593016 containerd[1476]: 2025-09-12 17:15:20.590 [INFO][5397] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22" Sep 12 17:15:20.594641 containerd[1476]: time="2025-09-12T17:15:20.593794584Z" level=info msg="TearDown network for sandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" successfully" Sep 12 17:15:20.599374 containerd[1476]: time="2025-09-12T17:15:20.599199689Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:20.599757 containerd[1476]: time="2025-09-12T17:15:20.599607814Z" level=info msg="RemovePodSandbox \"4cd64275b2b54c09f28026400be15719f5b7bb6396a1087e7db4475ecc696d22\" returns successfully" Sep 12 17:15:20.601187 containerd[1476]: time="2025-09-12T17:15:20.601141513Z" level=info msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.659 [WARNING][5418] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.660 [INFO][5418] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.660 [INFO][5418] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" iface="eth0" netns="" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.660 [INFO][5418] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.660 [INFO][5418] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.687 [INFO][5426] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.687 [INFO][5426] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.687 [INFO][5426] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.700 [WARNING][5426] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.700 [INFO][5426] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.704 [INFO][5426] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.712353 containerd[1476]: 2025-09-12 17:15:20.709 [INFO][5418] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.712934 containerd[1476]: time="2025-09-12T17:15:20.712434148Z" level=info msg="TearDown network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" successfully" Sep 12 17:15:20.712934 containerd[1476]: time="2025-09-12T17:15:20.712476829Z" level=info msg="StopPodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" returns successfully" Sep 12 17:15:20.713956 containerd[1476]: time="2025-09-12T17:15:20.713918886Z" level=info msg="RemovePodSandbox for \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" Sep 12 17:15:20.714107 containerd[1476]: time="2025-09-12T17:15:20.713974767Z" level=info msg="Forcibly stopping sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\"" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.769 [WARNING][5440] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" WorkloadEndpoint="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.769 [INFO][5440] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.769 [INFO][5440] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" iface="eth0" netns="" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.769 [INFO][5440] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.769 [INFO][5440] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.800 [INFO][5447] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.801 [INFO][5447] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.801 [INFO][5447] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.813 [WARNING][5447] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.813 [INFO][5447] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" HandleID="k8s-pod-network.fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Workload="ci--4081--3--6--e--c5bf4513f4-k8s-whisker--54d6bb5877--49g7j-eth0" Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.816 [INFO][5447] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.823885 containerd[1476]: 2025-09-12 17:15:20.819 [INFO][5440] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d" Sep 12 17:15:20.823885 containerd[1476]: time="2025-09-12T17:15:20.822763132Z" level=info msg="TearDown network for sandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" successfully" Sep 12 17:15:20.832682 containerd[1476]: time="2025-09-12T17:15:20.832590171Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:20.833093 containerd[1476]: time="2025-09-12T17:15:20.833060017Z" level=info msg="RemovePodSandbox \"fd1ff0fc6fb8643b05be4c3032eeac2e74e47b954680c4b927229ca27c50984d\" returns successfully" Sep 12 17:15:21.722415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount421103392.mount: Deactivated successfully. 
Sep 12 17:15:22.160438 containerd[1476]: time="2025-09-12T17:15:22.160178701Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:22.163239 containerd[1476]: time="2025-09-12T17:15:22.162957454Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 12 17:15:22.163239 containerd[1476]: time="2025-09-12T17:15:22.163143496Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:22.167718 containerd[1476]: time="2025-09-12T17:15:22.167330746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:22.168381 containerd[1476]: time="2025-09-12T17:15:22.168330438Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.841687923s"
Sep 12 17:15:22.168381 containerd[1476]: time="2025-09-12T17:15:22.168378078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 12 17:15:22.172282 containerd[1476]: time="2025-09-12T17:15:22.172110723Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 12 17:15:22.178029 containerd[1476]: time="2025-09-12T17:15:22.177967312Z" level=info msg="CreateContainer within sandbox \"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 12 17:15:22.200217 containerd[1476]: time="2025-09-12T17:15:22.200152736Z" level=info msg="CreateContainer within sandbox \"aed1f180b1199c30dc65ffbefae04df86376ca50d3f4519cee53ff680a4d8a7a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c\""
Sep 12 17:15:22.204674 containerd[1476]: time="2025-09-12T17:15:22.201175268Z" level=info msg="StartContainer for \"b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c\""
Sep 12 17:15:22.260204 systemd[1]: Started cri-containerd-b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c.scope - libcontainer container b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c.
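As a rough rate check on the goldmane pull above: 61,845,178 bytes in 3.841687923 s is about 16.1 MB/s (roughly 130 Mbit/s) from ghcr.io, including unpack, and the transfer counter bytes read=61845332 agrees with the reported image size to within 154 bytes.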
Sep 12 17:15:22.310335 containerd[1476]: time="2025-09-12T17:15:22.309005190Z" level=info msg="StartContainer for \"b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c\" returns successfully"
Sep 12 17:15:23.293118 kubelet[2581]: I0912 17:15:23.292621 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-96m69" podStartSLOduration=26.544081727 podStartE2EDuration="35.292592163s" podCreationTimestamp="2025-09-12 17:14:48 +0000 UTC" firstStartedPulling="2025-09-12 17:15:13.422467873 +0000 UTC m=+54.978291664" lastFinishedPulling="2025-09-12 17:15:22.170978309 +0000 UTC m=+63.726802100" observedRunningTime="2025-09-12 17:15:23.291319108 +0000 UTC m=+64.847142899" watchObservedRunningTime="2025-09-12 17:15:23.292592163 +0000 UTC m=+64.848415954"
Sep 12 17:15:24.592082 containerd[1476]: time="2025-09-12T17:15:24.591995523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:24.593876 containerd[1476]: time="2025-09-12T17:15:24.593788904Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 12 17:15:24.594356 containerd[1476]: time="2025-09-12T17:15:24.594283350Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:24.597595 containerd[1476]: time="2025-09-12T17:15:24.597182224Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:24.598260 containerd[1476]: time="2025-09-12T17:15:24.598199196Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.426025192s"
Sep 12 17:15:24.598260 containerd[1476]: time="2025-09-12T17:15:24.598249756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 12 17:15:24.602412 containerd[1476]: time="2025-09-12T17:15:24.602141681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:15:24.627955 containerd[1476]: time="2025-09-12T17:15:24.627901941Z" level=info msg="CreateContainer within sandbox \"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 12 17:15:24.646318 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount379709728.mount: Deactivated successfully.
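The kubelet latency line above decomposes exactly: the goldmane pod's end-to-end startup is 35.292592163 s (created 17:14:48, watch-observed running 17:15:23.292592163), its image-pull window is 22.170978309 - 13.422467873 = 8.748510436 s, and subtracting the pull window from the end-to-end time gives the reported podStartSLOduration of 26.544081727 s; the SLO metric is startup latency excluding image pulling.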
Sep 12 17:15:24.651425 containerd[1476]: time="2025-09-12T17:15:24.651209572Z" level=info msg="CreateContainer within sandbox \"c0a50f0a5792319d127e43fa01cb61c107d683b81ac08b1f5acccf50a3abb882\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e\""
Sep 12 17:15:24.657653 containerd[1476]: time="2025-09-12T17:15:24.657033080Z" level=info msg="StartContainer for \"eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e\""
Sep 12 17:15:24.738188 systemd[1]: Started cri-containerd-eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e.scope - libcontainer container eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e.
Sep 12 17:15:24.788123 containerd[1476]: time="2025-09-12T17:15:24.788032244Z" level=info msg="StartContainer for \"eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e\" returns successfully"
Sep 12 17:15:25.314270 kubelet[2581]: I0912 17:15:25.313269 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-bfcb5d966-vxpgp" podStartSLOduration=25.257961791 podStartE2EDuration="36.313238116s" podCreationTimestamp="2025-09-12 17:14:49 +0000 UTC" firstStartedPulling="2025-09-12 17:15:13.54627423 +0000 UTC m=+55.102098021" lastFinishedPulling="2025-09-12 17:15:24.601550595 +0000 UTC m=+66.157374346" observedRunningTime="2025-09-12 17:15:25.313088314 +0000 UTC m=+66.868912105" watchObservedRunningTime="2025-09-12 17:15:25.313238116 +0000 UTC m=+66.869061907"
Sep 12 17:15:27.794206 containerd[1476]: time="2025-09-12T17:15:27.794065848Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.796290 containerd[1476]: time="2025-09-12T17:15:27.796113031Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 12 17:15:27.797856 containerd[1476]: time="2025-09-12T17:15:27.797440006Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.802142 containerd[1476]: time="2025-09-12T17:15:27.802065299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.803546 containerd[1476]: time="2025-09-12T17:15:27.803458994Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.200702386s"
Sep 12 17:15:27.803546 containerd[1476]: time="2025-09-12T17:15:27.803508155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 12 17:15:27.805888 containerd[1476]: time="2025-09-12T17:15:27.805445697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:15:27.810896 containerd[1476]: time="2025-09-12T17:15:27.810773957Z" level=info msg="CreateContainer within sandbox \"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:15:27.833904 containerd[1476]: time="2025-09-12T17:15:27.833801697Z" level=info msg="CreateContainer within sandbox \"f4d7ae099887f87462d699c9f3cd1bc48a9e4f10dadd6e880f99f40a812512ae\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6\""
Sep 12 17:15:27.834611 containerd[1476]: time="2025-09-12T17:15:27.834582906Z" level=info msg="StartContainer for \"7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6\""
Sep 12 17:15:27.888413 systemd[1]: run-containerd-runc-k8s.io-7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6-runc.cY7rgh.mount: Deactivated successfully.
Sep 12 17:15:27.898674 systemd[1]: Started cri-containerd-7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6.scope - libcontainer container 7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6.
Sep 12 17:15:27.956386 containerd[1476]: time="2025-09-12T17:15:27.956152161Z" level=info msg="StartContainer for \"7367f6dd1a1f28b446872f35c1b009a102d6e46123c6108cb43fac13b160c1e6\" returns successfully"
Sep 12 17:15:28.230694 containerd[1476]: time="2025-09-12T17:15:28.230524840Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:28.233275 containerd[1476]: time="2025-09-12T17:15:28.233216151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 12 17:15:28.235422 containerd[1476]: time="2025-09-12T17:15:28.235359055Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 429.860238ms"
Sep 12 17:15:28.235482 containerd[1476]: time="2025-09-12T17:15:28.235428495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 12 17:15:28.237567 containerd[1476]: time="2025-09-12T17:15:28.237186115Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 12 17:15:28.257907 containerd[1476]: time="2025-09-12T17:15:28.257779946Z" level=info msg="CreateContainer within sandbox \"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:15:28.276227 containerd[1476]: time="2025-09-12T17:15:28.276159752Z" level=info msg="CreateContainer within sandbox \"bbd5416fc30c832b0c9c33aef001a2628c2a28dc485417febfb2ec00d192312e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"be05e74bdf713932c601bcba004962dd36ec51d46a341a1371df0e9958322673\""
Sep 12 17:15:28.278886 containerd[1476]: time="2025-09-12T17:15:28.277305685Z" level=info msg="StartContainer for \"be05e74bdf713932c601bcba004962dd36ec51d46a341a1371df0e9958322673\""
Sep 12 17:15:28.332058 systemd[1]: Started cri-containerd-be05e74bdf713932c601bcba004962dd36ec51d46a341a1371df0e9958322673.scope - libcontainer container be05e74bdf713932c601bcba004962dd36ec51d46a341a1371df0e9958322673.
Sep 12 17:15:28.438871 containerd[1476]: time="2025-09-12T17:15:28.438441891Z" level=info msg="StartContainer for \"be05e74bdf713932c601bcba004962dd36ec51d46a341a1371df0e9958322673\" returns successfully"
Sep 12 17:15:29.339065 kubelet[2581]: I0912 17:15:29.337431 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dcc7b598-zt2zj" podStartSLOduration=35.184114497 podStartE2EDuration="48.337402378s" podCreationTimestamp="2025-09-12 17:14:41 +0000 UTC" firstStartedPulling="2025-09-12 17:15:14.651437848 +0000 UTC m=+56.207261599" lastFinishedPulling="2025-09-12 17:15:27.804725689 +0000 UTC m=+69.360549480" observedRunningTime="2025-09-12 17:15:28.340299231 +0000 UTC m=+69.896123022" watchObservedRunningTime="2025-09-12 17:15:29.337402378 +0000 UTC m=+70.893226169"
Sep 12 17:15:29.954255 containerd[1476]: time="2025-09-12T17:15:29.953245706Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:29.955837 containerd[1476]: time="2025-09-12T17:15:29.955769174Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 12 17:15:29.958714 containerd[1476]: time="2025-09-12T17:15:29.958648086Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:29.963930 containerd[1476]: time="2025-09-12T17:15:29.963428340Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:29.965929 containerd[1476]: time="2025-09-12T17:15:29.965622687Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.728382451s"
Sep 12 17:15:29.965929 containerd[1476]: time="2025-09-12T17:15:29.965684568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 12 17:15:29.983206 containerd[1476]: time="2025-09-12T17:15:29.983132834Z" level=info msg="CreateContainer within sandbox \"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 17:15:30.012572 containerd[1476]: time="2025-09-12T17:15:30.012487809Z" level=info msg="CreateContainer within sandbox \"23940b136e57beacbc5279666d40819e1b5deecfc07c6c6e84d8f316dfdae666\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820\""
Sep 12 17:15:30.016719 containerd[1476]: time="2025-09-12T17:15:30.014274816Z" level=info msg="StartContainer for \"480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820\""
Sep 12 17:15:30.095217 systemd[1]: run-containerd-runc-k8s.io-480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820-runc.9fXIsR.mount: Deactivated successfully.
Sep 12 17:15:30.111528 systemd[1]: Started cri-containerd-480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820.scope - libcontainer container 480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820.
Sep 12 17:15:30.197778 kubelet[2581]: I0912 17:15:30.195694 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dcc7b598-kld6v" podStartSLOduration=38.102303585 podStartE2EDuration="49.195661776s" podCreationTimestamp="2025-09-12 17:14:41 +0000 UTC" firstStartedPulling="2025-09-12 17:15:17.143354199 +0000 UTC m=+58.699177990" lastFinishedPulling="2025-09-12 17:15:28.23671239 +0000 UTC m=+69.792536181" observedRunningTime="2025-09-12 17:15:29.339121397 +0000 UTC m=+70.894945188" watchObservedRunningTime="2025-09-12 17:15:30.195661776 +0000 UTC m=+71.751485567"
Sep 12 17:15:30.267570 containerd[1476]: time="2025-09-12T17:15:30.267367777Z" level=info msg="StartContainer for \"480467e6cc3d44244bfd0c2829771d023f0a58d806abe3926bba3eeb4cc87820\" returns successfully"
Sep 12 17:15:30.773124 kubelet[2581]: I0912 17:15:30.773044 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 12 17:15:30.773124 kubelet[2581]: I0912 17:15:30.773143 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 12 17:15:31.230118 kubelet[2581]: I0912 17:15:31.229967 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-cmgr5" podStartSLOduration=25.449333351 podStartE2EDuration="43.22992569s" podCreationTimestamp="2025-09-12 17:14:48 +0000 UTC" firstStartedPulling="2025-09-12 17:15:12.187093655 +0000 UTC m=+53.742917446" lastFinishedPulling="2025-09-12 17:15:29.967685994 +0000 UTC m=+71.523509785" observedRunningTime="2025-09-12 17:15:30.378387411 +0000 UTC m=+71.934211202" watchObservedRunningTime="2025-09-12 17:15:31.22992569 +0000 UTC m=+72.785749561"
Sep 12 17:15:33.811924 update_engine[1457]: I20250912 17:15:33.811280 1457 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 12 17:15:33.811924 update_engine[1457]: I20250912 17:15:33.811381 1457 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 12 17:15:33.812540 update_engine[1457]: I20250912 17:15:33.811805 1457 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 12 17:15:33.813602 update_engine[1457]: I20250912 17:15:33.813530 1457 omaha_request_params.cc:62] Current group set to lts
Sep 12 17:15:33.813792 update_engine[1457]: I20250912 17:15:33.813728 1457 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 12 17:15:33.813792 update_engine[1457]: I20250912 17:15:33.813747 1457 update_attempter.cc:643] Scheduling an action processor start.
Sep 12 17:15:33.813792 update_engine[1457]: I20250912 17:15:33.813781 1457 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:15:33.818794 update_engine[1457]: I20250912 17:15:33.817039 1457 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 12 17:15:33.818794 update_engine[1457]: I20250912 17:15:33.817225 1457 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:15:33.818794 update_engine[1457]: I20250912 17:15:33.817236 1457 omaha_request_action.cc:272] Request:
Sep 12 17:15:33.818794 update_engine[1457]: [multi-line Omaha XML request body stripped from this capture]
Sep 12 17:15:33.818794 update_engine[1457]: I20250912 17:15:33.817243 1457 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:15:33.822318 update_engine[1457]: I20250912 17:15:33.820871 1457 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:15:33.822318 update_engine[1457]: I20250912 17:15:33.821522 1457 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:15:33.822569 locksmithd[1497]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 12 17:15:33.824972 update_engine[1457]: E20250912 17:15:33.824880 1457 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:15:33.825086 update_engine[1457]: I20250912 17:15:33.825023 1457 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 12 17:15:39.064378 systemd[1]: run-containerd-runc-k8s.io-0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd-runc.9ifiq3.mount: Deactivated successfully.
Sep 12 17:15:43.815233 update_engine[1457]: I20250912 17:15:43.815093 1457 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:15:43.816150 update_engine[1457]: I20250912 17:15:43.815536 1457 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:15:43.816150 update_engine[1457]: I20250912 17:15:43.815929 1457 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:15:43.816922 update_engine[1457]: E20250912 17:15:43.816864 1457 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:15:43.817026 update_engine[1457]: I20250912 17:15:43.816984 1457 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Sep 12 17:15:53.817134 update_engine[1457]: I20250912 17:15:53.817036 1457 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:15:53.817619 update_engine[1457]: I20250912 17:15:53.817325 1457 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:15:53.817619 update_engine[1457]: I20250912 17:15:53.817585 1457 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:15:53.821014 update_engine[1457]: E20250912 17:15:53.820936 1457 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:15:53.821203 update_engine[1457]: I20250912 17:15:53.821048 1457 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Sep 12 17:15:54.313021 systemd[1]: run-containerd-runc-k8s.io-b4a6e9939f283a53da5e4aff67ac261c641cb28e7cb26ca232a3178f4bb0572c-runc.3Kk1AV.mount: Deactivated successfully.
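The failure above is the expected shape of a machine with updates switched off: the Omaha request is posted to the literal hostname "disabled" (on Flatcar this is typically what SERVER=disabled in /etc/flatcar/update.conf produces), so every attempt dies at DNS resolution. The fetcher arms a 1-second timeout source and, per the timestamps, retries three times roughly 10 seconds apart before declaring the transfer failed. A rough Go sketch of that cadence follows; the function names, URL path, and fixed intervals are illustrative, not update_engine's real API.

```go
// Sketch of the retry cadence visible in the log: short per-attempt timeout,
// a few fixed-interval retries, then a terminal failure.
package main

import (
	"errors"
	"fmt"
	"net/http"
	"time"
)

const maxRetries = 3 // the log shows "retry 1".."retry 3", then failure

func fetchOnce(url string) error {
	client := &http.Client{Timeout: time.Second} // "Setting up timeout source: 1 seconds."
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "Could not resolve host: disabled"
	}
	resp.Body.Close()
	return nil
}

func fetchWithRetries(url string) error {
	for attempt := 0; ; attempt++ {
		if err := fetchOnce(url); err == nil {
			return nil
		}
		if attempt >= maxRetries {
			return errors.New("transfer resulted in an error, 0 bytes downloaded")
		}
		fmt.Printf("No HTTP response, retry %d\n", attempt+1)
		time.Sleep(10 * time.Second) // the logged retries land ~10s apart
	}
}

func main() {
	// "disabled" is not a resolvable host; the path is a placeholder.
	if err := fetchWithRetries("https://disabled/v1/update/"); err != nil {
		fmt.Println("Omaha request network transfer failed:", err)
	}
}
```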
Sep 12 17:15:55.316117 systemd[1]: run-containerd-runc-k8s.io-eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e-runc.x9GzAM.mount: Deactivated successfully.
Sep 12 17:16:03.817220 update_engine[1457]: I20250912 17:16:03.817103 1457 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:16:03.818438 update_engine[1457]: I20250912 17:16:03.817416 1457 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:16:03.818438 update_engine[1457]: I20250912 17:16:03.817689 1457 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:16:03.819045 update_engine[1457]: E20250912 17:16:03.818892 1457 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:16:03.819368 update_engine[1457]: I20250912 17:16:03.818998 1457 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 12 17:16:03.819682 update_engine[1457]: I20250912 17:16:03.819272 1457 omaha_request_action.cc:617] Omaha request response:
Sep 12 17:16:03.820929 update_engine[1457]: E20250912 17:16:03.819792 1457 omaha_request_action.cc:636] Omaha request network transfer failed.
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.819955 1457 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.819975 1457 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.819988 1457 update_attempter.cc:306] Processing Done.
Sep 12 17:16:03.820929 update_engine[1457]: E20250912 17:16:03.820012 1457 update_attempter.cc:619] Update failed.
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820024 1457 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820034 1457 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820044 1457 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820160 1457 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820199 1457 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820211 1457 omaha_request_action.cc:272] Request:
Sep 12 17:16:03.820929 update_engine[1457]: [multi-line Omaha XML error-event request body stripped from this capture]
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820222 1457 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:16:03.820929 update_engine[1457]: I20250912 17:16:03.820572 1457 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:16:03.822541 update_engine[1457]: I20250912 17:16:03.822464 1457 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:16:03.823078 update_engine[1457]: E20250912 17:16:03.823038 1457 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:16:03.823239 update_engine[1457]: I20250912 17:16:03.823204 1457 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823221 1457 omaha_request_action.cc:617] Omaha request response:
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823324 1457 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823334 1457 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823339 1457 update_attempter.cc:306] Processing Done.
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823346 1457 update_attempter.cc:310] Error event sent.
Sep 12 17:16:03.823541 update_engine[1457]: I20250912 17:16:03.823356 1457 update_check_scheduler.cc:74] Next update check in 42m51s
Sep 12 17:16:03.823966 locksmithd[1497]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Sep 12 17:16:03.824450 locksmithd[1497]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Sep 12 17:16:39.059115 systemd[1]: run-containerd-runc-k8s.io-0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd-runc.qdMTKU.mount: Deactivated successfully.
Sep 12 17:16:55.300685 systemd[1]: run-containerd-runc-k8s.io-eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e-runc.uJwld7.mount: Deactivated successfully.
Sep 12 17:16:59.903282 systemd[1]: Started sshd@7-188.245.115.118:22-139.178.89.65:52152.service - OpenSSH per-connection server daemon (139.178.89.65:52152).
Sep 12 17:17:00.887806 sshd[6087]: Accepted publickey for core from 139.178.89.65 port 52152 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:00.890599 sshd[6087]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:00.896860 systemd-logind[1456]: New session 8 of user core.
Sep 12 17:17:00.899013 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 17:17:01.690196 sshd[6087]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:01.694476 systemd[1]: sshd@7-188.245.115.118:22-139.178.89.65:52152.service: Deactivated successfully.
Sep 12 17:17:01.699636 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 17:17:01.706067 systemd-logind[1456]: Session 8 logged out. Waiting for processes to exit.
Sep 12 17:17:01.707354 systemd-logind[1456]: Removed session 8.
Sep 12 17:17:06.774055 systemd[1]: Started sshd@8-188.245.115.118:22-80.94.95.116:55382.service - OpenSSH per-connection server daemon (80.94.95.116:55382).
Sep 12 17:17:06.872183 systemd[1]: Started sshd@9-188.245.115.118:22-139.178.89.65:56102.service - OpenSSH per-connection server daemon (139.178.89.65:56102).
Sep 12 17:17:07.842678 sshd[6104]: Accepted publickey for core from 139.178.89.65 port 56102 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:07.844724 sshd[6104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:07.850219 systemd-logind[1456]: New session 9 of user core.
Sep 12 17:17:07.859148 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:17:08.588075 sshd[6104]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:08.594296 systemd[1]: sshd@9-188.245.115.118:22-139.178.89.65:56102.service: Deactivated successfully.
Sep 12 17:17:08.597683 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:17:08.598736 systemd-logind[1456]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:17:08.600525 systemd-logind[1456]: Removed session 9.
Sep 12 17:17:11.098190 sshd[6102]: Connection closed by authenticating user root 80.94.95.116 port 55382 [preauth]
Sep 12 17:17:11.102850 systemd[1]: sshd@8-188.245.115.118:22-80.94.95.116:55382.service: Deactivated successfully.
Sep 12 17:17:13.762249 systemd[1]: Started sshd@10-188.245.115.118:22-139.178.89.65:36584.service - OpenSSH per-connection server daemon (139.178.89.65:36584).
Sep 12 17:17:14.739325 sshd[6142]: Accepted publickey for core from 139.178.89.65 port 36584 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:14.741872 sshd[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:14.747635 systemd-logind[1456]: New session 10 of user core.
Sep 12 17:17:14.756150 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 17:17:15.491682 sshd[6142]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:15.496765 systemd-logind[1456]: Session 10 logged out. Waiting for processes to exit.
Sep 12 17:17:15.497181 systemd[1]: sshd@10-188.245.115.118:22-139.178.89.65:36584.service: Deactivated successfully.
Sep 12 17:17:15.499463 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 17:17:15.501267 systemd-logind[1456]: Removed session 10.
Sep 12 17:17:15.672343 systemd[1]: Started sshd@11-188.245.115.118:22-139.178.89.65:36590.service - OpenSSH per-connection server daemon (139.178.89.65:36590).
Sep 12 17:17:16.663906 sshd[6179]: Accepted publickey for core from 139.178.89.65 port 36590 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:16.666485 sshd[6179]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:16.674188 systemd-logind[1456]: New session 11 of user core.
Sep 12 17:17:16.682547 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 17:17:17.467227 sshd[6179]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:17.472624 systemd[1]: sshd@11-188.245.115.118:22-139.178.89.65:36590.service: Deactivated successfully.
Sep 12 17:17:17.476807 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 17:17:17.477829 systemd-logind[1456]: Session 11 logged out. Waiting for processes to exit.
Sep 12 17:17:17.480295 systemd-logind[1456]: Removed session 11.
Sep 12 17:17:17.648262 systemd[1]: Started sshd@12-188.245.115.118:22-139.178.89.65:36606.service - OpenSSH per-connection server daemon (139.178.89.65:36606).
Sep 12 17:17:18.647369 sshd[6190]: Accepted publickey for core from 139.178.89.65 port 36606 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:18.650109 sshd[6190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:18.658045 systemd-logind[1456]: New session 12 of user core.
Sep 12 17:17:18.663088 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 17:17:19.409585 sshd[6190]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:19.414761 systemd[1]: sshd@12-188.245.115.118:22-139.178.89.65:36606.service: Deactivated successfully.
Sep 12 17:17:19.416789 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 17:17:19.419219 systemd-logind[1456]: Session 12 logged out. Waiting for processes to exit.
Sep 12 17:17:19.420304 systemd-logind[1456]: Removed session 12.
Sep 12 17:17:24.597389 systemd[1]: Started sshd@13-188.245.115.118:22-139.178.89.65:47770.service - OpenSSH per-connection server daemon (139.178.89.65:47770).
Sep 12 17:17:25.303778 systemd[1]: run-containerd-runc-k8s.io-eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e-runc.viMMUc.mount: Deactivated successfully.
Sep 12 17:17:25.644150 sshd[6227]: Accepted publickey for core from 139.178.89.65 port 47770 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:25.646643 sshd[6227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:25.652581 systemd-logind[1456]: New session 13 of user core.
Sep 12 17:17:25.657112 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 17:17:26.442403 sshd[6227]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:26.446017 systemd-logind[1456]: Session 13 logged out. Waiting for processes to exit.
Sep 12 17:17:26.446392 systemd[1]: sshd@13-188.245.115.118:22-139.178.89.65:47770.service: Deactivated successfully.
Sep 12 17:17:26.451682 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 17:17:26.456729 systemd-logind[1456]: Removed session 13.
Sep 12 17:17:26.613169 systemd[1]: Started sshd@14-188.245.115.118:22-139.178.89.65:47778.service - OpenSSH per-connection server daemon (139.178.89.65:47778).
Sep 12 17:17:27.601812 sshd[6260]: Accepted publickey for core from 139.178.89.65 port 47778 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:27.604746 sshd[6260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:27.611699 systemd-logind[1456]: New session 14 of user core.
Sep 12 17:17:27.618046 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 17:17:28.497796 sshd[6260]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:28.503156 systemd-logind[1456]: Session 14 logged out. Waiting for processes to exit.
Sep 12 17:17:28.504288 systemd[1]: sshd@14-188.245.115.118:22-139.178.89.65:47778.service: Deactivated successfully.
Sep 12 17:17:28.506878 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 17:17:28.508552 systemd-logind[1456]: Removed session 14.
Sep 12 17:17:28.679461 systemd[1]: Started sshd@15-188.245.115.118:22-139.178.89.65:47782.service - OpenSSH per-connection server daemon (139.178.89.65:47782).
Sep 12 17:17:29.664616 sshd[6271]: Accepted publickey for core from 139.178.89.65 port 47782 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:29.667260 sshd[6271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:29.673613 systemd-logind[1456]: New session 15 of user core.
Sep 12 17:17:29.679408 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 17:17:30.932643 sshd[6271]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:30.940904 systemd[1]: sshd@15-188.245.115.118:22-139.178.89.65:47782.service: Deactivated successfully.
Sep 12 17:17:30.946409 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 17:17:30.948560 systemd-logind[1456]: Session 15 logged out. Waiting for processes to exit.
Sep 12 17:17:30.953892 systemd-logind[1456]: Removed session 15.
Sep 12 17:17:31.110170 systemd[1]: Started sshd@16-188.245.115.118:22-139.178.89.65:42118.service - OpenSSH per-connection server daemon (139.178.89.65:42118).
Sep 12 17:17:32.106034 sshd[6290]: Accepted publickey for core from 139.178.89.65 port 42118 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:32.108797 sshd[6290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:32.118022 systemd-logind[1456]: New session 16 of user core.
Sep 12 17:17:32.121319 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 17:17:33.054959 sshd[6290]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:33.061021 systemd-logind[1456]: Session 16 logged out. Waiting for processes to exit.
Sep 12 17:17:33.061419 systemd[1]: sshd@16-188.245.115.118:22-139.178.89.65:42118.service: Deactivated successfully.
Sep 12 17:17:33.065500 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 17:17:33.068455 systemd-logind[1456]: Removed session 16.
Sep 12 17:17:33.226199 systemd[1]: Started sshd@17-188.245.115.118:22-139.178.89.65:42130.service - OpenSSH per-connection server daemon (139.178.89.65:42130).
Sep 12 17:17:34.206088 sshd[6301]: Accepted publickey for core from 139.178.89.65 port 42130 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:34.208274 sshd[6301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:34.215694 systemd-logind[1456]: New session 17 of user core.
Sep 12 17:17:34.221019 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 17:17:34.960365 sshd[6301]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:34.965417 systemd[1]: sshd@17-188.245.115.118:22-139.178.89.65:42130.service: Deactivated successfully.
Sep 12 17:17:34.967575 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 17:17:34.970526 systemd-logind[1456]: Session 17 logged out. Waiting for processes to exit.
Sep 12 17:17:34.973181 systemd-logind[1456]: Removed session 17.
Sep 12 17:17:39.057469 systemd[1]: run-containerd-runc-k8s.io-0a0c1b12c6f481a98a1fc5e8af60cae974d88dff97820a7ac608e67dd3c0d7dd-runc.9xldbv.mount: Deactivated successfully.
Sep 12 17:17:40.131260 systemd[1]: Started sshd@18-188.245.115.118:22-139.178.89.65:37464.service - OpenSSH per-connection server daemon (139.178.89.65:37464).
Sep 12 17:17:41.117012 sshd[6338]: Accepted publickey for core from 139.178.89.65 port 37464 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:41.118737 sshd[6338]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:41.124913 systemd-logind[1456]: New session 18 of user core.
Sep 12 17:17:41.128073 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 17:17:41.872537 sshd[6338]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:41.878474 systemd[1]: sshd@18-188.245.115.118:22-139.178.89.65:37464.service: Deactivated successfully.
Sep 12 17:17:41.881913 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 17:17:41.886336 systemd-logind[1456]: Session 18 logged out. Waiting for processes to exit.
Sep 12 17:17:41.887406 systemd-logind[1456]: Removed session 18.
Sep 12 17:17:47.057119 systemd[1]: Started sshd@19-188.245.115.118:22-139.178.89.65:37476.service - OpenSSH per-connection server daemon (139.178.89.65:37476).
Sep 12 17:17:48.103136 sshd[6371]: Accepted publickey for core from 139.178.89.65 port 37476 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:48.105573 sshd[6371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:48.112150 systemd-logind[1456]: New session 19 of user core.
Sep 12 17:17:48.116033 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 17:17:48.901140 sshd[6371]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:48.907060 systemd[1]: sshd@19-188.245.115.118:22-139.178.89.65:37476.service: Deactivated successfully.
Sep 12 17:17:48.910164 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 17:17:48.911475 systemd-logind[1456]: Session 19 logged out. Waiting for processes to exit.
Sep 12 17:17:48.913223 systemd-logind[1456]: Removed session 19.
Sep 12 17:18:03.692703 systemd[1]: cri-containerd-286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7.scope: Deactivated successfully.
Sep 12 17:18:03.693470 systemd[1]: cri-containerd-286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7.scope: Consumed 23.756s CPU time.
Sep 12 17:18:03.722443 containerd[1476]: time="2025-09-12T17:18:03.722240841Z" level=info msg="shim disconnected" id=286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7 namespace=k8s.io
Sep 12 17:18:03.722443 containerd[1476]: time="2025-09-12T17:18:03.722296201Z" level=warning msg="cleaning up after shim disconnected" id=286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7 namespace=k8s.io
Sep 12 17:18:03.722443 containerd[1476]: time="2025-09-12T17:18:03.722304601Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:03.722448 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7-rootfs.mount: Deactivated successfully.
Sep 12 17:18:03.809953 kubelet[2581]: I0912 17:18:03.809888 2581 scope.go:117] "RemoveContainer" containerID="286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7"
Sep 12 17:18:03.812462 containerd[1476]: time="2025-09-12T17:18:03.812345239Z" level=info msg="CreateContainer within sandbox \"5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 12 17:18:03.829979 containerd[1476]: time="2025-09-12T17:18:03.829759047Z" level=info msg="CreateContainer within sandbox \"5e33970f5127ded1869d7113c18762937b978a2358ed8a66767a6abe05ddbfc7\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23\""
Sep 12 17:18:03.830433 containerd[1476]: time="2025-09-12T17:18:03.830410695Z" level=info msg="StartContainer for \"c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23\""
Sep 12 17:18:03.863987 systemd[1]: Started cri-containerd-c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23.scope - libcontainer container c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23.
Sep 12 17:18:03.890046 containerd[1476]: time="2025-09-12T17:18:03.890001048Z" level=info msg="StartContainer for \"c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23\" returns successfully"
Sep 12 17:18:04.104951 kubelet[2581]: E0912 17:18:04.104794 2581 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:58534->10.0.0.2:2379: read: connection timed out"
Sep 12 17:18:04.854307 systemd[1]: cri-containerd-2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761.scope: Deactivated successfully.
Sep 12 17:18:04.855070 systemd[1]: cri-containerd-2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761.scope: Consumed 6.100s CPU time, 18.1M memory peak, 0B memory swap peak.
Sep 12 17:18:04.882985 containerd[1476]: time="2025-09-12T17:18:04.881389225Z" level=info msg="shim disconnected" id=2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761 namespace=k8s.io
Sep 12 17:18:04.882985 containerd[1476]: time="2025-09-12T17:18:04.881442825Z" level=warning msg="cleaning up after shim disconnected" id=2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761 namespace=k8s.io
Sep 12 17:18:04.882985 containerd[1476]: time="2025-09-12T17:18:04.881454425Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:04.883561 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761-rootfs.mount: Deactivated successfully.
Sep 12 17:18:05.806332 kubelet[2581]: I0912 17:18:05.806283 2581 scope.go:117] "RemoveContainer" containerID="2ba8d609154cc3d173349fe29146cfc66846d3ac8d44f70e76a9854049637761"
Sep 12 17:18:05.808871 containerd[1476]: time="2025-09-12T17:18:05.808614950Z" level=info msg="CreateContainer within sandbox \"52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 12 17:18:05.825037 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1967093019.mount: Deactivated successfully.
Sep 12 17:18:05.826319 containerd[1476]: time="2025-09-12T17:18:05.826101679Z" level=info msg="CreateContainer within sandbox \"52ddb3e6d7f90345dd1c8e779349aa269775ef930c45ba12af70657448a6d775\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"5bb2bf1e4b086a5a634db36c5fb958814460a3f761b7c074bfba3c1d23b2e761\""
Sep 12 17:18:05.827335 containerd[1476]: time="2025-09-12T17:18:05.827163332Z" level=info msg="StartContainer for \"5bb2bf1e4b086a5a634db36c5fb958814460a3f761b7c074bfba3c1d23b2e761\""
Sep 12 17:18:05.867031 systemd[1]: Started cri-containerd-5bb2bf1e4b086a5a634db36c5fb958814460a3f761b7c074bfba3c1d23b2e761.scope - libcontainer container 5bb2bf1e4b086a5a634db36c5fb958814460a3f761b7c074bfba3c1d23b2e761.
Sep 12 17:18:05.903274 containerd[1476]: time="2025-09-12T17:18:05.903231441Z" level=info msg="StartContainer for \"5bb2bf1e4b086a5a634db36c5fb958814460a3f761b7c074bfba3c1d23b2e761\" returns successfully"
Sep 12 17:18:08.208554 kubelet[2581]: E0912 17:18:08.206207 2581 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:58348->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-e-c5bf4513f4.1864988b15de578d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-e-c5bf4513f4,UID:23ceac683a3ba4e96382b176df7bd8b1,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-e-c5bf4513f4,},FirstTimestamp:2025-09-12 17:17:57.716105101 +0000 UTC m=+219.271928932,LastTimestamp:2025-09-12 17:17:57.716105101 +0000 UTC m=+219.271928932,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-e-c5bf4513f4,}"
Sep 12 17:18:09.972377 systemd[1]: cri-containerd-3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a.scope: Deactivated successfully.
Sep 12 17:18:09.972675 systemd[1]: cri-containerd-3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a.scope: Consumed 4.212s CPU time, 16.0M memory peak, 0B memory swap peak.
Sep 12 17:18:09.999951 containerd[1476]: time="2025-09-12T17:18:09.999876770Z" level=info msg="shim disconnected" id=3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a namespace=k8s.io
Sep 12 17:18:10.000253 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a-rootfs.mount: Deactivated successfully.
Sep 12 17:18:10.000532 containerd[1476]: time="2025-09-12T17:18:10.000484057Z" level=warning msg="cleaning up after shim disconnected" id=3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a namespace=k8s.io
Sep 12 17:18:10.000532 containerd[1476]: time="2025-09-12T17:18:10.000513258Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:10.824292 kubelet[2581]: I0912 17:18:10.824265 2581 scope.go:117] "RemoveContainer" containerID="3ab3259ec9a683c58b635fef9e479d5726050b7f588491cf34de91d70eda553a"
Sep 12 17:18:10.826684 containerd[1476]: time="2025-09-12T17:18:10.826649476Z" level=info msg="CreateContainer within sandbox \"e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 12 17:18:10.843613 containerd[1476]: time="2025-09-12T17:18:10.843532917Z" level=info msg="CreateContainer within sandbox \"e8bbf014ec2eca08466943c1433a4dea65eac422e921fea294167e1b0f28fe65\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"651438b2334fc98312f35314bfe2c3bbfd35ed14d146e9fbf5dd327ac3a67ec1\""
Sep 12 17:18:10.845494 containerd[1476]: time="2025-09-12T17:18:10.845021895Z" level=info msg="StartContainer for \"651438b2334fc98312f35314bfe2c3bbfd35ed14d146e9fbf5dd327ac3a67ec1\""
Sep 12 17:18:10.848475 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3376657295.mount: Deactivated successfully.
Sep 12 17:18:10.876035 systemd[1]: Started cri-containerd-651438b2334fc98312f35314bfe2c3bbfd35ed14d146e9fbf5dd327ac3a67ec1.scope - libcontainer container 651438b2334fc98312f35314bfe2c3bbfd35ed14d146e9fbf5dd327ac3a67ec1.
Sep 12 17:18:10.907894 containerd[1476]: time="2025-09-12T17:18:10.907797724Z" level=info msg="StartContainer for \"651438b2334fc98312f35314bfe2c3bbfd35ed14d146e9fbf5dd327ac3a67ec1\" returns successfully"
Sep 12 17:18:12.040557 kubelet[2581]: I0912 17:18:12.040169 2581 status_manager.go:895] "Failed to get status for pod" podUID="46b1f06e-7701-4bc5-b45a-e7496d8fb665" pod="tigera-operator/tigera-operator-755d956888-2pj5v" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:58468->10.0.0.2:2379: read: connection timed out"
Sep 12 17:18:14.105845 kubelet[2581]: E0912 17:18:14.105534 2581 controller.go:195] "Failed to update lease" err="Put \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 12 17:18:15.106877 systemd[1]: cri-containerd-c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23.scope: Deactivated successfully.
Sep 12 17:18:15.128690 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23-rootfs.mount: Deactivated successfully.
Sep 12 17:18:15.136046 containerd[1476]: time="2025-09-12T17:18:15.135950733Z" level=info msg="shim disconnected" id=c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23 namespace=k8s.io
Sep 12 17:18:15.136046 containerd[1476]: time="2025-09-12T17:18:15.136010013Z" level=warning msg="cleaning up after shim disconnected" id=c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23 namespace=k8s.io
Sep 12 17:18:15.136046 containerd[1476]: time="2025-09-12T17:18:15.136022813Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:15.846180 kubelet[2581]: I0912 17:18:15.846070 2581 scope.go:117] "RemoveContainer" containerID="286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7"
Sep 12 17:18:15.847275 kubelet[2581]: I0912 17:18:15.846895 2581 scope.go:117] "RemoveContainer" containerID="c95402806939b4b62610457a23cac47a3c494b9808f7210cd4af85113f3faf23"
Sep 12 17:18:15.847275 kubelet[2581]: E0912 17:18:15.847190 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-755d956888-2pj5v_tigera-operator(46b1f06e-7701-4bc5-b45a-e7496d8fb665)\"" pod="tigera-operator/tigera-operator-755d956888-2pj5v" podUID="46b1f06e-7701-4bc5-b45a-e7496d8fb665"
Sep 12 17:18:15.848598 containerd[1476]: time="2025-09-12T17:18:15.848322539Z" level=info msg="RemoveContainer for \"286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7\""
Sep 12 17:18:15.852601 containerd[1476]: time="2025-09-12T17:18:15.852574429Z" level=info msg="RemoveContainer for \"286214e90008cc36dde4ec1138ccdea0d15d2fb1a2de79ed8623c9a153a0acc7\" returns successfully"
Sep 12 17:18:24.107097 kubelet[2581]: E0912 17:18:24.106657 2581 controller.go:195] "Failed to update lease" err="Put \"https://188.245.115.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-e-c5bf4513f4?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 12 17:18:25.296735 systemd[1]: run-containerd-runc-k8s.io-eededcb6398b77c89e927443307febc30ed6f91f91bded08a641dc76e7562a2e-runc.xM83UY.mount: Deactivated successfully.
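The pod_workers error above shows where the tigera-operator restart loop is headed: CrashLoopBackOff starting at "back-off 10s". Upstream kubelet doubles that delay on each subsequent crash up to a five-minute cap, so repeated failures back off as 10s, 20s, 40s, and so on. A simplified Go model of that schedule (the doubling-with-cap behavior is kubelet's documented policy; the code itself is illustrative, not kubelet's implementation):

```go
// Sketch: kubelet-style restart back-off, starting at the 10s seen in the
// log and doubling per crash up to a 5-minute ceiling.
package main

import (
	"fmt"
	"time"
)

func restartBackoff(crashCount int) time.Duration {
	backoff := 10 * time.Second        // initial delay, as logged
	const maxBackoff = 5 * time.Minute // kubelet's cap
	for i := 1; i < crashCount; i++ {
		backoff *= 2
		if backoff > maxBackoff {
			return maxBackoff
		}
	}
	return backoff
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("crash %d -> wait %v\n", n, restartBackoff(n))
	}
	// crash 1 -> 10s, crash 2 -> 20s, ... crash 6 -> 5m0s
}
```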