Aug 13 00:16:30.889554 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Aug 13 00:16:30.889587 kernel: Linux version 6.6.100-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Tue Aug 12 22:21:53 -00 2025 Aug 13 00:16:30.889600 kernel: KASLR enabled Aug 13 00:16:30.889606 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Aug 13 00:16:30.889612 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18 Aug 13 00:16:30.889618 kernel: random: crng init done Aug 13 00:16:30.889625 kernel: ACPI: Early table checksum verification disabled Aug 13 00:16:30.889630 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Aug 13 00:16:30.889637 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Aug 13 00:16:30.889645 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889651 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889657 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889663 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889669 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889676 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889684 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889690 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889697 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Aug 13 00:16:30.889704 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Aug 13 00:16:30.889710 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Aug 13 00:16:30.889716 kernel: NUMA: Failed to initialise from firmware Aug 13 00:16:30.889723 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Aug 13 00:16:30.889729 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff] Aug 13 00:16:30.889735 kernel: Zone ranges: Aug 13 00:16:30.889741 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Aug 13 00:16:30.889750 kernel: DMA32 empty Aug 13 00:16:30.889756 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Aug 13 00:16:30.889762 kernel: Movable zone start for each node Aug 13 00:16:30.889768 kernel: Early memory node ranges Aug 13 00:16:30.889775 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff] Aug 13 00:16:30.889781 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Aug 13 00:16:30.889787 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Aug 13 00:16:30.889794 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Aug 13 00:16:30.889800 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Aug 13 00:16:30.889806 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Aug 13 00:16:30.889812 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff] Aug 13 00:16:30.889819 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Aug 13 00:16:30.889826 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Aug 13 00:16:30.889833 kernel: psci: probing for conduit method from ACPI. 
Aug 13 00:16:30.889839 kernel: psci: PSCIv1.1 detected in firmware. Aug 13 00:16:30.889848 kernel: psci: Using standard PSCI v0.2 function IDs Aug 13 00:16:30.889855 kernel: psci: Trusted OS migration not required Aug 13 00:16:30.889861 kernel: psci: SMC Calling Convention v1.1 Aug 13 00:16:30.889870 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Aug 13 00:16:30.889878 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Aug 13 00:16:30.889885 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Aug 13 00:16:30.889892 kernel: pcpu-alloc: [0] 0 [0] 1 Aug 13 00:16:30.889899 kernel: Detected PIPT I-cache on CPU0 Aug 13 00:16:30.889905 kernel: CPU features: detected: GIC system register CPU interface Aug 13 00:16:30.889912 kernel: CPU features: detected: Hardware dirty bit management Aug 13 00:16:30.889919 kernel: CPU features: detected: Spectre-v4 Aug 13 00:16:30.889925 kernel: CPU features: detected: Spectre-BHB Aug 13 00:16:30.889932 kernel: CPU features: kernel page table isolation forced ON by KASLR Aug 13 00:16:30.889940 kernel: CPU features: detected: Kernel page table isolation (KPTI) Aug 13 00:16:30.889947 kernel: CPU features: detected: ARM erratum 1418040 Aug 13 00:16:30.889954 kernel: CPU features: detected: SSBS not fully self-synchronizing Aug 13 00:16:30.889961 kernel: alternatives: applying boot alternatives Aug 13 00:16:30.889969 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=2f9df6e9e6c671c457040a64675390bbff42294b08c628cd2dc472ed8120146a Aug 13 00:16:30.889976 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Aug 13 00:16:30.889983 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Aug 13 00:16:30.889989 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 00:16:30.889996 kernel: Fallback order for Node 0: 0 Aug 13 00:16:30.890003 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000 Aug 13 00:16:30.890009 kernel: Policy zone: Normal Aug 13 00:16:30.890018 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Aug 13 00:16:30.890025 kernel: software IO TLB: area num 2. Aug 13 00:16:30.890031 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB) Aug 13 00:16:30.890038 kernel: Memory: 3882808K/4096000K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39424K init, 897K bss, 213192K reserved, 0K cma-reserved) Aug 13 00:16:30.890045 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Aug 13 00:16:30.890052 kernel: rcu: Preemptible hierarchical RCU implementation. Aug 13 00:16:30.890060 kernel: rcu: RCU event tracing is enabled. Aug 13 00:16:30.890066 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Aug 13 00:16:30.890073 kernel: Trampoline variant of Tasks RCU enabled. Aug 13 00:16:30.890081 kernel: Tracing variant of Tasks RCU enabled. Aug 13 00:16:30.890087 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Aug 13 00:16:30.890096 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Aug 13 00:16:30.890103 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Aug 13 00:16:30.890109 kernel: GICv3: 256 SPIs implemented Aug 13 00:16:30.890116 kernel: GICv3: 0 Extended SPIs implemented Aug 13 00:16:30.890123 kernel: Root IRQ handler: gic_handle_irq Aug 13 00:16:30.890129 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Aug 13 00:16:30.890136 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Aug 13 00:16:30.890143 kernel: ITS [mem 0x08080000-0x0809ffff] Aug 13 00:16:30.890150 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1) Aug 13 00:16:30.890157 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1) Aug 13 00:16:30.890163 kernel: GICv3: using LPI property table @0x00000001000e0000 Aug 13 00:16:30.890170 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000 Aug 13 00:16:30.890179 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Aug 13 00:16:30.890185 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Aug 13 00:16:30.890192 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Aug 13 00:16:30.890199 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Aug 13 00:16:30.890206 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Aug 13 00:16:30.890213 kernel: Console: colour dummy device 80x25 Aug 13 00:16:30.890220 kernel: ACPI: Core revision 20230628 Aug 13 00:16:30.890227 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Aug 13 00:16:30.890234 kernel: pid_max: default: 32768 minimum: 301 Aug 13 00:16:30.890241 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Aug 13 00:16:30.890250 kernel: landlock: Up and running. Aug 13 00:16:30.890257 kernel: SELinux: Initializing. Aug 13 00:16:30.890264 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 13 00:16:30.890271 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 13 00:16:30.890278 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Aug 13 00:16:30.890285 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Aug 13 00:16:30.890292 kernel: rcu: Hierarchical SRCU implementation. Aug 13 00:16:30.890299 kernel: rcu: Max phase no-delay instances is 400. Aug 13 00:16:30.890306 kernel: Platform MSI: ITS@0x8080000 domain created Aug 13 00:16:30.890315 kernel: PCI/MSI: ITS@0x8080000 domain created Aug 13 00:16:30.890323 kernel: Remapping and enabling EFI services. Aug 13 00:16:30.890330 kernel: smp: Bringing up secondary CPUs ... Aug 13 00:16:30.890337 kernel: Detected PIPT I-cache on CPU1 Aug 13 00:16:30.890344 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Aug 13 00:16:30.890352 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000 Aug 13 00:16:30.890359 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Aug 13 00:16:30.890365 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Aug 13 00:16:30.890411 kernel: smp: Brought up 1 node, 2 CPUs Aug 13 00:16:30.890420 kernel: SMP: Total of 2 processors activated. 
Aug 13 00:16:30.890429 kernel: CPU features: detected: 32-bit EL0 Support Aug 13 00:16:30.890447 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Aug 13 00:16:30.890461 kernel: CPU features: detected: Common not Private translations Aug 13 00:16:30.890471 kernel: CPU features: detected: CRC32 instructions Aug 13 00:16:30.890478 kernel: CPU features: detected: Enhanced Virtualization Traps Aug 13 00:16:30.890485 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Aug 13 00:16:30.890493 kernel: CPU features: detected: LSE atomic instructions Aug 13 00:16:30.890500 kernel: CPU features: detected: Privileged Access Never Aug 13 00:16:30.890508 kernel: CPU features: detected: RAS Extension Support Aug 13 00:16:30.890517 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Aug 13 00:16:30.890525 kernel: CPU: All CPU(s) started at EL1 Aug 13 00:16:30.890532 kernel: alternatives: applying system-wide alternatives Aug 13 00:16:30.890540 kernel: devtmpfs: initialized Aug 13 00:16:30.890547 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Aug 13 00:16:30.890555 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Aug 13 00:16:30.890562 kernel: pinctrl core: initialized pinctrl subsystem Aug 13 00:16:30.890571 kernel: SMBIOS 3.0.0 present. Aug 13 00:16:30.890579 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Aug 13 00:16:30.890586 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Aug 13 00:16:30.890594 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Aug 13 00:16:30.890601 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Aug 13 00:16:30.890609 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Aug 13 00:16:30.890616 kernel: audit: initializing netlink subsys (disabled) Aug 13 00:16:30.890624 kernel: audit: type=2000 audit(0.018:1): state=initialized audit_enabled=0 res=1 Aug 13 00:16:30.890631 kernel: thermal_sys: Registered thermal governor 'step_wise' Aug 13 00:16:30.890640 kernel: cpuidle: using governor menu Aug 13 00:16:30.890647 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Aug 13 00:16:30.890655 kernel: ASID allocator initialised with 32768 entries Aug 13 00:16:30.890662 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Aug 13 00:16:30.890670 kernel: Serial: AMBA PL011 UART driver Aug 13 00:16:30.890677 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Aug 13 00:16:30.890685 kernel: Modules: 0 pages in range for non-PLT usage Aug 13 00:16:30.890692 kernel: Modules: 509008 pages in range for PLT usage Aug 13 00:16:30.890700 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Aug 13 00:16:30.890710 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Aug 13 00:16:30.890717 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Aug 13 00:16:30.890724 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Aug 13 00:16:30.890732 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Aug 13 00:16:30.890739 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Aug 13 00:16:30.890746 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Aug 13 00:16:30.890754 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Aug 13 00:16:30.890761 kernel: ACPI: Added _OSI(Module Device) Aug 13 00:16:30.890768 kernel: ACPI: Added _OSI(Processor Device) Aug 13 00:16:30.890777 kernel: ACPI: Added _OSI(Processor Aggregator Device) Aug 13 00:16:30.890784 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Aug 13 00:16:30.890792 kernel: ACPI: Interpreter enabled Aug 13 00:16:30.890800 kernel: ACPI: Using GIC for interrupt routing Aug 13 00:16:30.890807 kernel: ACPI: MCFG table detected, 1 entries Aug 13 00:16:30.890814 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Aug 13 00:16:30.890821 kernel: printk: console [ttyAMA0] enabled Aug 13 00:16:30.890829 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Aug 13 00:16:30.891000 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Aug 13 00:16:30.891082 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Aug 13 00:16:30.891150 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Aug 13 00:16:30.891220 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Aug 13 00:16:30.891289 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Aug 13 00:16:30.891299 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Aug 13 00:16:30.891307 kernel: PCI host bridge to bus 0000:00 Aug 13 00:16:30.891402 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Aug 13 00:16:30.893572 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Aug 13 00:16:30.893665 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Aug 13 00:16:30.893726 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Aug 13 00:16:30.893815 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 Aug 13 00:16:30.893898 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 Aug 13 00:16:30.893970 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff] Aug 13 00:16:30.894046 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref] Aug 13 00:16:30.894123 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894189 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff] Aug 13 
00:16:30.894265 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894332 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff] Aug 13 00:16:30.894428 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894523 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff] Aug 13 00:16:30.894598 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894664 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff] Aug 13 00:16:30.894741 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894808 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff] Aug 13 00:16:30.894880 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.894952 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff] Aug 13 00:16:30.895026 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.895093 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff] Aug 13 00:16:30.895163 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.895228 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff] Aug 13 00:16:30.895304 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 Aug 13 00:16:30.895408 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff] Aug 13 00:16:30.898683 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 Aug 13 00:16:30.898798 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007] Aug 13 00:16:30.898883 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 Aug 13 00:16:30.898956 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff] Aug 13 00:16:30.899027 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] Aug 13 00:16:30.899096 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] Aug 13 00:16:30.899186 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 Aug 13 00:16:30.899258 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit] Aug 13 00:16:30.899344 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 Aug 13 00:16:30.899451 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff] Aug 13 00:16:30.899531 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref] Aug 13 00:16:30.899611 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 Aug 13 00:16:30.899682 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref] Aug 13 00:16:30.899764 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 Aug 13 00:16:30.899834 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref] Aug 13 00:16:30.899912 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 Aug 13 00:16:30.899982 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff] Aug 13 00:16:30.900051 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref] Aug 13 00:16:30.900131 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 Aug 13 00:16:30.900203 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff] Aug 13 00:16:30.900273 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref] Aug 13 00:16:30.900341 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] Aug 13 00:16:30.900429 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Aug 13 00:16:30.902640 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit 
pref] to [bus 01] add_size 100000 add_align 100000 Aug 13 00:16:30.902721 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Aug 13 00:16:30.902806 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Aug 13 00:16:30.902874 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Aug 13 00:16:30.902940 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Aug 13 00:16:30.903011 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Aug 13 00:16:30.903078 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Aug 13 00:16:30.903144 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Aug 13 00:16:30.903214 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Aug 13 00:16:30.903280 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Aug 13 00:16:30.903361 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Aug 13 00:16:30.904524 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Aug 13 00:16:30.904631 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Aug 13 00:16:30.904698 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000 Aug 13 00:16:30.904770 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Aug 13 00:16:30.904837 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Aug 13 00:16:30.904901 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Aug 13 00:16:30.904977 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Aug 13 00:16:30.905043 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Aug 13 00:16:30.905108 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Aug 13 00:16:30.905176 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Aug 13 00:16:30.905240 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Aug 13 00:16:30.905303 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Aug 13 00:16:30.905391 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Aug 13 00:16:30.906603 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Aug 13 00:16:30.906702 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Aug 13 00:16:30.906773 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff] Aug 13 00:16:30.906840 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref] Aug 13 00:16:30.906910 kernel: pci 
0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff] Aug 13 00:16:30.906976 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref] Aug 13 00:16:30.907044 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff] Aug 13 00:16:30.907117 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref] Aug 13 00:16:30.907185 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff] Aug 13 00:16:30.907250 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref] Aug 13 00:16:30.907318 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff] Aug 13 00:16:30.907406 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref] Aug 13 00:16:30.907496 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff] Aug 13 00:16:30.907566 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref] Aug 13 00:16:30.907644 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff] Aug 13 00:16:30.907711 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 13 00:16:30.907779 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff] Aug 13 00:16:30.907845 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 13 00:16:30.907910 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff] Aug 13 00:16:30.907975 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref] Aug 13 00:16:30.908045 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref] Aug 13 00:16:30.908114 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff] Aug 13 00:16:30.908179 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff] Aug 13 00:16:30.908244 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff] Aug 13 00:16:30.908308 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff] Aug 13 00:16:30.908381 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff] Aug 13 00:16:30.910552 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff] Aug 13 00:16:30.910665 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff] Aug 13 00:16:30.910738 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff] Aug 13 00:16:30.910811 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff] Aug 13 00:16:30.910880 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff] Aug 13 00:16:30.910945 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff] Aug 13 00:16:30.911013 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff] Aug 13 00:16:30.911078 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff] Aug 13 00:16:30.911146 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff] Aug 13 00:16:30.911212 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff] Aug 13 00:16:30.911280 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff] Aug 13 00:16:30.911350 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff] Aug 13 00:16:30.912526 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff] Aug 13 00:16:30.912613 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff] Aug 13 00:16:30.912684 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007] Aug 13 00:16:30.912759 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 
0x10000000-0x1007ffff pref] Aug 13 00:16:30.912827 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] Aug 13 00:16:30.912895 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff] Aug 13 00:16:30.912963 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Aug 13 00:16:30.913035 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Aug 13 00:16:30.913099 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Aug 13 00:16:30.913164 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Aug 13 00:16:30.913236 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit] Aug 13 00:16:30.913314 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Aug 13 00:16:30.913402 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Aug 13 00:16:30.914610 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Aug 13 00:16:30.914697 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Aug 13 00:16:30.914784 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref] Aug 13 00:16:30.914853 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff] Aug 13 00:16:30.914922 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Aug 13 00:16:30.914987 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Aug 13 00:16:30.915061 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Aug 13 00:16:30.915123 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Aug 13 00:16:30.915196 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref] Aug 13 00:16:30.915262 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Aug 13 00:16:30.915327 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Aug 13 00:16:30.915405 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Aug 13 00:16:30.915485 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Aug 13 00:16:30.916614 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref] Aug 13 00:16:30.916701 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Aug 13 00:16:30.916768 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Aug 13 00:16:30.916834 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Aug 13 00:16:30.916898 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Aug 13 00:16:30.916972 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref] Aug 13 00:16:30.917040 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff] Aug 13 00:16:30.917108 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Aug 13 00:16:30.917175 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Aug 13 00:16:30.917246 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Aug 13 00:16:30.917313 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Aug 13 00:16:30.917409 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref] Aug 13 00:16:30.918225 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref] Aug 13 00:16:30.918311 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff] Aug 13 00:16:30.918402 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Aug 13 00:16:30.919600 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Aug 13 00:16:30.919685 kernel: pci 0000:00:02.6: 
bridge window [mem 0x10c00000-0x10dfffff] Aug 13 00:16:30.919771 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 13 00:16:30.919842 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Aug 13 00:16:30.919911 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Aug 13 00:16:30.919981 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Aug 13 00:16:30.920046 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 13 00:16:30.920114 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Aug 13 00:16:30.920177 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Aug 13 00:16:30.920244 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Aug 13 00:16:30.920313 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Aug 13 00:16:30.920395 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Aug 13 00:16:30.920478 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Aug 13 00:16:30.920540 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Aug 13 00:16:30.920614 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Aug 13 00:16:30.920677 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Aug 13 00:16:30.920740 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Aug 13 00:16:30.920813 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Aug 13 00:16:30.920873 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Aug 13 00:16:30.920930 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Aug 13 00:16:30.921000 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Aug 13 00:16:30.921063 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Aug 13 00:16:30.921124 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Aug 13 00:16:30.921207 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Aug 13 00:16:30.921274 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Aug 13 00:16:30.922503 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Aug 13 00:16:30.922646 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Aug 13 00:16:30.922716 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Aug 13 00:16:30.922776 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Aug 13 00:16:30.922846 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Aug 13 00:16:30.922912 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Aug 13 00:16:30.922974 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Aug 13 00:16:30.923047 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Aug 13 00:16:30.923112 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Aug 13 00:16:30.923180 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 13 00:16:30.923251 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Aug 13 00:16:30.923315 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Aug 13 00:16:30.923391 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 13 00:16:30.924610 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Aug 13 00:16:30.924699 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Aug 13 00:16:30.924763 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Aug 13 00:16:30.924781 
kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Aug 13 00:16:30.924789 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Aug 13 00:16:30.924797 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Aug 13 00:16:30.924806 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Aug 13 00:16:30.924814 kernel: iommu: Default domain type: Translated Aug 13 00:16:30.924823 kernel: iommu: DMA domain TLB invalidation policy: strict mode Aug 13 00:16:30.924832 kernel: efivars: Registered efivars operations Aug 13 00:16:30.924842 kernel: vgaarb: loaded Aug 13 00:16:30.924850 kernel: clocksource: Switched to clocksource arch_sys_counter Aug 13 00:16:30.924860 kernel: VFS: Disk quotas dquot_6.6.0 Aug 13 00:16:30.924869 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 13 00:16:30.924877 kernel: pnp: PnP ACPI init Aug 13 00:16:30.924958 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Aug 13 00:16:30.924971 kernel: pnp: PnP ACPI: found 1 devices Aug 13 00:16:30.924980 kernel: NET: Registered PF_INET protocol family Aug 13 00:16:30.924988 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Aug 13 00:16:30.924997 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Aug 13 00:16:30.925009 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 13 00:16:30.925017 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Aug 13 00:16:30.925026 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Aug 13 00:16:30.925034 kernel: TCP: Hash tables configured (established 32768 bind 32768) Aug 13 00:16:30.925043 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 13 00:16:30.925052 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 13 00:16:30.925060 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 13 00:16:30.925139 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Aug 13 00:16:30.925151 kernel: PCI: CLS 0 bytes, default 64 Aug 13 00:16:30.925161 kernel: kvm [1]: HYP mode not available Aug 13 00:16:30.925169 kernel: Initialise system trusted keyrings Aug 13 00:16:30.925176 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Aug 13 00:16:30.925185 kernel: Key type asymmetric registered Aug 13 00:16:30.925193 kernel: Asymmetric key parser 'x509' registered Aug 13 00:16:30.925201 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 00:16:30.925209 kernel: io scheduler mq-deadline registered Aug 13 00:16:30.925217 kernel: io scheduler kyber registered Aug 13 00:16:30.925224 kernel: io scheduler bfq registered Aug 13 00:16:30.925235 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Aug 13 00:16:30.925307 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Aug 13 00:16:30.925395 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Aug 13 00:16:30.926609 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.926715 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Aug 13 00:16:30.926788 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Aug 13 00:16:30.926866 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.926940 kernel: pcieport 
0000:00:02.2: PME: Signaling with IRQ 52 Aug 13 00:16:30.927005 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Aug 13 00:16:30.927078 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.927161 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Aug 13 00:16:30.927238 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Aug 13 00:16:30.927310 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.927430 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Aug 13 00:16:30.929537 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Aug 13 00:16:30.929613 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.929686 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Aug 13 00:16:30.929763 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Aug 13 00:16:30.929841 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.929910 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Aug 13 00:16:30.929976 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Aug 13 00:16:30.930043 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.930133 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Aug 13 00:16:30.930213 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Aug 13 00:16:30.930283 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.930294 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Aug 13 00:16:30.930368 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Aug 13 00:16:30.930546 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Aug 13 00:16:30.930617 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 13 00:16:30.930628 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Aug 13 00:16:30.930637 kernel: ACPI: button: Power Button [PWRB] Aug 13 00:16:30.930649 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Aug 13 00:16:30.930721 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Aug 13 00:16:30.930792 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Aug 13 00:16:30.930803 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 13 00:16:30.930811 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Aug 13 00:16:30.930875 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Aug 13 00:16:30.930891 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Aug 13 00:16:30.930901 kernel: thunder_xcv, ver 1.0 Aug 13 00:16:30.930909 kernel: thunder_bgx, ver 1.0 Aug 13 00:16:30.930920 kernel: nicpf, ver 1.0 Aug 13 00:16:30.930928 kernel: nicvf, ver 1.0 Aug 13 00:16:30.931025 kernel: rtc-efi rtc-efi.0: registered as rtc0 Aug 13 00:16:30.931104 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-08-13T00:16:30 UTC (1755044190) Aug 13 00:16:30.931117 kernel: hid: raw HID events driver (C) 
Jiri Kosina Aug 13 00:16:30.931125 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Aug 13 00:16:30.931133 kernel: watchdog: Delayed init of the lockup detector failed: -19 Aug 13 00:16:30.931141 kernel: watchdog: Hard watchdog permanently disabled Aug 13 00:16:30.931152 kernel: NET: Registered PF_INET6 protocol family Aug 13 00:16:30.931160 kernel: Segment Routing with IPv6 Aug 13 00:16:30.931168 kernel: In-situ OAM (IOAM) with IPv6 Aug 13 00:16:30.931176 kernel: NET: Registered PF_PACKET protocol family Aug 13 00:16:30.931183 kernel: Key type dns_resolver registered Aug 13 00:16:30.931191 kernel: registered taskstats version 1 Aug 13 00:16:30.931199 kernel: Loading compiled-in X.509 certificates Aug 13 00:16:30.931207 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.100-flatcar: 7263800c6d21650660e2b030c1023dce09b1e8b6' Aug 13 00:16:30.931215 kernel: Key type .fscrypt registered Aug 13 00:16:30.931224 kernel: Key type fscrypt-provisioning registered Aug 13 00:16:30.931232 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 13 00:16:30.931240 kernel: ima: Allocated hash algorithm: sha1 Aug 13 00:16:30.931248 kernel: ima: No architecture policies found Aug 13 00:16:30.931256 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Aug 13 00:16:30.931263 kernel: clk: Disabling unused clocks Aug 13 00:16:30.931271 kernel: Freeing unused kernel memory: 39424K Aug 13 00:16:30.931279 kernel: Run /init as init process Aug 13 00:16:30.931286 kernel: with arguments: Aug 13 00:16:30.931296 kernel: /init Aug 13 00:16:30.931303 kernel: with environment: Aug 13 00:16:30.931310 kernel: HOME=/ Aug 13 00:16:30.931319 kernel: TERM=linux Aug 13 00:16:30.931326 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 13 00:16:30.931336 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Aug 13 00:16:30.931346 systemd[1]: Detected virtualization kvm. Aug 13 00:16:30.931355 systemd[1]: Detected architecture arm64. Aug 13 00:16:30.931365 systemd[1]: Running in initrd. Aug 13 00:16:30.931387 systemd[1]: No hostname configured, using default hostname. Aug 13 00:16:30.931395 systemd[1]: Hostname set to . Aug 13 00:16:30.931404 systemd[1]: Initializing machine ID from VM UUID. Aug 13 00:16:30.931412 systemd[1]: Queued start job for default target initrd.target. Aug 13 00:16:30.931420 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:16:30.931429 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:16:30.931449 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 13 00:16:30.931464 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 13 00:16:30.931473 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 13 00:16:30.931481 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 13 00:16:30.931491 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... 
Aug 13 00:16:30.931504 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 13 00:16:30.931513 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 00:16:30.931524 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:16:30.931532 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:16:30.931540 systemd[1]: Reached target slices.target - Slice Units. Aug 13 00:16:30.931548 systemd[1]: Reached target swap.target - Swaps. Aug 13 00:16:30.931562 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:16:30.931570 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Aug 13 00:16:30.931578 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 13 00:16:30.931587 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 13 00:16:30.931595 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Aug 13 00:16:30.931605 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 13 00:16:30.931614 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 13 00:16:30.931623 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 00:16:30.931631 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:16:30.931639 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 13 00:16:30.931647 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 13 00:16:30.931656 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 13 00:16:30.931664 systemd[1]: Starting systemd-fsck-usr.service... Aug 13 00:16:30.931672 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 13 00:16:30.931682 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 13 00:16:30.931691 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:16:30.931703 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 13 00:16:30.931717 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 00:16:30.931725 systemd[1]: Finished systemd-fsck-usr.service. Aug 13 00:16:30.931737 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 13 00:16:30.931769 systemd-journald[237]: Collecting audit messages is disabled. Aug 13 00:16:30.931791 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:16:30.931802 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:16:30.931811 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 13 00:16:30.931820 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 13 00:16:30.931829 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 13 00:16:30.931837 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 13 00:16:30.931847 systemd-journald[237]: Journal started Aug 13 00:16:30.931869 systemd-journald[237]: Runtime Journal (/run/log/journal/232da3798e92480e8ded9a0c13f73371) is 8.0M, max 76.6M, 68.6M free. 
Aug 13 00:16:30.905863 systemd-modules-load[238]: Inserted module 'overlay' Aug 13 00:16:30.935805 systemd-modules-load[238]: Inserted module 'br_netfilter' Aug 13 00:16:30.936977 systemd[1]: Started systemd-journald.service - Journal Service. Aug 13 00:16:30.937003 kernel: Bridge firewalling registered Aug 13 00:16:30.937532 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 13 00:16:30.940850 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:16:30.947711 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 13 00:16:30.950813 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 13 00:16:30.958722 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 00:16:30.965491 dracut-cmdline[265]: dracut-dracut-053 Aug 13 00:16:30.970211 dracut-cmdline[265]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=2f9df6e9e6c671c457040a64675390bbff42294b08c628cd2dc472ed8120146a Aug 13 00:16:30.975856 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 13 00:16:30.977610 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:16:30.988020 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:16:31.017161 systemd-resolved[289]: Positive Trust Anchors: Aug 13 00:16:31.017177 systemd-resolved[289]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:16:31.017208 systemd-resolved[289]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:16:31.027258 systemd-resolved[289]: Defaulting to hostname 'linux'. Aug 13 00:16:31.028555 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:16:31.030624 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:16:31.067541 kernel: SCSI subsystem initialized Aug 13 00:16:31.072479 kernel: Loading iSCSI transport class v2.0-870. Aug 13 00:16:31.080499 kernel: iscsi: registered transport (tcp) Aug 13 00:16:31.094479 kernel: iscsi: registered transport (qla4xxx) Aug 13 00:16:31.094541 kernel: QLogic iSCSI HBA Driver Aug 13 00:16:31.154566 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 13 00:16:31.162699 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 13 00:16:31.183492 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Aug 13 00:16:31.183564 kernel: device-mapper: uevent: version 1.0.3 Aug 13 00:16:31.184498 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Aug 13 00:16:31.239498 kernel: raid6: neonx8 gen() 15228 MB/s Aug 13 00:16:31.253526 kernel: raid6: neonx4 gen() 14576 MB/s Aug 13 00:16:31.270507 kernel: raid6: neonx2 gen() 11586 MB/s Aug 13 00:16:31.287516 kernel: raid6: neonx1 gen() 10174 MB/s Aug 13 00:16:31.304506 kernel: raid6: int64x8 gen() 6637 MB/s Aug 13 00:16:31.321497 kernel: raid6: int64x4 gen() 7174 MB/s Aug 13 00:16:31.338479 kernel: raid6: int64x2 gen() 6011 MB/s Aug 13 00:16:31.355500 kernel: raid6: int64x1 gen() 4957 MB/s Aug 13 00:16:31.355580 kernel: raid6: using algorithm neonx8 gen() 15228 MB/s Aug 13 00:16:31.372498 kernel: raid6: .... xor() 11657 MB/s, rmw enabled Aug 13 00:16:31.372577 kernel: raid6: using neon recovery algorithm Aug 13 00:16:31.377482 kernel: xor: measuring software checksum speed Aug 13 00:16:31.377558 kernel: 8regs : 19759 MB/sec Aug 13 00:16:31.378736 kernel: 32regs : 11063 MB/sec Aug 13 00:16:31.378778 kernel: arm64_neon : 27052 MB/sec Aug 13 00:16:31.378812 kernel: xor: using function: arm64_neon (27052 MB/sec) Aug 13 00:16:31.430627 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 13 00:16:31.445916 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 13 00:16:31.454726 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:16:31.469241 systemd-udevd[456]: Using default interface naming scheme 'v255'. Aug 13 00:16:31.473892 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:16:31.482639 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 13 00:16:31.497896 dracut-pre-trigger[459]: rd.md=0: removing MD RAID activation Aug 13 00:16:31.538732 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 13 00:16:31.546688 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 13 00:16:31.605099 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 00:16:31.612749 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 13 00:16:31.633741 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 13 00:16:31.635024 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 13 00:16:31.638411 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 00:16:31.639333 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 13 00:16:31.644635 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 13 00:16:31.669771 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 13 00:16:31.717120 kernel: scsi host0: Virtio SCSI HBA Aug 13 00:16:31.729478 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Aug 13 00:16:31.729580 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Aug 13 00:16:31.739226 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 13 00:16:31.739419 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:16:31.743827 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:16:31.747404 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Aug 13 00:16:31.753296 kernel: ACPI: bus type USB registered Aug 13 00:16:31.753322 kernel: usbcore: registered new interface driver usbfs Aug 13 00:16:31.753334 kernel: usbcore: registered new interface driver hub Aug 13 00:16:31.753346 kernel: usbcore: registered new device driver usb Aug 13 00:16:31.747956 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:16:31.750119 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:16:31.757762 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:16:31.787286 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:16:31.794751 kernel: sr 0:0:0:0: Power-on or device reset occurred Aug 13 00:16:31.794989 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Aug 13 00:16:31.797068 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 13 00:16:31.797123 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:16:31.797299 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Aug 13 00:16:31.798473 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Aug 13 00:16:31.798836 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:16:31.802808 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Aug 13 00:16:31.802985 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:16:31.803072 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Aug 13 00:16:31.803156 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Aug 13 00:16:31.803241 kernel: hub 1-0:1.0: USB hub found Aug 13 00:16:31.804542 kernel: hub 1-0:1.0: 4 ports detected Aug 13 00:16:31.804707 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Aug 13 00:16:31.809631 kernel: hub 2-0:1.0: USB hub found Aug 13 00:16:31.809832 kernel: hub 2-0:1.0: 4 ports detected Aug 13 00:16:31.813001 kernel: sd 0:0:0:1: Power-on or device reset occurred Aug 13 00:16:31.813209 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Aug 13 00:16:31.813296 kernel: sd 0:0:0:1: [sda] Write Protect is off Aug 13 00:16:31.813498 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Aug 13 00:16:31.813610 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 00:16:31.816659 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 13 00:16:31.816699 kernel: GPT:17805311 != 80003071 Aug 13 00:16:31.816709 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 13 00:16:31.817493 kernel: GPT:17805311 != 80003071 Aug 13 00:16:31.817528 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 13 00:16:31.818466 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:16:31.820514 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Aug 13 00:16:31.830375 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:16:31.875461 kernel: BTRFS: device fsid 03408483-5051-409a-aab4-4e6d5027e982 devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (526) Aug 13 00:16:31.875515 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (508) Aug 13 00:16:31.878934 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. 
Aug 13 00:16:31.887406 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Aug 13 00:16:31.900813 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Aug 13 00:16:31.902286 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Aug 13 00:16:31.908971 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Aug 13 00:16:31.917692 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 13 00:16:31.926639 disk-uuid[575]: Primary Header is updated.
Aug 13 00:16:31.926639 disk-uuid[575]: Secondary Entries is updated.
Aug 13 00:16:31.926639 disk-uuid[575]: Secondary Header is updated.
Aug 13 00:16:31.933477 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:16:31.941499 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:16:31.949511 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:16:32.046476 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Aug 13 00:16:32.180501 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Aug 13 00:16:32.180591 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Aug 13 00:16:32.180853 kernel: usbcore: registered new interface driver usbhid
Aug 13 00:16:32.181486 kernel: usbhid: USB HID core driver
Aug 13 00:16:32.287491 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Aug 13 00:16:32.417509 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Aug 13 00:16:32.470504 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Aug 13 00:16:32.950007 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 13 00:16:32.950081 disk-uuid[576]: The operation has completed successfully.
Aug 13 00:16:33.016250 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 13 00:16:33.016366 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 13 00:16:33.035796 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 13 00:16:33.040505 sh[594]: Success
Aug 13 00:16:33.052469 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Aug 13 00:16:33.112674 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 13 00:16:33.121600 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 13 00:16:33.125583 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 13 00:16:33.144998 kernel: BTRFS info (device dm-0): first mount of filesystem 03408483-5051-409a-aab4-4e6d5027e982
Aug 13 00:16:33.145079 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Aug 13 00:16:33.145102 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Aug 13 00:16:33.145782 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Aug 13 00:16:33.146749 kernel: BTRFS info (device dm-0): using free space tree
Aug 13 00:16:33.156508 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Aug 13 00:16:33.158988 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 13 00:16:33.160702 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 13 00:16:33.166744 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 13 00:16:33.172718 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 13 00:16:33.183172 kernel: BTRFS info (device sda6): first mount of filesystem dbce4b09-c4b8-4cc9-bd11-416717f60c7d
Aug 13 00:16:33.183237 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 13 00:16:33.183875 kernel: BTRFS info (device sda6): using free space tree
Aug 13 00:16:33.190462 kernel: BTRFS info (device sda6): enabling ssd optimizations
Aug 13 00:16:33.190524 kernel: BTRFS info (device sda6): auto enabling async discard
Aug 13 00:16:33.203159 systemd[1]: mnt-oem.mount: Deactivated successfully.
Aug 13 00:16:33.204480 kernel: BTRFS info (device sda6): last unmount of filesystem dbce4b09-c4b8-4cc9-bd11-416717f60c7d
Aug 13 00:16:33.215842 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 13 00:16:33.221858 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 13 00:16:33.335338 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:16:33.343800 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 13 00:16:33.345190 ignition[676]: Ignition 2.19.0
Aug 13 00:16:33.345198 ignition[676]: Stage: fetch-offline
Aug 13 00:16:33.345237 ignition[676]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:33.345246 ignition[676]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:33.345429 ignition[676]: parsed url from cmdline: ""
Aug 13 00:16:33.345432 ignition[676]: no config URL provided
Aug 13 00:16:33.345450 ignition[676]: reading system config file "/usr/lib/ignition/user.ign"
Aug 13 00:16:33.349618 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:16:33.345458 ignition[676]: no config at "/usr/lib/ignition/user.ign"
Aug 13 00:16:33.345469 ignition[676]: failed to fetch config: resource requires networking
Aug 13 00:16:33.345688 ignition[676]: Ignition finished successfully
Aug 13 00:16:33.369360 systemd-networkd[783]: lo: Link UP
Aug 13 00:16:33.369375 systemd-networkd[783]: lo: Gained carrier
Aug 13 00:16:33.371043 systemd-networkd[783]: Enumeration completed
Aug 13 00:16:33.371194 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 13 00:16:33.371690 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:16:33.371694 systemd-networkd[783]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 13 00:16:33.373071 systemd[1]: Reached target network.target - Network.
Aug 13 00:16:33.373762 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:16:33.373765 systemd-networkd[783]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 13 00:16:33.374333 systemd-networkd[783]: eth0: Link UP
Aug 13 00:16:33.374337 systemd-networkd[783]: eth0: Gained carrier
Aug 13 00:16:33.374359 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:16:33.377959 systemd-networkd[783]: eth1: Link UP
Aug 13 00:16:33.377963 systemd-networkd[783]: eth1: Gained carrier
Aug 13 00:16:33.377974 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:16:33.379780 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 13 00:16:33.399743 ignition[786]: Ignition 2.19.0
Aug 13 00:16:33.399757 ignition[786]: Stage: fetch
Aug 13 00:16:33.400039 ignition[786]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:33.400054 ignition[786]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:33.400178 ignition[786]: parsed url from cmdline: ""
Aug 13 00:16:33.400183 ignition[786]: no config URL provided
Aug 13 00:16:33.400189 ignition[786]: reading system config file "/usr/lib/ignition/user.ign"
Aug 13 00:16:33.400200 ignition[786]: no config at "/usr/lib/ignition/user.ign"
Aug 13 00:16:33.400225 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Aug 13 00:16:33.402601 ignition[786]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Aug 13 00:16:33.406536 systemd-networkd[783]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Aug 13 00:16:33.440555 systemd-networkd[783]: eth0: DHCPv4 address 91.99.89.242/32, gateway 172.31.1.1 acquired from 172.31.1.1
Aug 13 00:16:33.602897 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Aug 13 00:16:33.615622 ignition[786]: GET result: OK
Aug 13 00:16:33.615745 ignition[786]: parsing config with SHA512: 3e513969f4c13a528c137f0e8399d494db0fbbc227b4ff2b4e403780f7bf08d1b85a3df6383b90755da636816dc219828b7c38d0f4c8f9de08cdeab2ca7b7bd8
Aug 13 00:16:33.623890 unknown[786]: fetched base config from "system"
Aug 13 00:16:33.623902 unknown[786]: fetched base config from "system"
Aug 13 00:16:33.624639 ignition[786]: fetch: fetch complete
Aug 13 00:16:33.623907 unknown[786]: fetched user config from "hetzner"
Aug 13 00:16:33.624645 ignition[786]: fetch: fetch passed
Aug 13 00:16:33.624713 ignition[786]: Ignition finished successfully
Aug 13 00:16:33.630267 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 13 00:16:33.637807 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 13 00:16:33.655492 ignition[793]: Ignition 2.19.0
Aug 13 00:16:33.655504 ignition[793]: Stage: kargs
Aug 13 00:16:33.655704 ignition[793]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:33.655713 ignition[793]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:33.658706 ignition[793]: kargs: kargs passed
Aug 13 00:16:33.658787 ignition[793]: Ignition finished successfully
Aug 13 00:16:33.661426 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 13 00:16:33.667824 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 13 00:16:33.690271 ignition[800]: Ignition 2.19.0
Aug 13 00:16:33.691072 ignition[800]: Stage: disks
Aug 13 00:16:33.691398 ignition[800]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:33.691411 ignition[800]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:33.694541 ignition[800]: disks: disks passed
Aug 13 00:16:33.694621 ignition[800]: Ignition finished successfully
Aug 13 00:16:33.696749 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 13 00:16:33.700028 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 13 00:16:33.701239 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 13 00:16:33.703932 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:16:33.705282 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 00:16:33.706174 systemd[1]: Reached target basic.target - Basic System.
Aug 13 00:16:33.717047 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 13 00:16:33.740275 systemd-fsck[808]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Aug 13 00:16:33.745165 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 13 00:16:33.750810 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 13 00:16:33.810744 kernel: EXT4-fs (sda9): mounted filesystem 128aec8b-f05d-48ed-8996-c9e8b21a7810 r/w with ordered data mode. Quota mode: none.
Aug 13 00:16:33.811836 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 13 00:16:33.813162 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:16:33.830799 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:16:33.835609 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 13 00:16:33.839767 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Aug 13 00:16:33.841557 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 13 00:16:33.843024 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:16:33.849066 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 13 00:16:33.855871 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 13 00:16:33.862129 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (816)
Aug 13 00:16:33.862154 kernel: BTRFS info (device sda6): first mount of filesystem dbce4b09-c4b8-4cc9-bd11-416717f60c7d
Aug 13 00:16:33.862165 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 13 00:16:33.862174 kernel: BTRFS info (device sda6): using free space tree
Aug 13 00:16:33.869228 kernel: BTRFS info (device sda6): enabling ssd optimizations
Aug 13 00:16:33.869299 kernel: BTRFS info (device sda6): auto enabling async discard
Aug 13 00:16:33.875739 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:16:33.920969 coreos-metadata[818]: Aug 13 00:16:33.920 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Aug 13 00:16:33.922786 coreos-metadata[818]: Aug 13 00:16:33.922 INFO Fetch successful
Aug 13 00:16:33.924103 coreos-metadata[818]: Aug 13 00:16:33.923 INFO wrote hostname ci-4081-3-5-8-f2ca23fedd to /sysroot/etc/hostname
Aug 13 00:16:33.925408 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 00:16:33.927391 initrd-setup-root[843]: cut: /sysroot/etc/passwd: No such file or directory
Aug 13 00:16:33.931536 initrd-setup-root[851]: cut: /sysroot/etc/group: No such file or directory
Aug 13 00:16:33.936384 initrd-setup-root[858]: cut: /sysroot/etc/shadow: No such file or directory
Aug 13 00:16:33.941112 initrd-setup-root[865]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 13 00:16:34.052240 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 13 00:16:34.062751 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 13 00:16:34.067816 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 13 00:16:34.078498 kernel: BTRFS info (device sda6): last unmount of filesystem dbce4b09-c4b8-4cc9-bd11-416717f60c7d
Aug 13 00:16:34.107895 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 13 00:16:34.109058 ignition[933]: INFO : Ignition 2.19.0
Aug 13 00:16:34.109058 ignition[933]: INFO : Stage: mount
Aug 13 00:16:34.109058 ignition[933]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:34.109058 ignition[933]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:34.113273 ignition[933]: INFO : mount: mount passed
Aug 13 00:16:34.113273 ignition[933]: INFO : Ignition finished successfully
Aug 13 00:16:34.111713 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 13 00:16:34.118558 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 13 00:16:34.145696 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 13 00:16:34.157762 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:16:34.167696 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944)
Aug 13 00:16:34.169673 kernel: BTRFS info (device sda6): first mount of filesystem dbce4b09-c4b8-4cc9-bd11-416717f60c7d
Aug 13 00:16:34.169792 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 13 00:16:34.169880 kernel: BTRFS info (device sda6): using free space tree
Aug 13 00:16:34.173476 kernel: BTRFS info (device sda6): enabling ssd optimizations
Aug 13 00:16:34.173542 kernel: BTRFS info (device sda6): auto enabling async discard
Aug 13 00:16:34.176962 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:16:34.202953 ignition[961]: INFO : Ignition 2.19.0
Aug 13 00:16:34.202953 ignition[961]: INFO : Stage: files
Aug 13 00:16:34.204547 ignition[961]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:34.204547 ignition[961]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:34.206416 ignition[961]: DEBUG : files: compiled without relabeling support, skipping
Aug 13 00:16:34.206416 ignition[961]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 13 00:16:34.206416 ignition[961]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 13 00:16:34.210570 ignition[961]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 13 00:16:34.212109 ignition[961]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 13 00:16:34.212109 ignition[961]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 13 00:16:34.211039 unknown[961]: wrote ssh authorized keys file for user: core
Aug 13 00:16:34.215477 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 13 00:16:34.215477 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Aug 13 00:16:34.327049 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 13 00:16:34.671688 systemd-networkd[783]: eth0: Gained IPv6LL
Aug 13 00:16:35.375761 systemd-networkd[783]: eth1: Gained IPv6LL
Aug 13 00:16:37.544036 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 13 00:16:37.546202 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Aug 13 00:16:37.819416 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 13 00:16:38.066826 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 13 00:16:38.066826 ignition[961]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:16:38.070895 ignition[961]: INFO : files: files passed
Aug 13 00:16:38.070895 ignition[961]: INFO : Ignition finished successfully
Aug 13 00:16:38.072146 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 13 00:16:38.082476 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 13 00:16:38.087751 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 13 00:16:38.104226 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 13 00:16:38.104422 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 13 00:16:38.123048 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:16:38.123048 initrd-setup-root-after-ignition[989]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:16:38.125928 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:16:38.127200 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:16:38.129536 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 13 00:16:38.135667 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 13 00:16:38.173668 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 13 00:16:38.173866 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 13 00:16:38.178509 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 13 00:16:38.181080 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 13 00:16:38.182157 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 13 00:16:38.185711 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 13 00:16:38.215801 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:16:38.220663 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 13 00:16:38.234876 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:16:38.235769 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:16:38.237231 systemd[1]: Stopped target timers.target - Timer Units.
Aug 13 00:16:38.238497 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 13 00:16:38.238626 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:16:38.240231 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 13 00:16:38.240966 systemd[1]: Stopped target basic.target - Basic System.
Aug 13 00:16:38.242937 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 13 00:16:38.244730 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:16:38.246096 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 13 00:16:38.247168 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 13 00:16:38.248300 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 00:16:38.249484 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 13 00:16:38.250542 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 13 00:16:38.251714 systemd[1]: Stopped target swap.target - Swaps.
Aug 13 00:16:38.252731 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 13 00:16:38.252856 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 00:16:38.254155 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:16:38.254858 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:16:38.256000 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 13 00:16:38.256429 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:16:38.257089 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 13 00:16:38.257205 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 13 00:16:38.258985 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 13 00:16:38.259118 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:16:38.260367 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 13 00:16:38.260501 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 13 00:16:38.262714 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 13 00:16:38.262935 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 00:16:38.273827 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 13 00:16:38.274891 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 13 00:16:38.278391 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:16:38.280653 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 13 00:16:38.282615 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 13 00:16:38.283098 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:16:38.287636 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 13 00:16:38.287798 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 00:16:38.294549 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 13 00:16:38.294656 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 13 00:16:38.301068 ignition[1013]: INFO : Ignition 2.19.0
Aug 13 00:16:38.301068 ignition[1013]: INFO : Stage: umount
Aug 13 00:16:38.302559 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:16:38.302559 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:16:38.302559 ignition[1013]: INFO : umount: umount passed
Aug 13 00:16:38.302559 ignition[1013]: INFO : Ignition finished successfully
Aug 13 00:16:38.309219 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 13 00:16:38.310887 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 13 00:16:38.310992 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 13 00:16:38.312169 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 13 00:16:38.312273 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 13 00:16:38.314260 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 13 00:16:38.315555 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 13 00:16:38.316535 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 13 00:16:38.316594 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 13 00:16:38.317514 systemd[1]: ignition-fetch.service: Deactivated successfully.
Aug 13 00:16:38.317560 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Aug 13 00:16:38.318578 systemd[1]: Stopped target network.target - Network.
Aug 13 00:16:38.319408 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 13 00:16:38.319488 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:16:38.320500 systemd[1]: Stopped target paths.target - Path Units.
Aug 13 00:16:38.321380 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 13 00:16:38.323478 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:16:38.324561 systemd[1]: Stopped target slices.target - Slice Units.
Aug 13 00:16:38.325308 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 13 00:16:38.326406 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 13 00:16:38.326461 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 00:16:38.327903 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 13 00:16:38.327949 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 00:16:38.328907 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 13 00:16:38.328978 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 13 00:16:38.330020 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 13 00:16:38.330069 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 13 00:16:38.331222 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 13 00:16:38.331269 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 13 00:16:38.332496 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 13 00:16:38.334211 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 13 00:16:38.337581 systemd-networkd[783]: eth0: DHCPv6 lease lost
Aug 13 00:16:38.341855 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 13 00:16:38.342008 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 13 00:16:38.343279 systemd-networkd[783]: eth1: DHCPv6 lease lost
Aug 13 00:16:38.346946 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 13 00:16:38.347757 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 13 00:16:38.349058 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 13 00:16:38.349119 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:16:38.354674 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 13 00:16:38.355202 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 13 00:16:38.355310 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:16:38.359262 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 13 00:16:38.359341 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:16:38.360000 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 13 00:16:38.360039 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:16:38.361199 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 13 00:16:38.361252 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:16:38.365482 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:16:38.375764 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 13 00:16:38.375973 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 13 00:16:38.381519 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 13 00:16:38.381749 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:16:38.385310 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 13 00:16:38.385371 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:16:38.386383 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 13 00:16:38.386434 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:16:38.387720 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 13 00:16:38.387784 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 00:16:38.389753 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 13 00:16:38.389814 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 13 00:16:38.391278 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 13 00:16:38.391362 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 00:16:38.404956 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 13 00:16:38.408216 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 13 00:16:38.408680 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:16:38.411555 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 00:16:38.411661 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:16:38.421873 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 13 00:16:38.422074 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 13 00:16:38.423904 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 13 00:16:38.430913 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 13 00:16:38.443056 systemd[1]: Switching root.
Aug 13 00:16:38.484814 systemd-journald[237]: Journal stopped
Aug 13 00:16:39.478529 systemd-journald[237]: Received SIGTERM from PID 1 (systemd).
Aug 13 00:16:39.478607 kernel: SELinux: policy capability network_peer_controls=1
Aug 13 00:16:39.478622 kernel: SELinux: policy capability open_perms=1
Aug 13 00:16:39.478633 kernel: SELinux: policy capability extended_socket_class=1
Aug 13 00:16:39.478644 kernel: SELinux: policy capability always_check_network=0
Aug 13 00:16:39.478656 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 13 00:16:39.478666 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 13 00:16:39.478676 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 13 00:16:39.478690 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 13 00:16:39.478701 kernel: audit: type=1403 audit(1755044198.644:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 13 00:16:39.478714 systemd[1]: Successfully loaded SELinux policy in 38.425ms.
Aug 13 00:16:39.478742 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.294ms.
Aug 13 00:16:39.478756 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Aug 13 00:16:39.478769 systemd[1]: Detected virtualization kvm.
Aug 13 00:16:39.478782 systemd[1]: Detected architecture arm64.
Aug 13 00:16:39.478803 systemd[1]: Detected first boot.
Aug 13 00:16:39.478821 systemd[1]: Hostname set to <ci-4081-3-5-8-f2ca23fedd>.
Aug 13 00:16:39.478833 systemd[1]: Initializing machine ID from VM UUID.
Aug 13 00:16:39.478847 zram_generator::config[1055]: No configuration found.
Aug 13 00:16:39.478860 systemd[1]: Populated /etc with preset unit settings.
Aug 13 00:16:39.478872 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 13 00:16:39.478885 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 13 00:16:39.478897 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 13 00:16:39.478909 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 13 00:16:39.478921 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 13 00:16:39.478938 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 13 00:16:39.478950 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 13 00:16:39.478961 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 13 00:16:39.478973 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 13 00:16:39.478984 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 13 00:16:39.478999 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 13 00:16:39.479010 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:16:39.479021 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:16:39.479031 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 13 00:16:39.479046 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 00:16:39.479063 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 00:16:39.479073 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:16:39.479084 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Aug 13 00:16:39.479094 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:16:39.479105 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 00:16:39.479116 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 00:16:39.479128 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:16:39.479139 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 00:16:39.479149 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:16:39.479160 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:16:39.479171 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:16:39.479181 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:16:39.479197 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 00:16:39.479208 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 00:16:39.479220 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:16:39.479231 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:16:39.479242 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:16:39.479252 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 00:16:39.479263 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 00:16:39.479281 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 00:16:39.479295 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 00:16:39.479307 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 00:16:39.479319 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 00:16:39.479335 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 00:16:39.479348 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 00:16:39.479364 systemd[1]: Reached target machines.target - Containers.
Aug 13 00:16:39.479375 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 00:16:39.479386 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:16:39.479397 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:16:39.479410 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 00:16:39.479424 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:16:39.479453 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:16:39.479466 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:16:39.479480 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 00:16:39.479491 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:16:39.479503 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 00:16:39.479517 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 00:16:39.479529 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 00:16:39.479539 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 00:16:39.479551 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 00:16:39.479561 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:16:39.479572 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:16:39.479583 kernel: fuse: init (API version 7.39)
Aug 13 00:16:39.479594 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:16:39.479605 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 00:16:39.479617 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:16:39.479628 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 00:16:39.479639 systemd[1]: Stopped verity-setup.service.
Aug 13 00:16:39.479650 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 00:16:39.479670 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 00:16:39.479687 kernel: ACPI: bus type drm_connector registered
Aug 13 00:16:39.479698 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 00:16:39.479709 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 00:16:39.479720 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 00:16:39.479732 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 00:16:39.479744 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:16:39.479756 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 00:16:39.479767 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 00:16:39.479779 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:16:39.479791 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:16:39.479802 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:16:39.479813 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:16:39.479824 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:16:39.479834 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:16:39.479848 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 00:16:39.479859 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 00:16:39.479870 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 00:16:39.479880 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 00:16:39.479891 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 00:16:39.479901 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 00:16:39.479912 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:16:39.479924 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Aug 13 00:16:39.479934 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 13 00:16:39.479949 kernel: loop: module loaded
Aug 13 00:16:39.479962 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 00:16:39.479975 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:16:39.479986 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 00:16:39.479997 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:16:39.480041 systemd-journald[1118]: Collecting audit messages is disabled.
Aug 13 00:16:39.480067 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 00:16:39.480084 systemd-journald[1118]: Journal started
Aug 13 00:16:39.480107 systemd-journald[1118]: Runtime Journal (/run/log/journal/232da3798e92480e8ded9a0c13f73371) is 8.0M, max 76.6M, 68.6M free.
Aug 13 00:16:39.171966 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 00:16:39.191108 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 13 00:16:39.191843 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 00:16:39.491361 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 00:16:39.496658 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:16:39.499946 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:16:39.500117 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:16:39.501654 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:16:39.503462 kernel: loop0: detected capacity change from 0 to 114432
Aug 13 00:16:39.504040 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:16:39.505253 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 00:16:39.507392 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 00:16:39.510083 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 13 00:16:39.523884 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 00:16:39.538480 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 00:16:39.546468 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:16:39.558573 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 00:16:39.565118 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 00:16:39.566363 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:16:39.576787 kernel: loop1: detected capacity change from 0 to 8
Aug 13 00:16:39.578703 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 00:16:39.589701 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Aug 13 00:16:39.592601 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:16:39.596525 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 00:16:39.599814 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 00:16:39.606843 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Aug 13 00:16:39.615558 kernel: loop2: detected capacity change from 0 to 211168
Aug 13 00:16:39.615732 systemd-journald[1118]: Time spent on flushing to /var/log/journal/232da3798e92480e8ded9a0c13f73371 is 81.498ms for 1133 entries.
Aug 13 00:16:39.615732 systemd-journald[1118]: System Journal (/var/log/journal/232da3798e92480e8ded9a0c13f73371) is 8.0M, max 584.8M, 576.8M free.
Aug 13 00:16:39.713642 systemd-journald[1118]: Received client request to flush runtime journal.
Aug 13 00:16:39.713700 kernel: loop3: detected capacity change from 0 to 114328
Aug 13 00:16:39.676726 udevadm[1180]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Aug 13 00:16:39.689465 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 13 00:16:39.693598 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Aug 13 00:16:39.707038 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:16:39.725878 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 00:16:39.737604 kernel: loop4: detected capacity change from 0 to 114432
Aug 13 00:16:39.751492 kernel: loop5: detected capacity change from 0 to 8
Aug 13 00:16:39.751861 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 00:16:39.757805 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:16:39.758482 kernel: loop6: detected capacity change from 0 to 211168
Aug 13 00:16:39.779487 kernel: loop7: detected capacity change from 0 to 114328
Aug 13 00:16:39.795996 (sd-merge)[1190]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Aug 13 00:16:39.797610 (sd-merge)[1190]: Merged extensions into '/usr'.
Aug 13 00:16:39.805541 systemd[1]: Reloading requested from client PID 1143 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 00:16:39.805561 systemd[1]: Reloading...
Aug 13 00:16:39.835347 systemd-tmpfiles[1192]: ACLs are not supported, ignoring.
Aug 13 00:16:39.836511 systemd-tmpfiles[1192]: ACLs are not supported, ignoring.
Aug 13 00:16:39.936173 zram_generator::config[1223]: No configuration found.
Aug 13 00:16:40.113108 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:16:40.143375 ldconfig[1135]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 13 00:16:40.178858 systemd[1]: Reloading finished in 372 ms.
Aug 13 00:16:40.214194 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 13 00:16:40.215332 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:16:40.216866 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 13 00:16:40.231096 systemd[1]: Starting ensure-sysext.service...
Aug 13 00:16:40.236911 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 00:16:40.251518 systemd[1]: Reloading requested from client PID 1258 ('systemctl') (unit ensure-sysext.service)...
Aug 13 00:16:40.251549 systemd[1]: Reloading...
Aug 13 00:16:40.283037 systemd-tmpfiles[1259]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 13 00:16:40.283477 systemd-tmpfiles[1259]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 13 00:16:40.287352 systemd-tmpfiles[1259]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 13 00:16:40.287930 systemd-tmpfiles[1259]: ACLs are not supported, ignoring.
Aug 13 00:16:40.288167 systemd-tmpfiles[1259]: ACLs are not supported, ignoring.
Aug 13 00:16:40.291937 systemd-tmpfiles[1259]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:16:40.292049 systemd-tmpfiles[1259]: Skipping /boot
Aug 13 00:16:40.304898 systemd-tmpfiles[1259]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:16:40.305044 systemd-tmpfiles[1259]: Skipping /boot
Aug 13 00:16:40.358471 zram_generator::config[1289]: No configuration found.
Aug 13 00:16:40.458103 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:16:40.504559 systemd[1]: Reloading finished in 252 ms.
Aug 13 00:16:40.530366 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 13 00:16:40.532119 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:16:40.547278 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Aug 13 00:16:40.553787 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 13 00:16:40.558799 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 13 00:16:40.566793 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 13 00:16:40.573966 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:16:40.584822 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 13 00:16:40.589722 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:16:40.593169 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:16:40.605816 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:16:40.612598 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:16:40.613387 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:16:40.616700 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 13 00:16:40.619496 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 13 00:16:40.632755 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 13 00:16:40.637249 systemd-udevd[1333]: Using default interface naming scheme 'v255'.
Aug 13 00:16:40.638305 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:16:40.638672 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:16:40.642185 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 13 00:16:40.645137 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:16:40.646518 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:16:40.654570 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:16:40.654789 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:16:40.654886 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:16:40.661316 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:16:40.661623 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:16:40.667175 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:16:40.668632 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:16:40.671672 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:16:40.675426 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:16:40.677683 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:16:40.677932 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:16:40.684930 systemd[1]: Finished ensure-sysext.service.
Aug 13 00:16:40.689862 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 13 00:16:40.707189 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:16:40.714652 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 13 00:16:40.715798 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 13 00:16:40.737879 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 13 00:16:40.738073 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 13 00:16:40.738488 augenrules[1385]: No rules Aug 13 00:16:40.741597 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Aug 13 00:16:40.745399 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 13 00:16:40.750049 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 13 00:16:40.750712 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 13 00:16:40.760571 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 13 00:16:40.760766 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 13 00:16:40.769885 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 13 00:16:40.769963 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 13 00:16:40.795934 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 13 00:16:40.800739 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Aug 13 00:16:40.930723 systemd-networkd[1377]: lo: Link UP Aug 13 00:16:40.930736 systemd-networkd[1377]: lo: Gained carrier Aug 13 00:16:40.932389 systemd-networkd[1377]: Enumeration completed Aug 13 00:16:40.932887 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:16:40.941307 systemd-networkd[1377]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:16:40.941318 systemd-networkd[1377]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:16:40.944735 systemd-networkd[1377]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:16:40.944747 systemd-networkd[1377]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:16:40.945359 systemd-networkd[1377]: eth0: Link UP Aug 13 00:16:40.945370 systemd-networkd[1377]: eth0: Gained carrier Aug 13 00:16:40.945390 systemd-networkd[1377]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:16:40.945621 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 13 00:16:40.952036 systemd-resolved[1329]: Positive Trust Anchors: Aug 13 00:16:40.954200 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 13 00:16:40.955172 systemd[1]: Reached target time-set.target - System Time Set. 
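Note: systemd-networkd above matches eth0 against /usr/lib/systemd/network/zz-default.network and warns that matching on interface names can be unpredictable (kernel names depend on probe order). The core of the step is glob matching of Name= patterns from [Match] sections, in lexical file order, which is why the catch-all is named zz-*: it sorts last and only wins when nothing more specific matched first. A simplified sketch of that matcher, ignoring drop-ins and all non-Name match keys:

import fnmatch, glob, os

def name_globs(path):
    # Collect Name= globs from the [Match] section of a .network file.
    globs, in_match = [], False
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if line.startswith("["):
                in_match = line == "[Match]"
            elif in_match and line.startswith("Name="):
                globs += line.split("=", 1)[1].split()
    return globs

def first_match(ifname, network_dir="/usr/lib/systemd/network"):
    for path in sorted(glob.glob(os.path.join(network_dir, "*.network"))):
        patterns = name_globs(path)
        # Simplification: a file with no Name= key is treated as matching.
        if not patterns or any(fnmatch.fnmatch(ifname, p) for p in patterns):
            return path
    return None

print(first_match("eth0"))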
Aug 13 00:16:40.955979 systemd-networkd[1377]: eth1: Link UP Aug 13 00:16:40.955982 systemd-networkd[1377]: eth1: Gained carrier Aug 13 00:16:40.956000 systemd-networkd[1377]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:16:40.956692 systemd-resolved[1329]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:16:40.956731 systemd-resolved[1329]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:16:40.963738 systemd-resolved[1329]: Using system hostname 'ci-4081-3-5-8-f2ca23fedd'. Aug 13 00:16:40.967219 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:16:40.968312 systemd[1]: Reached target network.target - Network. Aug 13 00:16:40.969279 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:16:40.984463 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1356) Aug 13 00:16:40.987825 systemd-networkd[1377]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 13 00:16:40.988519 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Aug 13 00:16:41.009531 systemd-networkd[1377]: eth0: DHCPv4 address 91.99.89.242/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 13 00:16:41.010415 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Aug 13 00:16:41.010517 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Aug 13 00:16:41.014962 systemd-networkd[1377]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:16:41.029483 kernel: mousedev: PS/2 mouse device common for all mice Aug 13 00:16:41.041018 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Aug 13 00:16:41.041151 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 13 00:16:41.049682 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 13 00:16:41.053679 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 13 00:16:41.058666 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 13 00:16:41.060366 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 13 00:16:41.060411 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 13 00:16:41.065731 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 13 00:16:41.066028 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
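Note: systemd-resolved above installs the root-zone DNSSEC trust anchor (the well-known DS record with key tag 20326) plus a set of negative trust anchors: domains such as home.arpa, the RFC 1918 reverse zones, and single-label names like lan, under which DNSSEC validation is deliberately skipped. The coverage test is a suffix match on DNS labels; a small sketch using a subset of the anchor list from the log:

NEGATIVE_ANCHORS = {
    # Subset of the negative trust anchors logged above.
    "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
    "d.f.ip6.arpa", "ipv4only.arpa", "resolver.arpa",
    "corp", "home", "internal", "intranet", "lan", "local",
    "private", "test",
}

def under_negative_anchor(name):
    # Covered if the name equals an anchor or is a subdomain of one.
    labels = name.rstrip(".").lower().split(".")
    return any(".".join(labels[i:]) in NEGATIVE_ANCHORS
               for i in range(len(labels)))

assert under_negative_anchor("printer.lan")
assert under_negative_anchor("5.0.0.10.in-addr.arpa")
assert not under_negative_anchor("example.org")
print("negative trust anchor checks pass")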
Aug 13 00:16:41.073695 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 13 00:16:41.074297 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 13 00:16:41.078036 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 13 00:16:41.079026 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 13 00:16:41.080794 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 13 00:16:41.080851 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 13 00:16:41.110781 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Aug 13 00:16:41.110862 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Aug 13 00:16:41.110876 kernel: [drm] features: -context_init Aug 13 00:16:41.119704 kernel: [drm] number of scanouts: 1 Aug 13 00:16:41.119769 kernel: [drm] number of cap sets: 0 Aug 13 00:16:41.126334 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 13 00:16:41.128461 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Aug 13 00:16:41.134724 kernel: Console: switching to colour frame buffer device 160x50 Aug 13 00:16:41.137544 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Aug 13 00:16:41.138325 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 13 00:16:41.141717 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:16:41.156480 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 13 00:16:41.159957 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:16:41.160140 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:16:41.167818 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:16:41.254529 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:16:41.276828 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Aug 13 00:16:41.284817 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Aug 13 00:16:41.299028 lvm[1444]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Aug 13 00:16:41.326001 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Aug 13 00:16:41.328544 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:16:41.330199 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 00:16:41.332053 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 13 00:16:41.333610 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 13 00:16:41.335074 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 13 00:16:41.336537 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 13 00:16:41.337296 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
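Note: the kernel lines above bring up the virtio-gpu DRM driver (one scanout, EDID support) and hand the console over to its framebuffer. From userspace, the existing DRM nodes and their bound drivers can be read back out of sysfs; a quick sketch assuming the standard /sys/class/drm layout:

import os

DRM = "/sys/class/drm"

def list_drm_nodes():
    if not os.path.isdir(DRM):
        print("no /sys/class/drm; no DRM driver loaded")
        return
    for node in sorted(os.listdir(DRM)):
        # Cards expose their parent device; its uevent names the bound driver.
        uevent = os.path.join(DRM, node, "device", "uevent")
        driver = "?"
        if os.path.isfile(uevent):
            with open(uevent) as f:
                for line in f:
                    if line.startswith("DRIVER="):
                        driver = line.strip().split("=", 1)[1]
        print(f"{node}: driver={driver}")

list_drm_nodes()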
Aug 13 00:16:41.338210 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 13 00:16:41.338270 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:16:41.338914 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:16:41.340815 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 13 00:16:41.343123 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 13 00:16:41.348999 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 13 00:16:41.352203 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Aug 13 00:16:41.353824 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 13 00:16:41.354730 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:16:41.355494 systemd[1]: Reached target basic.target - Basic System. Aug 13 00:16:41.356232 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:16:41.356336 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:16:41.358563 systemd[1]: Starting containerd.service - containerd container runtime... Aug 13 00:16:41.362904 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 13 00:16:41.368464 lvm[1448]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Aug 13 00:16:41.368957 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 13 00:16:41.371870 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 13 00:16:41.374685 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 13 00:16:41.376536 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 13 00:16:41.379928 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 13 00:16:41.382364 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 13 00:16:41.387654 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Aug 13 00:16:41.389540 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 13 00:16:41.394635 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 13 00:16:41.399067 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 13 00:16:41.401848 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 13 00:16:41.402373 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 13 00:16:41.403845 systemd[1]: Starting update-engine.service - Update Engine... Aug 13 00:16:41.405631 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 13 00:16:41.431134 jq[1452]: false Aug 13 00:16:41.442530 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Aug 13 00:16:41.443565 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 13 00:16:41.443740 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
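Note: dbus.socket, sshd.socket and docker.socket above are socket units: systemd opens the listening socket itself and starts the daemon on demand, passing the open descriptor in at fd 3 and advertising it via the LISTEN_PID/LISTEN_FDS environment variables (the sd_listen_fds(3) convention). A minimal activated TCP server showing the handoff, with a plain-socket fallback for running outside systemd:

import os, socket

SD_LISTEN_FDS_START = 3  # first inherited descriptor per sd_listen_fds(3)

def get_listen_socket():
    # systemd sets LISTEN_PID to our PID and LISTEN_FDS to the fd count.
    if (os.environ.get("LISTEN_PID") == str(os.getpid())
            and int(os.environ.get("LISTEN_FDS", "0")) >= 1):
        return socket.socket(fileno=SD_LISTEN_FDS_START)
    # Fallback: open our own socket when not socket-activated.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 8080))
    srv.listen(1)
    return srv

srv = get_listen_socket()
conn, addr = srv.accept()
conn.sendall(b"hello from a (possibly) socket-activated service\n")
conn.close()

The activated path can be exercised without writing a unit file via `systemd-socket-activate -l 8080 python3 server.py`.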
Aug 13 00:16:41.467586 jq[1463]: true Aug 13 00:16:41.466601 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 13 00:16:41.466771 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 13 00:16:41.483802 (ntainerd)[1477]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 13 00:16:41.499236 extend-filesystems[1453]: Found loop4 Aug 13 00:16:41.520671 tar[1470]: linux-arm64/LICENSE Aug 13 00:16:41.520671 tar[1470]: linux-arm64/helm Aug 13 00:16:41.520931 coreos-metadata[1450]: Aug 13 00:16:41.519 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Aug 13 00:16:41.509687 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 13 00:16:41.509480 dbus-daemon[1451]: [system] SELinux support is enabled Aug 13 00:16:41.521366 extend-filesystems[1453]: Found loop5 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found loop6 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found loop7 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda1 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda2 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda3 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found usr Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda4 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda6 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda7 Aug 13 00:16:41.521366 extend-filesystems[1453]: Found sda9 Aug 13 00:16:41.521366 extend-filesystems[1453]: Checking size of /dev/sda9 Aug 13 00:16:41.513677 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 13 00:16:41.589532 jq[1481]: true Aug 13 00:16:41.589717 coreos-metadata[1450]: Aug 13 00:16:41.527 INFO Fetch successful Aug 13 00:16:41.589717 coreos-metadata[1450]: Aug 13 00:16:41.528 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Aug 13 00:16:41.589717 coreos-metadata[1450]: Aug 13 00:16:41.533 INFO Fetch successful Aug 13 00:16:41.513706 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 13 00:16:41.516566 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 13 00:16:41.516587 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 13 00:16:41.566388 systemd[1]: motdgen.service: Deactivated successfully. Aug 13 00:16:41.566614 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 13 00:16:41.590282 extend-filesystems[1453]: Resized partition /dev/sda9 Aug 13 00:16:41.603463 update_engine[1462]: I20250813 00:16:41.596597 1462 main.cc:92] Flatcar Update Engine starting Aug 13 00:16:41.604102 extend-filesystems[1500]: resize2fs 1.47.1 (20-May-2024) Aug 13 00:16:41.617824 update_engine[1462]: I20250813 00:16:41.616806 1462 update_check_scheduler.cc:74] Next update check in 8m19s Aug 13 00:16:41.617162 systemd[1]: Started update-engine.service - Update Engine. 
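Note: coreos-metadata above pulls instance data from Hetzner's link-local metadata service; the same endpoints can be queried by hand. A sketch using only urllib and the URLs from the log; it only works from inside a Hetzner instance, where 169.254.169.254 is reachable:

from urllib.request import urlopen

BASE = "http://169.254.169.254/hetzner/v1/metadata"

def fetch(path=""):
    # Plain HTTP, no auth: the service is only reachable from the instance.
    with urlopen(BASE + path, timeout=5) as resp:
        return resp.read().decode()

print(fetch())                     # full metadata document
print(fetch("/private-networks"))  # second document fetched in the log

Note also that extend-filesystems kicks off an online resize of the root filesystem here; per the kernel and resize2fs lines just below, sda9 grows from 1617920 to 9393147 blocks at 4 KiB each, i.e. the shipped image expanding to the full disk. Worked out:

# ext4 online resize figures from the kernel lines below (4 KiB blocks).
BLOCK = 4096
before, after = 1_617_920, 9_393_147

to_gib = lambda blocks: blocks * BLOCK / 2**30
print(f"before: {to_gib(before):.2f} GiB")           # ~6.17 GiB
print(f"after:  {to_gib(after):.2f} GiB")            # ~35.83 GiB
print(f"growth: {to_gib(after - before):.2f} GiB")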
Aug 13 00:16:41.629501 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Aug 13 00:16:41.628662 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 13 00:16:41.680387 systemd-logind[1461]: New seat seat0. Aug 13 00:16:41.682684 systemd-logind[1461]: Watching system buttons on /dev/input/event0 (Power Button) Aug 13 00:16:41.682921 systemd-logind[1461]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Aug 13 00:16:41.683180 systemd[1]: Started systemd-logind.service - User Login Management. Aug 13 00:16:41.710233 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 13 00:16:41.716524 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1361) Aug 13 00:16:41.724360 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 13 00:16:41.754539 bash[1524]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:16:41.758007 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 13 00:16:41.766701 systemd[1]: Starting sshkeys.service... Aug 13 00:16:41.811133 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Aug 13 00:16:41.833034 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 13 00:16:41.834514 extend-filesystems[1500]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Aug 13 00:16:41.834514 extend-filesystems[1500]: old_desc_blocks = 1, new_desc_blocks = 5 Aug 13 00:16:41.834514 extend-filesystems[1500]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Aug 13 00:16:41.838045 extend-filesystems[1453]: Resized filesystem in /dev/sda9 Aug 13 00:16:41.838045 extend-filesystems[1453]: Found sr0 Aug 13 00:16:41.848978 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 13 00:16:41.850140 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 00:16:41.850383 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 00:16:41.907473 coreos-metadata[1532]: Aug 13 00:16:41.907 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Aug 13 00:16:41.912470 coreos-metadata[1532]: Aug 13 00:16:41.911 INFO Fetch successful Aug 13 00:16:41.915469 containerd[1477]: time="2025-08-13T00:16:41.914198920Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Aug 13 00:16:41.920359 unknown[1532]: wrote ssh authorized keys file for user: core Aug 13 00:16:41.961750 update-ssh-keys[1541]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:16:41.962753 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 13 00:16:41.967185 systemd[1]: Finished sshkeys.service. Aug 13 00:16:41.982880 locksmithd[1506]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 00:16:41.989169 containerd[1477]: time="2025-08-13T00:16:41.988515120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.997008 containerd[1477]: time="2025-08-13T00:16:41.996954960Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.100-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997121120Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997146120Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997365240Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997391720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997535840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Aug 13 00:16:41.997983 containerd[1477]: time="2025-08-13T00:16:41.997554120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.998669 containerd[1477]: time="2025-08-13T00:16:41.998634240Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Aug 13 00:16:41.998861 containerd[1477]: time="2025-08-13T00:16:41.998842680Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.998986 containerd[1477]: time="2025-08-13T00:16:41.998967080Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Aug 13 00:16:41.999882 containerd[1477]: time="2025-08-13T00:16:41.999480120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.999882 containerd[1477]: time="2025-08-13T00:16:41.999617480Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:41.999882 containerd[1477]: time="2025-08-13T00:16:41.999841720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Aug 13 00:16:42.000348 containerd[1477]: time="2025-08-13T00:16:42.000255720Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Aug 13 00:16:42.000625 containerd[1477]: time="2025-08-13T00:16:42.000604520Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Aug 13 00:16:42.001072 containerd[1477]: time="2025-08-13T00:16:42.000778920Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Aug 13 00:16:42.001072 containerd[1477]: time="2025-08-13T00:16:42.000830040Z" level=info msg="metadata content store policy set" policy=shared Aug 13 00:16:42.008461 containerd[1477]: time="2025-08-13T00:16:42.008370840Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Aug 13 00:16:42.008907 containerd[1477]: time="2025-08-13T00:16:42.008567480Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Aug 13 00:16:42.008907 containerd[1477]: time="2025-08-13T00:16:42.008592560Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Aug 13 00:16:42.008907 containerd[1477]: time="2025-08-13T00:16:42.008609120Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Aug 13 00:16:42.008907 containerd[1477]: time="2025-08-13T00:16:42.008693920Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Aug 13 00:16:42.008907 containerd[1477]: time="2025-08-13T00:16:42.008853800Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010037680Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010209560Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010229480Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010284200Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010318040Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010332720Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010347280Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010363080Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010391280Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010406040Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010419520Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010431880Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." 
type=io.containerd.service.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010488960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.011173 containerd[1477]: time="2025-08-13T00:16:42.010509160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010523000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010538800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010551560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010567440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010580120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010595360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010608640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010630040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010649440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010664080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010677080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010710560Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010735120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010747000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.012321 containerd[1477]: time="2025-08-13T00:16:42.010757880Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Aug 13 00:16:42.012673 containerd[1477]: time="2025-08-13T00:16:42.012636560Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Aug 13 00:16:42.012887 containerd[1477]: time="2025-08-13T00:16:42.012870080Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Aug 13 00:16:42.012944 containerd[1477]: time="2025-08-13T00:16:42.012930520Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Aug 13 00:16:42.012999 containerd[1477]: time="2025-08-13T00:16:42.012983720Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Aug 13 00:16:42.013054 containerd[1477]: time="2025-08-13T00:16:42.013041240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.013107 containerd[1477]: time="2025-08-13T00:16:42.013095160Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Aug 13 00:16:42.013156 containerd[1477]: time="2025-08-13T00:16:42.013145120Z" level=info msg="NRI interface is disabled by configuration." Aug 13 00:16:42.013209 containerd[1477]: time="2025-08-13T00:16:42.013195800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.013786200Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false 
IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.013869000Z" level=info msg="Connect containerd service" Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.013917200Z" level=info msg="using legacy CRI server" Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.013924880Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.014019240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Aug 13 00:16:42.016205 containerd[1477]: time="2025-08-13T00:16:42.014896880Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:16:42.017793 containerd[1477]: time="2025-08-13T00:16:42.017747200Z" level=info msg="Start subscribing containerd event" Aug 13 00:16:42.018167 containerd[1477]: time="2025-08-13T00:16:42.018149760Z" level=info msg="Start recovering state" Aug 13 00:16:42.018365 containerd[1477]: time="2025-08-13T00:16:42.018346400Z" level=info msg="Start event monitor" Aug 13 00:16:42.018434 containerd[1477]: time="2025-08-13T00:16:42.018421800Z" level=info msg="Start snapshots syncer" Aug 13 00:16:42.018573 containerd[1477]: time="2025-08-13T00:16:42.018557120Z" level=info msg="Start cni network conf syncer for default" Aug 13 00:16:42.018629 containerd[1477]: time="2025-08-13T00:16:42.018618160Z" level=info msg="Start streaming server" Aug 13 00:16:42.019819 containerd[1477]: time="2025-08-13T00:16:42.019792000Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 00:16:42.020527 containerd[1477]: time="2025-08-13T00:16:42.020506400Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 00:16:42.024319 systemd[1]: Started containerd.service - containerd container runtime. Aug 13 00:16:42.025367 containerd[1477]: time="2025-08-13T00:16:42.025335000Z" level=info msg="containerd successfully booted in 0.117434s" Aug 13 00:16:42.150517 sshd_keygen[1482]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 13 00:16:42.159648 systemd-networkd[1377]: eth0: Gained IPv6LL Aug 13 00:16:42.160165 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Aug 13 00:16:42.164341 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 00:16:42.166040 systemd[1]: Reached target network-online.target - Network is Online. Aug 13 00:16:42.177712 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:16:42.185920 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 00:16:42.196401 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 13 00:16:42.204088 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 13 00:16:42.222716 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 00:16:42.222904 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 00:16:42.230954 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
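Note: during plugin init above, containerd probes each snapshotter and skips the ones whose backing store does not qualify; the btrfs snapshotter, for instance, is skipped because /var/lib/containerd sits on ext4. The underlying question, "which filesystem backs this path", can be answered by scanning /proc/self/mounts for the longest mount point prefixing the path; a sketch:

import os

def fs_type(path):
    # Longest-prefix match of the path against mounted filesystems.
    path = os.path.realpath(path)
    best, best_type = "", "unknown"
    with open("/proc/self/mounts") as f:
        for line in f:
            _dev, mnt, typ = line.split()[:3]
            mnt = mnt.replace("\\040", " ")  # /proc escapes spaces as \040
            if (path == mnt or path.startswith(mnt.rstrip("/") + "/")) \
                    and len(mnt) > len(best):
                best, best_type = mnt, typ
    return best_type

target = "/var/lib/containerd"
print(f"{target} is on {fs_type(target)}; "
      f"btrfs snapshotter usable: {fs_type(target) == 'btrfs'}")

The one error in the otherwise clean containerd startup is the CRI plugin finding no CNI config in /etc/cni/net.d, which is expected at this stage: the network add-on that installs one has not run yet. For shape only, a sketch that writes a minimal bridge conflist of the kind CNI expects; the network name, subnet and file name are invented placeholders, not anything Flatcar or a particular add-on installs:

import json, os

# Hypothetical example; real clusters get their conflist from the CNI add-on.
conflist = {
    "cniVersion": "0.4.0",
    "name": "examplenet",
    "plugins": [
        {
            "type": "bridge",
            "bridge": "cni0",
            "isGateway": True,
            "ipMasq": True,
            "ipam": {
                "type": "host-local",
                "subnet": "10.88.0.0/16",
                "routes": [{"dst": "0.0.0.0/0"}],
            },
        },
        {"type": "portmap", "capabilities": {"portMappings": True}},
    ],
}

os.makedirs("/etc/cni/net.d", exist_ok=True)
with open("/etc/cni/net.d/10-examplenet.conflist", "w") as f:
    json.dump(conflist, f, indent=2)
print("wrote /etc/cni/net.d/10-examplenet.conflist")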
Aug 13 00:16:42.244013 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 00:16:42.264793 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 00:16:42.276038 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 13 00:16:42.283815 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Aug 13 00:16:42.284667 systemd[1]: Reached target getty.target - Login Prompts. Aug 13 00:16:42.300496 tar[1470]: linux-arm64/README.md Aug 13 00:16:42.315487 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 13 00:16:42.863761 systemd-networkd[1377]: eth1: Gained IPv6LL Aug 13 00:16:42.864351 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Aug 13 00:16:43.110804 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:16:43.113877 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 13 00:16:43.114842 (kubelet)[1582]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:16:43.117737 systemd[1]: Startup finished in 815ms (kernel) + 7.950s (initrd) + 4.510s (userspace) = 13.277s. Aug 13 00:16:43.750961 kubelet[1582]: E0813 00:16:43.750884 1582 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:16:43.754150 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:16:43.754406 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:16:43.755102 systemd[1]: kubelet.service: Consumed 1.026s CPU time. Aug 13 00:16:54.005084 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 00:16:54.017781 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:16:54.142790 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:16:54.145271 (kubelet)[1601]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:16:54.191460 kubelet[1601]: E0813 00:16:54.191376 1601 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:16:54.196842 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:16:54.197271 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:04.447872 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 13 00:17:04.460856 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:04.599719 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
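Note: each kubelet start above, and every retry that follows, dies the same way: /var/lib/kubelet/config.yaml does not exist, so the process exits and systemd restarts it on a back-off timer (the "Scheduled restart job, restart counter" lines). That file is normally written by kubeadm during init/join, and the loop ends once it appears. Purely to show the expected shape, a sketch that drops in a skeleton KubeletConfiguration; the field values are illustrative defaults, not what kubeadm would generate for this node:

import os

# Skeleton of the file kubeadm normally creates; values are placeholders.
SKELETON = """\
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
staticPodPath: /etc/kubernetes/manifests
authentication:
  anonymous:
    enabled: false
"""

path = "/var/lib/kubelet/config.yaml"
os.makedirs(os.path.dirname(path), exist_ok=True)
if os.path.exists(path):
    print(f"{path} already present, leaving it alone")
else:
    with open(path, "w") as f:  # never clobber a real config
        f.write(SKELETON)
    print(f"wrote skeleton {path}")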
Aug 13 00:17:04.613514 (kubelet)[1617]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:04.680581 kubelet[1617]: E0813 00:17:04.680470 1617 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:04.683552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:04.683786 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:13.305018 systemd-timesyncd[1379]: Contacted time server 5.45.97.204:123 (2.flatcar.pool.ntp.org). Aug 13 00:17:13.305134 systemd-timesyncd[1379]: Initial clock synchronization to Wed 2025-08-13 00:17:13.129610 UTC. Aug 13 00:17:14.934604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 13 00:17:14.946776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:15.079435 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:17:15.092005 (kubelet)[1631]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:15.145147 kubelet[1631]: E0813 00:17:15.145068 1631 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:15.149009 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:15.149211 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:25.399637 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Aug 13 00:17:25.412827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:25.525704 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:17:25.530166 (kubelet)[1646]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:25.576132 kubelet[1646]: E0813 00:17:25.576066 1646 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:25.579359 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:25.579551 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:26.811843 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 00:17:26.817773 systemd[1]: Started sshd@0-91.99.89.242:22-139.178.89.65:47182.service - OpenSSH per-connection server daemon (139.178.89.65:47182). Aug 13 00:17:26.861666 update_engine[1462]: I20250813 00:17:26.861523 1462 update_attempter.cc:509] Updating boot flags... 
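Note: with the network up, systemd-timesyncd above reaches 5.45.97.204 (2.flatcar.pool.ntp.org) and performs its initial clock synchronization. The wire exchange is small enough to sketch: a toy SNTPv4 query that reads the server's transmit timestamp. timesyncd itself speaks full NTP with filtering and slewing, so this only gives the flavor of it:

import socket, struct, time

NTP_UNIX_DELTA = 2_208_988_800  # seconds from 1900-01-01 to 1970-01-01

def sntp_query(server="2.flatcar.pool.ntp.org", port=123, timeout=5):
    # 48-byte request: LI=0, VN=4, Mode=3 (client) packed into the first byte.
    packet = b"\x23" + 47 * b"\x00"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(packet, (server, port))
        data, _ = s.recvfrom(48)
    # Transmit timestamp seconds: big-endian uint32 at byte offset 40.
    secs = struct.unpack("!I", data[40:44])[0] - NTP_UNIX_DELTA
    return secs

remote = sntp_query()
print(f"server says {time.ctime(remote)}, offset {remote - time.time():+.1f}s")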
Aug 13 00:17:26.911596 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1665) Aug 13 00:17:26.956488 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1666) Aug 13 00:17:26.999476 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1666) Aug 13 00:17:27.812237 sshd[1654]: Accepted publickey for core from 139.178.89.65 port 47182 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:27.815951 sshd[1654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:27.825791 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 00:17:27.833136 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 00:17:27.836343 systemd-logind[1461]: New session 1 of user core. Aug 13 00:17:27.846212 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 13 00:17:27.853007 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 13 00:17:27.858500 (systemd)[1679]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 00:17:27.970977 systemd[1679]: Queued start job for default target default.target. Aug 13 00:17:27.986647 systemd[1679]: Created slice app.slice - User Application Slice. Aug 13 00:17:27.986884 systemd[1679]: Reached target paths.target - Paths. Aug 13 00:17:27.987003 systemd[1679]: Reached target timers.target - Timers. Aug 13 00:17:27.989092 systemd[1679]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 00:17:28.004890 systemd[1679]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 00:17:28.005087 systemd[1679]: Reached target sockets.target - Sockets. Aug 13 00:17:28.005114 systemd[1679]: Reached target basic.target - Basic System. Aug 13 00:17:28.005188 systemd[1679]: Reached target default.target - Main User Target. Aug 13 00:17:28.005235 systemd[1679]: Startup finished in 139ms. Aug 13 00:17:28.005362 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 00:17:28.016751 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 13 00:17:28.740157 systemd[1]: Started sshd@1-91.99.89.242:22-139.178.89.65:47188.service - OpenSSH per-connection server daemon (139.178.89.65:47188). Aug 13 00:17:29.797324 sshd[1690]: Accepted publickey for core from 139.178.89.65 port 47188 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:29.799389 sshd[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:29.804772 systemd-logind[1461]: New session 2 of user core. Aug 13 00:17:29.812765 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 13 00:17:30.524563 sshd[1690]: pam_unix(sshd:session): session closed for user core Aug 13 00:17:30.530611 systemd-logind[1461]: Session 2 logged out. Waiting for processes to exit. Aug 13 00:17:30.531366 systemd[1]: sshd@1-91.99.89.242:22-139.178.89.65:47188.service: Deactivated successfully. Aug 13 00:17:30.533650 systemd[1]: session-2.scope: Deactivated successfully. Aug 13 00:17:30.535086 systemd-logind[1461]: Removed session 2. Aug 13 00:17:30.689793 systemd[1]: Started sshd@2-91.99.89.242:22-139.178.89.65:35050.service - OpenSSH per-connection server daemon (139.178.89.65:35050). 
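Note: the sshd records above have a stable shape ("Accepted publickey for core from 139.178.89.65 port 47182 ssh2: RSA SHA256:..."), which makes login auditing a one-regex job. A sketch that pulls user, source and key fingerprint out of such lines, written against the exact format seen in this log:

import re

ACCEPTED = re.compile(
    r"Accepted publickey for (?P<user>\S+) from (?P<ip>\S+) port (?P<port>\d+) "
    r"ssh2: (?P<keytype>\S+) (?P<fingerprint>\S+)"
)

sample = ("Accepted publickey for core from 139.178.89.65 port 47182 "
          "ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0")

m = ACCEPTED.search(sample)
if m:
    print(f"user={m['user']} source={m['ip']}:{m['port']} "
          f"key={m['keytype']} fp={m['fingerprint']}")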
Aug 13 00:17:31.685706 sshd[1697]: Accepted publickey for core from 139.178.89.65 port 35050 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:31.687790 sshd[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:31.694362 systemd-logind[1461]: New session 3 of user core. Aug 13 00:17:31.697673 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 13 00:17:32.372529 sshd[1697]: pam_unix(sshd:session): session closed for user core Aug 13 00:17:32.378644 systemd[1]: sshd@2-91.99.89.242:22-139.178.89.65:35050.service: Deactivated successfully. Aug 13 00:17:32.380421 systemd[1]: session-3.scope: Deactivated successfully. Aug 13 00:17:32.381313 systemd-logind[1461]: Session 3 logged out. Waiting for processes to exit. Aug 13 00:17:32.382863 systemd-logind[1461]: Removed session 3. Aug 13 00:17:32.555903 systemd[1]: Started sshd@3-91.99.89.242:22-139.178.89.65:35058.service - OpenSSH per-connection server daemon (139.178.89.65:35058). Aug 13 00:17:33.546811 sshd[1704]: Accepted publickey for core from 139.178.89.65 port 35058 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:33.550349 sshd[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:33.559289 systemd-logind[1461]: New session 4 of user core. Aug 13 00:17:33.565819 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 13 00:17:34.237884 sshd[1704]: pam_unix(sshd:session): session closed for user core Aug 13 00:17:34.242367 systemd-logind[1461]: Session 4 logged out. Waiting for processes to exit. Aug 13 00:17:34.242564 systemd[1]: sshd@3-91.99.89.242:22-139.178.89.65:35058.service: Deactivated successfully. Aug 13 00:17:34.244229 systemd[1]: session-4.scope: Deactivated successfully. Aug 13 00:17:34.247196 systemd-logind[1461]: Removed session 4. Aug 13 00:17:34.422998 systemd[1]: Started sshd@4-91.99.89.242:22-139.178.89.65:35064.service - OpenSSH per-connection server daemon (139.178.89.65:35064). Aug 13 00:17:35.416267 sshd[1711]: Accepted publickey for core from 139.178.89.65 port 35064 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:35.418839 sshd[1711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:35.424939 systemd-logind[1461]: New session 5 of user core. Aug 13 00:17:35.431761 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 13 00:17:35.772269 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Aug 13 00:17:35.783770 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:35.920294 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 00:17:35.925728 (kubelet)[1722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:35.960998 sudo[1727]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 13 00:17:35.961901 sudo[1727]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:17:35.974951 sudo[1727]: pam_unix(sudo:session): session closed for user root Aug 13 00:17:35.985025 kubelet[1722]: E0813 00:17:35.984937 1722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:35.990146 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:35.990396 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:36.137705 sshd[1711]: pam_unix(sshd:session): session closed for user core Aug 13 00:17:36.142905 systemd[1]: sshd@4-91.99.89.242:22-139.178.89.65:35064.service: Deactivated successfully. Aug 13 00:17:36.144701 systemd[1]: session-5.scope: Deactivated successfully. Aug 13 00:17:36.146621 systemd-logind[1461]: Session 5 logged out. Waiting for processes to exit. Aug 13 00:17:36.148087 systemd-logind[1461]: Removed session 5. Aug 13 00:17:36.309790 systemd[1]: Started sshd@5-91.99.89.242:22-139.178.89.65:35076.service - OpenSSH per-connection server daemon (139.178.89.65:35076). Aug 13 00:17:37.311895 sshd[1735]: Accepted publickey for core from 139.178.89.65 port 35076 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:37.313915 sshd[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:37.320540 systemd-logind[1461]: New session 6 of user core. Aug 13 00:17:37.326787 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 13 00:17:37.846740 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 13 00:17:37.847805 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:17:37.853911 sudo[1739]: pam_unix(sudo:session): session closed for user root Aug 13 00:17:37.860837 sudo[1738]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Aug 13 00:17:37.861127 sudo[1738]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:17:37.877813 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Aug 13 00:17:37.888423 auditctl[1742]: No rules Aug 13 00:17:37.889262 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 00:17:37.889613 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Aug 13 00:17:37.897097 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Aug 13 00:17:37.933927 augenrules[1760]: No rules Aug 13 00:17:37.935860 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Aug 13 00:17:37.937876 sudo[1738]: pam_unix(sudo:session): session closed for user root Aug 13 00:17:38.099944 sshd[1735]: pam_unix(sshd:session): session closed for user core Aug 13 00:17:38.105107 systemd[1]: sshd@5-91.99.89.242:22-139.178.89.65:35076.service: Deactivated successfully. 
Aug 13 00:17:38.107219 systemd[1]: session-6.scope: Deactivated successfully. Aug 13 00:17:38.109356 systemd-logind[1461]: Session 6 logged out. Waiting for processes to exit. Aug 13 00:17:38.110467 systemd-logind[1461]: Removed session 6. Aug 13 00:17:38.277374 systemd[1]: Started sshd@6-91.99.89.242:22-139.178.89.65:35078.service - OpenSSH per-connection server daemon (139.178.89.65:35078). Aug 13 00:17:39.288979 sshd[1768]: Accepted publickey for core from 139.178.89.65 port 35078 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0 Aug 13 00:17:39.291328 sshd[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:17:39.297804 systemd-logind[1461]: New session 7 of user core. Aug 13 00:17:39.313925 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 13 00:17:39.820067 sudo[1771]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 13 00:17:39.820414 sudo[1771]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:17:40.138967 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 13 00:17:40.141338 (dockerd)[1787]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 13 00:17:40.403697 dockerd[1787]: time="2025-08-13T00:17:40.402499849Z" level=info msg="Starting up" Aug 13 00:17:40.505795 dockerd[1787]: time="2025-08-13T00:17:40.505713347Z" level=info msg="Loading containers: start." Aug 13 00:17:40.632478 kernel: Initializing XFRM netlink socket Aug 13 00:17:40.723411 systemd-networkd[1377]: docker0: Link UP Aug 13 00:17:40.742937 dockerd[1787]: time="2025-08-13T00:17:40.742854847Z" level=info msg="Loading containers: done." Aug 13 00:17:40.757902 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3476857720-merged.mount: Deactivated successfully. Aug 13 00:17:40.759400 dockerd[1787]: time="2025-08-13T00:17:40.758931949Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 13 00:17:40.759400 dockerd[1787]: time="2025-08-13T00:17:40.759059782Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Aug 13 00:17:40.759400 dockerd[1787]: time="2025-08-13T00:17:40.759178062Z" level=info msg="Daemon has completed initialization" Aug 13 00:17:40.803391 dockerd[1787]: time="2025-08-13T00:17:40.803225494Z" level=info msg="API listen on /run/docker.sock" Aug 13 00:17:40.803925 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 13 00:17:41.597003 containerd[1477]: time="2025-08-13T00:17:41.596823071Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\"" Aug 13 00:17:42.298154 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount36517527.mount: Deactivated successfully. 
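Note: dockerd above finishes initialization and serves its API on /run/docker.sock, which is plain HTTP over a unix socket. It can be poked with nothing but the standard library by pointing http.client at an AF_UNIX connection; /_ping is a stable, version-independent endpoint that answers "OK":

import http.client, socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """http.client over an AF_UNIX socket; the host argument is cosmetic."""

    def __init__(self, socket_path):
        super().__init__("localhost")
        self.socket_path = socket_path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.socket_path)

conn = UnixHTTPConnection("/run/docker.sock")
conn.request("GET", "/_ping")
resp = conn.getresponse()
print(resp.status, resp.read().decode())  # expect: 200 OK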
Aug 13 00:17:43.899687 containerd[1477]: time="2025-08-13T00:17:43.899607400Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:43.901579 containerd[1477]: time="2025-08-13T00:17:43.901500577Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.3: active requests=0, bytes read=27352186" Aug 13 00:17:43.902741 containerd[1477]: time="2025-08-13T00:17:43.902611991Z" level=info msg="ImageCreate event name:\"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:43.906476 containerd[1477]: time="2025-08-13T00:17:43.906386990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:43.909138 containerd[1477]: time="2025-08-13T00:17:43.908325667Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.3\" with image id \"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\", size \"27348894\" in 2.311441669s" Aug 13 00:17:43.909138 containerd[1477]: time="2025-08-13T00:17:43.908429580Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\" returns image reference \"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\"" Aug 13 00:17:43.911073 containerd[1477]: time="2025-08-13T00:17:43.910958307Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\"" Aug 13 00:17:45.877810 containerd[1477]: time="2025-08-13T00:17:45.877733469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:45.879852 containerd[1477]: time="2025-08-13T00:17:45.879782186Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.3: active requests=0, bytes read=23537866" Aug 13 00:17:45.880640 containerd[1477]: time="2025-08-13T00:17:45.880158174Z" level=info msg="ImageCreate event name:\"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:45.884667 containerd[1477]: time="2025-08-13T00:17:45.884578415Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:45.886016 containerd[1477]: time="2025-08-13T00:17:45.885795425Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.3\" with image id \"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\", size \"25092764\" in 1.974482958s" Aug 13 00:17:45.886016 containerd[1477]: time="2025-08-13T00:17:45.885847047Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\" returns image reference \"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\"" Aug 13 00:17:45.886747 
containerd[1477]: time="2025-08-13T00:17:45.886623013Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\"" Aug 13 00:17:46.021657 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Aug 13 00:17:46.034318 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:46.149735 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:17:46.154826 (kubelet)[1990]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:46.197470 kubelet[1990]: E0813 00:17:46.197393 1990 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:46.200341 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:46.200522 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:47.528491 containerd[1477]: time="2025-08-13T00:17:47.528223357Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:47.530057 containerd[1477]: time="2025-08-13T00:17:47.529994498Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.3: active requests=0, bytes read=18293544" Aug 13 00:17:47.531979 containerd[1477]: time="2025-08-13T00:17:47.531920354Z" level=info msg="ImageCreate event name:\"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:47.537583 containerd[1477]: time="2025-08-13T00:17:47.537482049Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:47.541508 containerd[1477]: time="2025-08-13T00:17:47.540647424Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.3\" with image id \"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\", size \"19848460\" in 1.65398458s" Aug 13 00:17:47.541508 containerd[1477]: time="2025-08-13T00:17:47.540773580Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\" returns image reference \"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\"" Aug 13 00:17:47.542075 containerd[1477]: time="2025-08-13T00:17:47.541978620Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\"" Aug 13 00:17:48.665595 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2429031030.mount: Deactivated successfully. 
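
Note that the kubelet keeps crash-looping through these image pulls: /var/lib/kubelet/config.yaml does not exist yet (that file is typically written by kubeadm during init/join), so systemd simply schedules another restart each time. A sketch of watching the loop from the node:

    systemctl show kubelet -p NRestarts -p Result -p ExecMainStatus
    journalctl -u kubelet -n 20 --no-pager   # ends in the config.yaml "no such file or directory" error
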
Aug 13 00:17:48.997294 containerd[1477]: time="2025-08-13T00:17:48.997120415Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:48.998781 containerd[1477]: time="2025-08-13T00:17:48.998678206Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.3: active requests=0, bytes read=28199498" Aug 13 00:17:49.000244 containerd[1477]: time="2025-08-13T00:17:49.000181439Z" level=info msg="ImageCreate event name:\"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:49.004086 containerd[1477]: time="2025-08-13T00:17:49.003915248Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:49.005519 containerd[1477]: time="2025-08-13T00:17:49.005162931Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.3\" with image id \"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\", repo tag \"registry.k8s.io/kube-proxy:v1.33.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\", size \"28198491\" in 1.463109073s" Aug 13 00:17:49.005519 containerd[1477]: time="2025-08-13T00:17:49.005207850Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\" returns image reference \"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\"" Aug 13 00:17:49.006042 containerd[1477]: time="2025-08-13T00:17:49.005848751Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 13 00:17:49.617495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3490462830.mount: Deactivated successfully. 
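
The pull messages carry enough data for a back-of-the-envelope throughput figure; for kube-proxy above, 28198491 bytes in 1.463109073s works out to roughly 18 MiB/s:

    awk 'BEGIN { printf "%.1f MiB/s\n", 28198491 / 1.463109073 / 1048576 }'   # -> 18.4 MiB/s
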
Aug 13 00:17:50.387137 containerd[1477]: time="2025-08-13T00:17:50.387035976Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:50.389270 containerd[1477]: time="2025-08-13T00:17:50.389089559Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Aug 13 00:17:50.390697 containerd[1477]: time="2025-08-13T00:17:50.390629915Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:50.396969 containerd[1477]: time="2025-08-13T00:17:50.396907939Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:50.399092 containerd[1477]: time="2025-08-13T00:17:50.398929362Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.393037973s" Aug 13 00:17:50.399092 containerd[1477]: time="2025-08-13T00:17:50.398983121Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Aug 13 00:17:50.399500 containerd[1477]: time="2025-08-13T00:17:50.399473787Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 13 00:17:50.998163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3000482649.mount: Deactivated successfully. 
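
All of these ImageCreate/Pulled events flow through containerd's CRI, so the same images are visible from the node once pulled (a sketch, assuming crictl is pointed at the containerd socket):

    crictl images --digests | grep -E 'kube-|coredns|pause|etcd'
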
Aug 13 00:17:51.007803 containerd[1477]: time="2025-08-13T00:17:51.007570688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:51.009476 containerd[1477]: time="2025-08-13T00:17:51.009400919Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Aug 13 00:17:51.011194 containerd[1477]: time="2025-08-13T00:17:51.011097594Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:51.013682 containerd[1477]: time="2025-08-13T00:17:51.013623447Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:51.014709 containerd[1477]: time="2025-08-13T00:17:51.014495624Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 614.88808ms" Aug 13 00:17:51.014709 containerd[1477]: time="2025-08-13T00:17:51.014538982Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Aug 13 00:17:51.016002 containerd[1477]: time="2025-08-13T00:17:51.015709831Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 13 00:17:51.554135 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3309375443.mount: Deactivated successfully. Aug 13 00:17:54.487997 containerd[1477]: time="2025-08-13T00:17:54.487872410Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:54.490302 containerd[1477]: time="2025-08-13T00:17:54.490147919Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69334637" Aug 13 00:17:54.491381 containerd[1477]: time="2025-08-13T00:17:54.490877102Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:54.495612 containerd[1477]: time="2025-08-13T00:17:54.495551717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:17:54.498304 containerd[1477]: time="2025-08-13T00:17:54.497036083Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.481285692s" Aug 13 00:17:54.498304 containerd[1477]: time="2025-08-13T00:17:54.497084162Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Aug 13 00:17:56.272577 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. 
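
"Restart counter is at 7" means systemd has respawned the kubelet seven times under the unit's Restart= policy; the policy and its rate limits ship in the unit file itself:

    systemctl cat kubelet | grep -E 'Restart|StartLimit'
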
Aug 13 00:17:56.281838 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:56.426829 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:17:56.430057 (kubelet)[2150]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:17:56.477404 kubelet[2150]: E0813 00:17:56.477357 2150 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:17:56.481121 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:17:56.481560 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:17:59.704115 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:17:59.713971 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:17:59.747385 systemd[1]: Reloading requested from client PID 2164 ('systemctl') (unit session-7.scope)... Aug 13 00:17:59.747403 systemd[1]: Reloading... Aug 13 00:17:59.896764 zram_generator::config[2204]: No configuration found. Aug 13 00:17:59.978560 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:18:00.051132 systemd[1]: Reloading finished in 302 ms. Aug 13 00:18:00.118379 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:18:00.124001 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:18:00.127670 systemd[1]: kubelet.service: Deactivated successfully. Aug 13 00:18:00.128364 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:18:00.137901 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:18:00.284401 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:18:00.294961 (kubelet)[2254]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 00:18:00.343553 kubelet[2254]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:18:00.344527 kubelet[2254]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 00:18:00.344527 kubelet[2254]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
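
The three deprecation warnings all point the same way: --container-runtime-endpoint and --volume-plugin-dir are meant to live in the KubeletConfiguration file rather than on the command line. A hypothetical sketch of the equivalent config (v1beta1 field names; the volume plugin dir is the one this kubelet logs further down, while the runtime endpoint is the usual containerd default and an assumption here):

    cat <<'EOF' > /tmp/kubelet-config-sketch.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock   # assumed default, not shown in this log
    volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/
    EOF
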
Aug 13 00:18:00.344527 kubelet[2254]: I0813 00:18:00.343980 2254 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 00:18:00.959955 kubelet[2254]: I0813 00:18:00.959874 2254 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 13 00:18:00.959955 kubelet[2254]: I0813 00:18:00.959932 2254 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 00:18:00.960390 kubelet[2254]: I0813 00:18:00.960344 2254 server.go:956] "Client rotation is on, will bootstrap in background" Aug 13 00:18:00.994016 kubelet[2254]: E0813 00:18:00.993975 2254 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://91.99.89.242:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 13 00:18:00.995694 kubelet[2254]: I0813 00:18:00.995401 2254 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 00:18:01.008579 kubelet[2254]: E0813 00:18:01.008474 2254 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Aug 13 00:18:01.008579 kubelet[2254]: I0813 00:18:01.008540 2254 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Aug 13 00:18:01.012996 kubelet[2254]: I0813 00:18:01.012783 2254 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 00:18:01.013293 kubelet[2254]: I0813 00:18:01.013240 2254 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 00:18:01.013542 kubelet[2254]: I0813 00:18:01.013280 2254 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-8-f2ca23fedd","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 13 00:18:01.013655 kubelet[2254]: I0813 00:18:01.013612 2254 topology_manager.go:138] "Creating topology manager with none policy" Aug 13 00:18:01.013655 kubelet[2254]: I0813 00:18:01.013623 2254 container_manager_linux.go:303] "Creating device plugin manager" Aug 13 00:18:01.013904 kubelet[2254]: I0813 00:18:01.013869 2254 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:18:01.017927 kubelet[2254]: I0813 00:18:01.017841 2254 kubelet.go:480] "Attempting to sync node with API server" Aug 13 00:18:01.017927 kubelet[2254]: I0813 00:18:01.017878 2254 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 00:18:01.018597 kubelet[2254]: I0813 00:18:01.018527 2254 kubelet.go:386] "Adding apiserver pod source" Aug 13 00:18:01.018597 kubelet[2254]: I0813 00:18:01.018566 2254 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 00:18:01.027242 kubelet[2254]: E0813 00:18:01.026678 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.89.242:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-8-f2ca23fedd&limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 13 00:18:01.029118 kubelet[2254]: E0813 00:18:01.028284 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://91.99.89.242:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Aug 13 00:18:01.029118 kubelet[2254]: I0813 00:18:01.028435 2254 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Aug 13 00:18:01.029287 kubelet[2254]: I0813 00:18:01.029208 2254 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 13 00:18:01.029449 kubelet[2254]: W0813 00:18:01.029405 2254 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Aug 13 00:18:01.034017 kubelet[2254]: I0813 00:18:01.033975 2254 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 13 00:18:01.034184 kubelet[2254]: I0813 00:18:01.034146 2254 server.go:1289] "Started kubelet" Aug 13 00:18:01.036684 kubelet[2254]: I0813 00:18:01.036592 2254 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 00:18:01.038471 kubelet[2254]: E0813 00:18:01.037073 2254 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.89.242:6443/api/v1/namespaces/default/events\": dial tcp 91.99.89.242:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-5-8-f2ca23fedd.185b2b79981cc774 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-5-8-f2ca23fedd,UID:ci-4081-3-5-8-f2ca23fedd,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-8-f2ca23fedd,},FirstTimestamp:2025-08-13 00:18:01.034000244 +0000 UTC m=+0.733600063,LastTimestamp:2025-08-13 00:18:01.034000244 +0000 UTC m=+0.733600063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-8-f2ca23fedd,}" Aug 13 00:18:01.040776 kubelet[2254]: I0813 00:18:01.040645 2254 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 00:18:01.042020 kubelet[2254]: I0813 00:18:01.041981 2254 server.go:317] "Adding debug handlers to kubelet server" Aug 13 00:18:01.045945 kubelet[2254]: I0813 00:18:01.045573 2254 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 00:18:01.045945 kubelet[2254]: I0813 00:18:01.045876 2254 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 00:18:01.046179 kubelet[2254]: I0813 00:18:01.046128 2254 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 00:18:01.047578 kubelet[2254]: I0813 00:18:01.047023 2254 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 13 00:18:01.047578 kubelet[2254]: E0813 00:18:01.047367 2254 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" Aug 13 00:18:01.050063 kubelet[2254]: I0813 00:18:01.050030 2254 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 13 00:18:01.051689 kubelet[2254]: I0813 00:18:01.050357 2254 reconciler.go:26] "Reconciler: start to sync state" Aug 13 00:18:01.051689 kubelet[2254]: E0813 00:18:01.050959 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.89.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial 
tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 13 00:18:01.051689 kubelet[2254]: E0813 00:18:01.051043 2254 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.89.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-8-f2ca23fedd?timeout=10s\": dial tcp 91.99.89.242:6443: connect: connection refused" interval="200ms" Aug 13 00:18:01.052277 kubelet[2254]: E0813 00:18:01.052093 2254 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 13 00:18:01.053122 kubelet[2254]: I0813 00:18:01.053057 2254 factory.go:223] Registration of the systemd container factory successfully Aug 13 00:18:01.053257 kubelet[2254]: I0813 00:18:01.053159 2254 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 13 00:18:01.056472 kubelet[2254]: I0813 00:18:01.055338 2254 factory.go:223] Registration of the containerd container factory successfully Aug 13 00:18:01.073317 kubelet[2254]: I0813 00:18:01.073240 2254 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 13 00:18:01.074654 kubelet[2254]: I0813 00:18:01.074613 2254 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 13 00:18:01.074654 kubelet[2254]: I0813 00:18:01.074646 2254 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 13 00:18:01.074813 kubelet[2254]: I0813 00:18:01.074666 2254 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 13 00:18:01.074813 kubelet[2254]: I0813 00:18:01.074673 2254 kubelet.go:2436] "Starting kubelet main sync loop" Aug 13 00:18:01.074813 kubelet[2254]: E0813 00:18:01.074717 2254 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 00:18:01.082119 kubelet[2254]: E0813 00:18:01.082082 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.89.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 13 00:18:01.083743 kubelet[2254]: I0813 00:18:01.083710 2254 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 13 00:18:01.083743 kubelet[2254]: I0813 00:18:01.083731 2254 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 13 00:18:01.083743 kubelet[2254]: I0813 00:18:01.083754 2254 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:18:01.086079 kubelet[2254]: I0813 00:18:01.086034 2254 policy_none.go:49] "None policy: Start" Aug 13 00:18:01.086079 kubelet[2254]: I0813 00:18:01.086071 2254 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 13 00:18:01.086079 kubelet[2254]: I0813 00:18:01.086086 2254 state_mem.go:35] "Initializing new in-memory state store" Aug 13 00:18:01.092324 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 13 00:18:01.110740 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
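
Every "91.99.89.242:6443: connect: connection refused" in this block is the same chicken-and-egg: the kubelet is up before the API server it is about to launch as a static pod, so the CSR, the watches, the lease and the event posts all fail until that pod starts answering. Easy to confirm from the node while it happens (sketch):

    ss -tlnp | grep ':6443' || echo "apiserver not listening yet"
    curl -sk https://91.99.89.242:6443/healthz; echo
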
Aug 13 00:18:01.126887 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 13 00:18:01.130840 kubelet[2254]: E0813 00:18:01.129142 2254 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 13 00:18:01.130840 kubelet[2254]: I0813 00:18:01.129384 2254 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 00:18:01.130840 kubelet[2254]: I0813 00:18:01.129395 2254 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 00:18:01.130840 kubelet[2254]: I0813 00:18:01.130074 2254 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 00:18:01.132961 kubelet[2254]: E0813 00:18:01.132759 2254 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 13 00:18:01.132961 kubelet[2254]: E0813 00:18:01.132812 2254 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-5-8-f2ca23fedd\" not found" Aug 13 00:18:01.193048 systemd[1]: Created slice kubepods-burstable-podd9f966240d5aa9a09dbec92f0f992935.slice - libcontainer container kubepods-burstable-podd9f966240d5aa9a09dbec92f0f992935.slice. Aug 13 00:18:01.206399 kubelet[2254]: E0813 00:18:01.206122 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.209661 systemd[1]: Created slice kubepods-burstable-pod6fd1689e83d17465182663f85780d840.slice - libcontainer container kubepods-burstable-pod6fd1689e83d17465182663f85780d840.slice. Aug 13 00:18:01.221022 kubelet[2254]: E0813 00:18:01.220798 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.225663 systemd[1]: Created slice kubepods-burstable-pod5f6f0dc7df8c3512f9c19c173063c3f3.slice - libcontainer container kubepods-burstable-pod5f6f0dc7df8c3512f9c19c173063c3f3.slice. 
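
The slice creations map directly onto the kubelet's cgroup layout: kubepods.slice on top, the QoS slices (besteffort, burstable) beneath it, then one slice per pod UID; the per-pod slices here belong to the static control-plane pods read from /etc/kubernetes/manifests. A sketch of inspecting both ends:

    ls /etc/kubernetes/manifests/               # static pod definitions
    systemd-cgls /kubepods.slice | head -n 20   # kubepods -> burstable -> per-pod slices
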
Aug 13 00:18:01.228067 kubelet[2254]: E0813 00:18:01.227944 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.232225 kubelet[2254]: I0813 00:18:01.232007 2254 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.232952 kubelet[2254]: E0813 00:18:01.232896 2254 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.89.242:6443/api/v1/nodes\": dial tcp 91.99.89.242:6443: connect: connection refused" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.250849 kubelet[2254]: I0813 00:18:01.250670 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.250849 kubelet[2254]: I0813 00:18:01.250754 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.250849 kubelet[2254]: I0813 00:18:01.250795 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.251806 kubelet[2254]: E0813 00:18:01.251747 2254 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.89.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-8-f2ca23fedd?timeout=10s\": dial tcp 91.99.89.242:6443: connect: connection refused" interval="400ms" Aug 13 00:18:01.352069 kubelet[2254]: I0813 00:18:01.351982 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.352069 kubelet[2254]: I0813 00:18:01.352059 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.353258 kubelet[2254]: I0813 00:18:01.352106 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f6f0dc7df8c3512f9c19c173063c3f3-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-8-f2ca23fedd\" (UID: \"5f6f0dc7df8c3512f9c19c173063c3f3\") " 
pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.353258 kubelet[2254]: I0813 00:18:01.352202 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.353258 kubelet[2254]: I0813 00:18:01.352227 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.353258 kubelet[2254]: I0813 00:18:01.352254 2254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.435868 kubelet[2254]: I0813 00:18:01.435826 2254 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.436408 kubelet[2254]: E0813 00:18:01.436352 2254 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.89.242:6443/api/v1/nodes\": dial tcp 91.99.89.242:6443: connect: connection refused" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.508990 containerd[1477]: time="2025-08-13T00:18:01.508795187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-8-f2ca23fedd,Uid:d9f966240d5aa9a09dbec92f0f992935,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:01.522640 containerd[1477]: time="2025-08-13T00:18:01.522556850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-8-f2ca23fedd,Uid:6fd1689e83d17465182663f85780d840,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:01.530192 containerd[1477]: time="2025-08-13T00:18:01.529728377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-8-f2ca23fedd,Uid:5f6f0dc7df8c3512f9c19c173063c3f3,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:01.652653 kubelet[2254]: E0813 00:18:01.652558 2254 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.89.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-8-f2ca23fedd?timeout=10s\": dial tcp 91.99.89.242:6443: connect: connection refused" interval="800ms" Aug 13 00:18:01.839605 kubelet[2254]: I0813 00:18:01.839551 2254 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.840201 kubelet[2254]: E0813 00:18:01.840140 2254 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.89.242:6443/api/v1/nodes\": dial tcp 91.99.89.242:6443: connect: connection refused" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:01.990464 kubelet[2254]: E0813 00:18:01.990376 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.89.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": 
dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 13 00:18:02.075941 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1539155489.mount: Deactivated successfully. Aug 13 00:18:02.085854 containerd[1477]: time="2025-08-13T00:18:02.084727196Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:18:02.088430 containerd[1477]: time="2025-08-13T00:18:02.088387381Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Aug 13 00:18:02.090395 containerd[1477]: time="2025-08-13T00:18:02.090236673Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:18:02.094647 containerd[1477]: time="2025-08-13T00:18:02.094600128Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Aug 13 00:18:02.096766 containerd[1477]: time="2025-08-13T00:18:02.096710776Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Aug 13 00:18:02.098647 containerd[1477]: time="2025-08-13T00:18:02.098604827Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:18:02.102690 kubelet[2254]: E0813 00:18:02.102643 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://91.99.89.242:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 13 00:18:02.104569 containerd[1477]: time="2025-08-13T00:18:02.103944627Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:18:02.105493 containerd[1477]: time="2025-08-13T00:18:02.104915972Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 575.066277ms" Aug 13 00:18:02.107544 containerd[1477]: time="2025-08-13T00:18:02.107418775Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 598.48891ms" Aug 13 00:18:02.108371 containerd[1477]: time="2025-08-13T00:18:02.108077125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:18:02.111019 containerd[1477]: 
time="2025-08-13T00:18:02.110941042Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 588.254874ms" Aug 13 00:18:02.144695 kubelet[2254]: E0813 00:18:02.144652 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.89.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 13 00:18:02.243492 containerd[1477]: time="2025-08-13T00:18:02.243035494Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:02.243492 containerd[1477]: time="2025-08-13T00:18:02.243100933Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:02.243492 containerd[1477]: time="2025-08-13T00:18:02.243133413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.243492 containerd[1477]: time="2025-08-13T00:18:02.243216451Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.248732 containerd[1477]: time="2025-08-13T00:18:02.248622690Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:02.248889 containerd[1477]: time="2025-08-13T00:18:02.248703689Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:02.248889 containerd[1477]: time="2025-08-13T00:18:02.248716889Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.248889 containerd[1477]: time="2025-08-13T00:18:02.248811927Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.252898 containerd[1477]: time="2025-08-13T00:18:02.252661949Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:02.252898 containerd[1477]: time="2025-08-13T00:18:02.252718868Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:02.252898 containerd[1477]: time="2025-08-13T00:18:02.252734948Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.252898 containerd[1477]: time="2025-08-13T00:18:02.252813587Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:02.269649 systemd[1]: Started cri-containerd-f90e70a7ef6e963d55c5992c2be127d81936ac0d7d7d47db54683e0155c5e08d.scope - libcontainer container f90e70a7ef6e963d55c5992c2be127d81936ac0d7d7d47db54683e0155c5e08d. 
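
Two details in this stretch are worth pulling apart. First, the sandboxes are backed by pause:3.8 even though the kubelet pre-pulled pause:3.10 earlier, because containerd's CRI configuration (sandbox_image), not the kubelet, decides which pause image RunPodSandbox uses. Second, each burst of io.containerd.runc.v2 "loading plugin" lines is a shim starting for one sandbox, which then surfaces as a cri-containerd-<id>.scope unit. A sketch:

    grep sandbox_image /etc/containerd/config.toml || echo "built-in default in use"
    systemctl list-units 'cri-containerd-*' --no-pager
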
Aug 13 00:18:02.280946 systemd[1]: Started cri-containerd-676618f50faf631d1a14969a4e418d918f0bb4b6e2e34bad5361156f804e5e37.scope - libcontainer container 676618f50faf631d1a14969a4e418d918f0bb4b6e2e34bad5361156f804e5e37. Aug 13 00:18:02.285979 systemd[1]: Started cri-containerd-5ff2786a68d6feaaa8f4649916d54563b761ffca212e27e8f99ea1f281a75d36.scope - libcontainer container 5ff2786a68d6feaaa8f4649916d54563b761ffca212e27e8f99ea1f281a75d36. Aug 13 00:18:02.344233 containerd[1477]: time="2025-08-13T00:18:02.343840257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-8-f2ca23fedd,Uid:d9f966240d5aa9a09dbec92f0f992935,Namespace:kube-system,Attempt:0,} returns sandbox id \"f90e70a7ef6e963d55c5992c2be127d81936ac0d7d7d47db54683e0155c5e08d\"" Aug 13 00:18:02.354473 containerd[1477]: time="2025-08-13T00:18:02.354181822Z" level=info msg="CreateContainer within sandbox \"f90e70a7ef6e963d55c5992c2be127d81936ac0d7d7d47db54683e0155c5e08d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 13 00:18:02.358518 containerd[1477]: time="2025-08-13T00:18:02.357963365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-8-f2ca23fedd,Uid:6fd1689e83d17465182663f85780d840,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ff2786a68d6feaaa8f4649916d54563b761ffca212e27e8f99ea1f281a75d36\"" Aug 13 00:18:02.360214 containerd[1477]: time="2025-08-13T00:18:02.360034494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-8-f2ca23fedd,Uid:5f6f0dc7df8c3512f9c19c173063c3f3,Namespace:kube-system,Attempt:0,} returns sandbox id \"676618f50faf631d1a14969a4e418d918f0bb4b6e2e34bad5361156f804e5e37\"" Aug 13 00:18:02.366451 containerd[1477]: time="2025-08-13T00:18:02.366383918Z" level=info msg="CreateContainer within sandbox \"676618f50faf631d1a14969a4e418d918f0bb4b6e2e34bad5361156f804e5e37\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 13 00:18:02.368943 containerd[1477]: time="2025-08-13T00:18:02.368850401Z" level=info msg="CreateContainer within sandbox \"5ff2786a68d6feaaa8f4649916d54563b761ffca212e27e8f99ea1f281a75d36\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 13 00:18:02.381538 containerd[1477]: time="2025-08-13T00:18:02.381459491Z" level=info msg="CreateContainer within sandbox \"f90e70a7ef6e963d55c5992c2be127d81936ac0d7d7d47db54683e0155c5e08d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"25c9181fc8f4932e3482991715235a499dd2b92f2fb592ee53673ca95ff445ca\"" Aug 13 00:18:02.382389 containerd[1477]: time="2025-08-13T00:18:02.382336518Z" level=info msg="StartContainer for \"25c9181fc8f4932e3482991715235a499dd2b92f2fb592ee53673ca95ff445ca\"" Aug 13 00:18:02.386429 containerd[1477]: time="2025-08-13T00:18:02.386337018Z" level=info msg="CreateContainer within sandbox \"676618f50faf631d1a14969a4e418d918f0bb4b6e2e34bad5361156f804e5e37\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"672e963dec13002ef2f9847d4e950a2e9462a8f5bbf854f0f4d8eb06fb252446\"" Aug 13 00:18:02.388090 containerd[1477]: time="2025-08-13T00:18:02.387961153Z" level=info msg="StartContainer for \"672e963dec13002ef2f9847d4e950a2e9462a8f5bbf854f0f4d8eb06fb252446\"" Aug 13 00:18:02.394612 containerd[1477]: time="2025-08-13T00:18:02.394487295Z" level=info msg="CreateContainer within sandbox \"5ff2786a68d6feaaa8f4649916d54563b761ffca212e27e8f99ea1f281a75d36\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1fe67f40660334a63b41fae6a79653a0e297dabb8a8fb69763ff7d1e7ca3b9f8\"" Aug 13 00:18:02.395488 containerd[1477]: time="2025-08-13T00:18:02.395034847Z" level=info msg="StartContainer for \"1fe67f40660334a63b41fae6a79653a0e297dabb8a8fb69763ff7d1e7ca3b9f8\"" Aug 13 00:18:02.426345 systemd[1]: Started cri-containerd-25c9181fc8f4932e3482991715235a499dd2b92f2fb592ee53673ca95ff445ca.scope - libcontainer container 25c9181fc8f4932e3482991715235a499dd2b92f2fb592ee53673ca95ff445ca. Aug 13 00:18:02.434645 systemd[1]: Started cri-containerd-672e963dec13002ef2f9847d4e950a2e9462a8f5bbf854f0f4d8eb06fb252446.scope - libcontainer container 672e963dec13002ef2f9847d4e950a2e9462a8f5bbf854f0f4d8eb06fb252446. Aug 13 00:18:02.439700 systemd[1]: Started cri-containerd-1fe67f40660334a63b41fae6a79653a0e297dabb8a8fb69763ff7d1e7ca3b9f8.scope - libcontainer container 1fe67f40660334a63b41fae6a79653a0e297dabb8a8fb69763ff7d1e7ca3b9f8. Aug 13 00:18:02.455536 kubelet[2254]: E0813 00:18:02.454683 2254 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.89.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-8-f2ca23fedd?timeout=10s\": dial tcp 91.99.89.242:6443: connect: connection refused" interval="1.6s" Aug 13 00:18:02.483174 containerd[1477]: time="2025-08-13T00:18:02.482219975Z" level=info msg="StartContainer for \"25c9181fc8f4932e3482991715235a499dd2b92f2fb592ee53673ca95ff445ca\" returns successfully" Aug 13 00:18:02.503916 kubelet[2254]: E0813 00:18:02.503847 2254 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.89.242:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-8-f2ca23fedd&limit=500&resourceVersion=0\": dial tcp 91.99.89.242:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 13 00:18:02.511797 containerd[1477]: time="2025-08-13T00:18:02.511742131Z" level=info msg="StartContainer for \"1fe67f40660334a63b41fae6a79653a0e297dabb8a8fb69763ff7d1e7ca3b9f8\" returns successfully" Aug 13 00:18:02.511797 containerd[1477]: time="2025-08-13T00:18:02.511751931Z" level=info msg="StartContainer for \"672e963dec13002ef2f9847d4e950a2e9462a8f5bbf854f0f4d8eb06fb252446\" returns successfully" Aug 13 00:18:02.646972 kubelet[2254]: I0813 00:18:02.645923 2254 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:03.095931 kubelet[2254]: E0813 00:18:03.095192 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:03.095931 kubelet[2254]: E0813 00:18:03.095694 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:03.101529 kubelet[2254]: E0813 00:18:03.101499 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:04.103550 kubelet[2254]: E0813 00:18:04.103337 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:04.103550 kubelet[2254]: E0813 
00:18:04.103378 2254 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-8-f2ca23fedd\" not found" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.671794 kubelet[2254]: I0813 00:18:06.671746 2254 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.671794 kubelet[2254]: E0813 00:18:06.671794 2254 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4081-3-5-8-f2ca23fedd\": node \"ci-4081-3-5-8-f2ca23fedd\" not found" Aug 13 00:18:06.750478 kubelet[2254]: I0813 00:18:06.750123 2254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.758720 kubelet[2254]: E0813 00:18:06.758670 2254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.758720 kubelet[2254]: I0813 00:18:06.758708 2254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.763473 kubelet[2254]: E0813 00:18:06.761582 2254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.763473 kubelet[2254]: I0813 00:18:06.761625 2254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:06.764454 kubelet[2254]: E0813 00:18:06.764377 2254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-5-8-f2ca23fedd\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:07.029599 kubelet[2254]: I0813 00:18:07.029543 2254 apiserver.go:52] "Watching apiserver" Aug 13 00:18:07.050966 kubelet[2254]: I0813 00:18:07.050876 2254 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 13 00:18:07.051174 kubelet[2254]: I0813 00:18:07.051121 2254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:07.054472 kubelet[2254]: E0813 00:18:07.054372 2254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-5-8-f2ca23fedd\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:09.263425 systemd[1]: Reloading requested from client PID 2538 ('systemctl') (unit session-7.scope)... Aug 13 00:18:09.263465 systemd[1]: Reloading... Aug 13 00:18:09.339672 kubelet[2254]: I0813 00:18:09.337976 2254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:09.382479 zram_generator::config[2578]: No configuration found. Aug 13 00:18:09.498617 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:18:09.587171 systemd[1]: Reloading finished in 323 ms. 
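
Registration finally succeeds once the apiserver answers, and the remaining mirror-pod failures are pure ordering: system-node-critical is one of the two built-in PriorityClasses the apiserver creates shortly after startup, so these errors clear on retry. Checkable afterwards with:

    kubectl get priorityclass system-node-critical system-cluster-critical
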
Aug 13 00:18:09.637960 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:18:09.653571 systemd[1]: kubelet.service: Deactivated successfully. Aug 13 00:18:09.655577 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:18:09.655709 systemd[1]: kubelet.service: Consumed 1.244s CPU time, 129.8M memory peak, 0B memory swap peak. Aug 13 00:18:09.662999 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:18:09.815690 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:18:09.827924 (kubelet)[2623]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 00:18:09.893801 kubelet[2623]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:18:09.893801 kubelet[2623]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 00:18:09.893801 kubelet[2623]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:18:09.893801 kubelet[2623]: I0813 00:18:09.892460 2623 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 00:18:09.903967 kubelet[2623]: I0813 00:18:09.903613 2623 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 13 00:18:09.903967 kubelet[2623]: I0813 00:18:09.903644 2623 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 00:18:09.903967 kubelet[2623]: I0813 00:18:09.903892 2623 server.go:956] "Client rotation is on, will bootstrap in background" Aug 13 00:18:09.905973 kubelet[2623]: I0813 00:18:09.905435 2623 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Aug 13 00:18:09.909217 kubelet[2623]: I0813 00:18:09.908975 2623 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 00:18:09.915554 kubelet[2623]: E0813 00:18:09.915515 2623 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Aug 13 00:18:09.915778 kubelet[2623]: I0813 00:18:09.915747 2623 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Aug 13 00:18:09.918971 kubelet[2623]: I0813 00:18:09.918877 2623 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 00:18:09.919347 kubelet[2623]: I0813 00:18:09.919318 2623 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 00:18:09.919719 kubelet[2623]: I0813 00:18:09.919419 2623 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-8-f2ca23fedd","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 13 00:18:09.919719 kubelet[2623]: I0813 00:18:09.919685 2623 topology_manager.go:138] "Creating topology manager with none policy" Aug 13 00:18:09.919719 kubelet[2623]: I0813 00:18:09.919696 2623 container_manager_linux.go:303] "Creating device plugin manager" Aug 13 00:18:09.920066 kubelet[2623]: I0813 00:18:09.919926 2623 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:18:09.920294 kubelet[2623]: I0813 00:18:09.920279 2623 kubelet.go:480] "Attempting to sync node with API server" Aug 13 00:18:09.920368 kubelet[2623]: I0813 00:18:09.920359 2623 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 00:18:09.920457 kubelet[2623]: I0813 00:18:09.920431 2623 kubelet.go:386] "Adding apiserver pod source" Aug 13 00:18:09.920527 kubelet[2623]: I0813 00:18:09.920517 2623 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 00:18:09.930487 kubelet[2623]: I0813 00:18:09.928227 2623 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Aug 13 00:18:09.930487 kubelet[2623]: I0813 00:18:09.928890 2623 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 13 00:18:09.934463 kubelet[2623]: I0813 00:18:09.932594 2623 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 13 00:18:09.934463 kubelet[2623]: I0813 00:18:09.932653 2623 server.go:1289] "Started kubelet" Aug 13 00:18:09.943678 kubelet[2623]: I0813 00:18:09.943617 2623 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 00:18:09.953116 kubelet[2623]: I0813 
00:18:09.953060 2623 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 13 00:18:09.957502 kubelet[2623]: I0813 00:18:09.957473 2623 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 13 00:18:09.957924 kubelet[2623]: I0813 00:18:09.957880 2623 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 00:18:09.958946 kubelet[2623]: I0813 00:18:09.958838 2623 server.go:317] "Adding debug handlers to kubelet server" Aug 13 00:18:09.964278 kubelet[2623]: I0813 00:18:09.964232 2623 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 13 00:18:09.964402 kubelet[2623]: I0813 00:18:09.964381 2623 reconciler.go:26] "Reconciler: start to sync state" Aug 13 00:18:09.967100 kubelet[2623]: I0813 00:18:09.966970 2623 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 00:18:09.967266 kubelet[2623]: I0813 00:18:09.967242 2623 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 00:18:09.968631 kubelet[2623]: I0813 00:18:09.968579 2623 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 00:18:09.976010 kubelet[2623]: I0813 00:18:09.974621 2623 factory.go:223] Registration of the systemd container factory successfully Aug 13 00:18:09.976010 kubelet[2623]: I0813 00:18:09.974723 2623 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 13 00:18:09.980340 kubelet[2623]: I0813 00:18:09.980310 2623 factory.go:223] Registration of the containerd container factory successfully Aug 13 00:18:09.983981 kubelet[2623]: I0813 00:18:09.983881 2623 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 13 00:18:09.984200 kubelet[2623]: I0813 00:18:09.984043 2623 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 13 00:18:09.984200 kubelet[2623]: I0813 00:18:09.984083 2623 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
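The NodeConfig dump a few entries back spells out this kubelet's hard-eviction thresholds as a mix of absolute quantities and fractions of capacity (memory.available < 100Mi, nodefs.available < 10%, imagefs.available < 15%, both inodesFree signals < 5%). An illustrative sketch of how such mixed thresholds are evaluated; the node capacities below are invented sample numbers, not values from this machine:

```go
// Illustrative sketch: evaluating mixed absolute/percentage eviction
// thresholds like the ones in the NodeConfig dump above.
package main

import "fmt"

type threshold struct {
	signal   string
	percent  float64 // fraction of capacity; 0 = unused
	quantity int64   // absolute bytes; 0 = unused
}

// breached reports whether the available amount falls under the threshold.
func breached(t threshold, available, capacity int64) bool {
	if t.quantity > 0 {
		return available < t.quantity
	}
	return float64(available) < t.percent*float64(capacity)
}

func main() {
	ts := []threshold{
		{signal: "memory.available", quantity: 100 << 20}, // 100Mi
		{signal: "nodefs.available", percent: 0.10},       // 10%
		{signal: "imagefs.available", percent: 0.15},      // 15%
	}
	// hypothetical node: 4GiB RAM with 90MiB free, 40GiB disk with 10GiB free
	fmt.Println(ts[0].signal, breached(ts[0], 90<<20, 4<<30))  // true -> evict
	fmt.Println(ts[1].signal, breached(ts[1], 10<<30, 40<<30)) // false
	fmt.Println(ts[2].signal, breached(ts[2], 10<<30, 40<<30)) // false
}
```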
Aug 13 00:18:09.984200 kubelet[2623]: I0813 00:18:09.984092 2623 kubelet.go:2436] "Starting kubelet main sync loop" Aug 13 00:18:09.984200 kubelet[2623]: E0813 00:18:09.984143 2623 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 00:18:10.048015 kubelet[2623]: I0813 00:18:10.047985 2623 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 13 00:18:10.048241 kubelet[2623]: I0813 00:18:10.048220 2623 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 13 00:18:10.048352 kubelet[2623]: I0813 00:18:10.048339 2623 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:18:10.048666 kubelet[2623]: I0813 00:18:10.048641 2623 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 13 00:18:10.048812 kubelet[2623]: I0813 00:18:10.048777 2623 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 13 00:18:10.049190 kubelet[2623]: I0813 00:18:10.048883 2623 policy_none.go:49] "None policy: Start" Aug 13 00:18:10.049190 kubelet[2623]: I0813 00:18:10.048904 2623 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 13 00:18:10.049190 kubelet[2623]: I0813 00:18:10.048920 2623 state_mem.go:35] "Initializing new in-memory state store" Aug 13 00:18:10.049190 kubelet[2623]: I0813 00:18:10.049067 2623 state_mem.go:75] "Updated machine memory state" Aug 13 00:18:10.057364 kubelet[2623]: E0813 00:18:10.057319 2623 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 13 00:18:10.058564 kubelet[2623]: I0813 00:18:10.058127 2623 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 00:18:10.058564 kubelet[2623]: I0813 00:18:10.058157 2623 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 00:18:10.058564 kubelet[2623]: I0813 00:18:10.058463 2623 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 00:18:10.063404 kubelet[2623]: E0813 00:18:10.063337 2623 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 13 00:18:10.085652 kubelet[2623]: I0813 00:18:10.085182 2623 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.085801 kubelet[2623]: I0813 00:18:10.085738 2623 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.086130 kubelet[2623]: I0813 00:18:10.086042 2623 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.097345 kubelet[2623]: E0813 00:18:10.097293 2623 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165284 kubelet[2623]: I0813 00:18:10.164612 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165284 kubelet[2623]: I0813 00:18:10.164663 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165284 kubelet[2623]: I0813 00:18:10.164695 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d9f966240d5aa9a09dbec92f0f992935-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" (UID: \"d9f966240d5aa9a09dbec92f0f992935\") " pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165284 kubelet[2623]: I0813 00:18:10.164720 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165284 kubelet[2623]: I0813 00:18:10.164745 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165555 kubelet[2623]: I0813 00:18:10.164782 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f6f0dc7df8c3512f9c19c173063c3f3-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-8-f2ca23fedd\" (UID: \"5f6f0dc7df8c3512f9c19c173063c3f3\") " pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165555 kubelet[2623]: I0813 00:18:10.164804 2623 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165555 kubelet[2623]: I0813 00:18:10.164828 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.165555 kubelet[2623]: I0813 00:18:10.164865 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6fd1689e83d17465182663f85780d840-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-8-f2ca23fedd\" (UID: \"6fd1689e83d17465182663f85780d840\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.170049 kubelet[2623]: I0813 00:18:10.169762 2623 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.180187 kubelet[2623]: I0813 00:18:10.180124 2623 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.180349 kubelet[2623]: I0813 00:18:10.180265 2623 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:10.922261 kubelet[2623]: I0813 00:18:10.921771 2623 apiserver.go:52] "Watching apiserver" Aug 13 00:18:10.964586 kubelet[2623]: I0813 00:18:10.964527 2623 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 13 00:18:11.021970 kubelet[2623]: I0813 00:18:11.021877 2623 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:11.035150 kubelet[2623]: E0813 00:18:11.034538 2623 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-8-f2ca23fedd\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:11.069340 kubelet[2623]: I0813 00:18:11.069271 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-5-8-f2ca23fedd" podStartSLOduration=2.069254225 podStartE2EDuration="2.069254225s" podCreationTimestamp="2025-08-13 00:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:18:11.068241795 +0000 UTC m=+1.234616602" watchObservedRunningTime="2025-08-13 00:18:11.069254225 +0000 UTC m=+1.235629032" Aug 13 00:18:11.109177 kubelet[2623]: I0813 00:18:11.108747 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-5-8-f2ca23fedd" podStartSLOduration=1.108725705 podStartE2EDuration="1.108725705s" podCreationTimestamp="2025-08-13 00:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:18:11.09148704 +0000 UTC m=+1.257861847" watchObservedRunningTime="2025-08-13 00:18:11.108725705 +0000 UTC m=+1.275100512" Aug 13 00:18:11.130647 kubelet[2623]: 
I0813 00:18:11.130268 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-5-8-f2ca23fedd" podStartSLOduration=1.130249487 podStartE2EDuration="1.130249487s" podCreationTimestamp="2025-08-13 00:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:18:11.109765574 +0000 UTC m=+1.276140381" watchObservedRunningTime="2025-08-13 00:18:11.130249487 +0000 UTC m=+1.296624294" Aug 13 00:18:15.227639 kubelet[2623]: I0813 00:18:15.227547 2623 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 13 00:18:15.228836 containerd[1477]: time="2025-08-13T00:18:15.228765329Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 13 00:18:15.229776 kubelet[2623]: I0813 00:18:15.229153 2623 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 13 00:18:15.918211 systemd[1]: Created slice kubepods-besteffort-pod73116fb7_b5bc_464f_955b_a9644a2b9f71.slice - libcontainer container kubepods-besteffort-pod73116fb7_b5bc_464f_955b_a9644a2b9f71.slice. Aug 13 00:18:16.004425 kubelet[2623]: I0813 00:18:16.004359 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/73116fb7-b5bc-464f-955b-a9644a2b9f71-kube-proxy\") pod \"kube-proxy-q4nz7\" (UID: \"73116fb7-b5bc-464f-955b-a9644a2b9f71\") " pod="kube-system/kube-proxy-q4nz7" Aug 13 00:18:16.005571 kubelet[2623]: I0813 00:18:16.005527 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqgps\" (UniqueName: \"kubernetes.io/projected/73116fb7-b5bc-464f-955b-a9644a2b9f71-kube-api-access-sqgps\") pod \"kube-proxy-q4nz7\" (UID: \"73116fb7-b5bc-464f-955b-a9644a2b9f71\") " pod="kube-system/kube-proxy-q4nz7" Aug 13 00:18:16.005860 kubelet[2623]: I0813 00:18:16.005779 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/73116fb7-b5bc-464f-955b-a9644a2b9f71-xtables-lock\") pod \"kube-proxy-q4nz7\" (UID: \"73116fb7-b5bc-464f-955b-a9644a2b9f71\") " pod="kube-system/kube-proxy-q4nz7" Aug 13 00:18:16.005860 kubelet[2623]: I0813 00:18:16.005805 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73116fb7-b5bc-464f-955b-a9644a2b9f71-lib-modules\") pod \"kube-proxy-q4nz7\" (UID: \"73116fb7-b5bc-464f-955b-a9644a2b9f71\") " pod="kube-system/kube-proxy-q4nz7" Aug 13 00:18:16.228266 containerd[1477]: time="2025-08-13T00:18:16.227682683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q4nz7,Uid:73116fb7-b5bc-464f-955b-a9644a2b9f71,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:16.256904 containerd[1477]: time="2025-08-13T00:18:16.256756439Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:16.256904 containerd[1477]: time="2025-08-13T00:18:16.256847158Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:16.256904 containerd[1477]: time="2025-08-13T00:18:16.256876558Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:16.257722 containerd[1477]: time="2025-08-13T00:18:16.257616271Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:16.285677 systemd[1]: Started cri-containerd-75e15edbd672c296b908857348ecbece939b6b2e7da9fc8d8878db0a6f20a4b9.scope - libcontainer container 75e15edbd672c296b908857348ecbece939b6b2e7da9fc8d8878db0a6f20a4b9. Aug 13 00:18:16.319761 containerd[1477]: time="2025-08-13T00:18:16.319709748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q4nz7,Uid:73116fb7-b5bc-464f-955b-a9644a2b9f71,Namespace:kube-system,Attempt:0,} returns sandbox id \"75e15edbd672c296b908857348ecbece939b6b2e7da9fc8d8878db0a6f20a4b9\"" Aug 13 00:18:16.331841 containerd[1477]: time="2025-08-13T00:18:16.331774967Z" level=info msg="CreateContainer within sandbox \"75e15edbd672c296b908857348ecbece939b6b2e7da9fc8d8878db0a6f20a4b9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 13 00:18:16.359774 containerd[1477]: time="2025-08-13T00:18:16.359707692Z" level=info msg="CreateContainer within sandbox \"75e15edbd672c296b908857348ecbece939b6b2e7da9fc8d8878db0a6f20a4b9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"2d8f56cea6bf1bed869d732fbb218f6a3a3277c9eb4a6402b8aeb6354460d297\"" Aug 13 00:18:16.362869 containerd[1477]: time="2025-08-13T00:18:16.362787746Z" level=info msg="StartContainer for \"2d8f56cea6bf1bed869d732fbb218f6a3a3277c9eb4a6402b8aeb6354460d297\"" Aug 13 00:18:16.421674 systemd[1]: Started cri-containerd-2d8f56cea6bf1bed869d732fbb218f6a3a3277c9eb4a6402b8aeb6354460d297.scope - libcontainer container 2d8f56cea6bf1bed869d732fbb218f6a3a3277c9eb4a6402b8aeb6354460d297. Aug 13 00:18:16.427295 systemd[1]: Created slice kubepods-besteffort-pod6ecd7d90_789c_42ed_aceb_76da48df2094.slice - libcontainer container kubepods-besteffort-pod6ecd7d90_789c_42ed_aceb_76da48df2094.slice. 
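The slice units systemd keeps creating (kubepods-besteffort-pod73116fb7... for kube-proxy, kubepods-besteffort-pod6ecd7d90... just above for tigera-operator) follow directly from the pod UID: with the systemd cgroup driver named in the NodeConfig dump, dashes in the UID become underscores and the unit is nested under the pod's QoS class. A small sketch reproducing the name seen in the log:

```go
// Sketch of the slice-naming rule visible in the systemd messages above:
// kubepods-<qos>-pod<uid-with-underscores>.slice under the systemd driver.
package main

import (
	"fmt"
	"strings"
)

func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	// UID taken from the tigera-operator pod created above
	fmt.Println(podSlice("besteffort", "6ecd7d90-789c-42ed-aceb-76da48df2094"))
	// kubepods-besteffort-pod6ecd7d90_789c_42ed_aceb_76da48df2094.slice
}
```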
Aug 13 00:18:16.473908 containerd[1477]: time="2025-08-13T00:18:16.473812931Z" level=info msg="StartContainer for \"2d8f56cea6bf1bed869d732fbb218f6a3a3277c9eb4a6402b8aeb6354460d297\" returns successfully" Aug 13 00:18:16.509241 kubelet[2623]: I0813 00:18:16.508917 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr7h\" (UniqueName: \"kubernetes.io/projected/6ecd7d90-789c-42ed-aceb-76da48df2094-kube-api-access-wgr7h\") pod \"tigera-operator-747864d56d-97gbj\" (UID: \"6ecd7d90-789c-42ed-aceb-76da48df2094\") " pod="tigera-operator/tigera-operator-747864d56d-97gbj" Aug 13 00:18:16.509241 kubelet[2623]: I0813 00:18:16.509058 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6ecd7d90-789c-42ed-aceb-76da48df2094-var-lib-calico\") pod \"tigera-operator-747864d56d-97gbj\" (UID: \"6ecd7d90-789c-42ed-aceb-76da48df2094\") " pod="tigera-operator/tigera-operator-747864d56d-97gbj" Aug 13 00:18:16.736704 containerd[1477]: time="2025-08-13T00:18:16.736634757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-97gbj,Uid:6ecd7d90-789c-42ed-aceb-76da48df2094,Namespace:tigera-operator,Attempt:0,}" Aug 13 00:18:16.760791 containerd[1477]: time="2025-08-13T00:18:16.760423117Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:16.761303 containerd[1477]: time="2025-08-13T00:18:16.760597955Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:16.761303 containerd[1477]: time="2025-08-13T00:18:16.761144271Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:16.761458 containerd[1477]: time="2025-08-13T00:18:16.761287510Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:16.789800 systemd[1]: Started cri-containerd-e80a39c508be21c68f48ce1a775da938f0556687d8868adb329e247ef4a56a1e.scope - libcontainer container e80a39c508be21c68f48ce1a775da938f0556687d8868adb329e247ef4a56a1e. Aug 13 00:18:16.837122 containerd[1477]: time="2025-08-13T00:18:16.836972952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-97gbj,Uid:6ecd7d90-789c-42ed-aceb-76da48df2094,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e80a39c508be21c68f48ce1a775da938f0556687d8868adb329e247ef4a56a1e\"" Aug 13 00:18:16.841583 containerd[1477]: time="2025-08-13T00:18:16.840914759Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 13 00:18:18.022476 kubelet[2623]: I0813 00:18:18.022291 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-q4nz7" podStartSLOduration=3.022265983 podStartE2EDuration="3.022265983s" podCreationTimestamp="2025-08-13 00:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:18:17.051368361 +0000 UTC m=+7.217743168" watchObservedRunningTime="2025-08-13 00:18:18.022265983 +0000 UTC m=+8.188640790" Aug 13 00:18:18.581746 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1057878730.mount: Deactivated successfully. 
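The "Observed pod startup duration" entry for kube-proxy-q4nz7 above is straight subtraction: with the pull timestamps at their zero values (no image pull was needed), podStartSLOduration is observedRunningTime minus podCreationTimestamp. Reproducing the logged figure, with timestamps copied from the entry and error handling elided for brevity:

```go
// Reproducing podStartSLOduration=3.022265983s for kube-proxy-q4nz7:
// observedRunningTime - podCreationTimestamp when no pulls occurred.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-08-13 00:18:15 +0000 UTC")
	running, _ := time.Parse(layout, "2025-08-13 00:18:18.022265983 +0000 UTC")
	fmt.Println(running.Sub(created)) // 3.022265983s
}
```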
Aug 13 00:18:19.053693 containerd[1477]: time="2025-08-13T00:18:19.053610194Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:19.055678 containerd[1477]: time="2025-08-13T00:18:19.055245581Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Aug 13 00:18:19.057002 containerd[1477]: time="2025-08-13T00:18:19.056934689Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:19.061203 containerd[1477]: time="2025-08-13T00:18:19.061076417Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:19.067135 containerd[1477]: time="2025-08-13T00:18:19.066938972Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.2251673s" Aug 13 00:18:19.067135 containerd[1477]: time="2025-08-13T00:18:19.066986372Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Aug 13 00:18:19.072201 containerd[1477]: time="2025-08-13T00:18:19.072150693Z" level=info msg="CreateContainer within sandbox \"e80a39c508be21c68f48ce1a775da938f0556687d8868adb329e247ef4a56a1e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 13 00:18:19.090469 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3532123186.mount: Deactivated successfully. Aug 13 00:18:19.095747 containerd[1477]: time="2025-08-13T00:18:19.095592234Z" level=info msg="CreateContainer within sandbox \"e80a39c508be21c68f48ce1a775da938f0556687d8868adb329e247ef4a56a1e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ec30613553f4c9a1e26d65677b548db996340281225036d1db73ac8a17cf826b\"" Aug 13 00:18:19.096618 containerd[1477]: time="2025-08-13T00:18:19.096401308Z" level=info msg="StartContainer for \"ec30613553f4c9a1e26d65677b548db996340281225036d1db73ac8a17cf826b\"" Aug 13 00:18:19.140775 systemd[1]: Started cri-containerd-ec30613553f4c9a1e26d65677b548db996340281225036d1db73ac8a17cf826b.scope - libcontainer container ec30613553f4c9a1e26d65677b548db996340281225036d1db73ac8a17cf826b. 
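The pull statistics above (22,150,610 bytes read, completed "in 2.2251673s") imply roughly 10 MB/s from quay.io for the tigera-operator image. Pure arithmetic on the logged numbers:

```go
// Back-of-the-envelope throughput implied by the pull messages above.
package main

import "fmt"

func main() {
	const bytesRead = 22150610 // "bytes read" from the stop-pulling entry
	const seconds = 2.2251673  // duration from the "Pulled image" entry
	fmt.Printf("~%.1f MB/s\n", bytesRead/seconds/1e6) // prints ~10.0 MB/s
}
```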
Aug 13 00:18:19.181403 containerd[1477]: time="2025-08-13T00:18:19.181295260Z" level=info msg="StartContainer for \"ec30613553f4c9a1e26d65677b548db996340281225036d1db73ac8a17cf826b\" returns successfully" Aug 13 00:18:20.101666 kubelet[2623]: I0813 00:18:20.101074 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-97gbj" podStartSLOduration=1.872893149 podStartE2EDuration="4.101039506s" podCreationTimestamp="2025-08-13 00:18:16 +0000 UTC" firstStartedPulling="2025-08-13 00:18:16.839917687 +0000 UTC m=+7.006292494" lastFinishedPulling="2025-08-13 00:18:19.068064044 +0000 UTC m=+9.234438851" observedRunningTime="2025-08-13 00:18:20.073646509 +0000 UTC m=+10.240021476" watchObservedRunningTime="2025-08-13 00:18:20.101039506 +0000 UTC m=+10.267414313" Aug 13 00:18:25.696957 sudo[1771]: pam_unix(sudo:session): session closed for user root Aug 13 00:18:25.861300 sshd[1768]: pam_unix(sshd:session): session closed for user core Aug 13 00:18:25.866873 systemd-logind[1461]: Session 7 logged out. Waiting for processes to exit. Aug 13 00:18:25.867952 systemd[1]: sshd@6-91.99.89.242:22-139.178.89.65:35078.service: Deactivated successfully. Aug 13 00:18:25.873545 systemd[1]: session-7.scope: Deactivated successfully. Aug 13 00:18:25.874627 systemd[1]: session-7.scope: Consumed 7.354s CPU time, 155.1M memory peak, 0B memory swap peak. Aug 13 00:18:25.879009 systemd-logind[1461]: Removed session 7. Aug 13 00:18:35.432377 systemd[1]: Created slice kubepods-besteffort-pod6b771d2d_eec8_4195_80da_8537f4fd2df4.slice - libcontainer container kubepods-besteffort-pod6b771d2d_eec8_4195_80da_8537f4fd2df4.slice. Aug 13 00:18:35.438093 kubelet[2623]: I0813 00:18:35.437138 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b771d2d-eec8-4195-80da-8537f4fd2df4-tigera-ca-bundle\") pod \"calico-typha-5cf796b88d-8dmbb\" (UID: \"6b771d2d-eec8-4195-80da-8537f4fd2df4\") " pod="calico-system/calico-typha-5cf796b88d-8dmbb" Aug 13 00:18:35.438093 kubelet[2623]: I0813 00:18:35.437174 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6b771d2d-eec8-4195-80da-8537f4fd2df4-typha-certs\") pod \"calico-typha-5cf796b88d-8dmbb\" (UID: \"6b771d2d-eec8-4195-80da-8537f4fd2df4\") " pod="calico-system/calico-typha-5cf796b88d-8dmbb" Aug 13 00:18:35.438093 kubelet[2623]: I0813 00:18:35.437194 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxd4q\" (UniqueName: \"kubernetes.io/projected/6b771d2d-eec8-4195-80da-8537f4fd2df4-kube-api-access-gxd4q\") pod \"calico-typha-5cf796b88d-8dmbb\" (UID: \"6b771d2d-eec8-4195-80da-8537f4fd2df4\") " pod="calico-system/calico-typha-5cf796b88d-8dmbb" Aug 13 00:18:35.624658 systemd[1]: Created slice kubepods-besteffort-pod6abd2108_1584_4790_8dfc_fcd236c4b476.slice - libcontainer container kubepods-besteffort-pod6abd2108_1584_4790_8dfc_fcd236c4b476.slice. 
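The calico-node setup that follows mounts /opt/libexec/kubernetes/kubelet-plugins/volume/exec as flexvol-driver-host and then floods the log with driver-call.go failures for nodeagent~uds: the kubelet's FlexVolume probe execs the driver binary with "init" and JSON-decodes its stdout, so a not-yet-installed executable produces empty output, which fails exactly like decoding an empty string. Calico normally installs that uds binary via an init container, so the errors are expected to stop once calico-node is running. A sketch of both the failure mode and the documented FlexVolume success envelope; the struct here is a simplification, not the kubelet's actual type:

```go
// Sketch of why the driver-call failures below end in
// "unexpected end of JSON input": decoding empty driver output.
package main

import (
	"encoding/json"
	"fmt"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	var ds driverStatus
	err := json.Unmarshal([]byte(""), &ds) // empty output from a missing driver
	fmt.Println(err)                       // unexpected end of JSON input

	// Minimal well-formed response a FlexVolume driver returns to "init".
	ok := []byte(`{"status":"Success","capabilities":{"attach":false}}`)
	if err := json.Unmarshal(ok, &ds); err == nil {
		fmt.Println(ds.Status, ds.Capabilities["attach"]) // Success false
	}
}
```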
Aug 13 00:18:35.638854 kubelet[2623]: I0813 00:18:35.638820 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-flexvol-driver-host\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639179 kubelet[2623]: I0813 00:18:35.639084 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-policysync\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639326 kubelet[2623]: I0813 00:18:35.639311 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-cni-bin-dir\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639425 kubelet[2623]: I0813 00:18:35.639411 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-var-run-calico\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639577 kubelet[2623]: I0813 00:18:35.639531 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-cni-log-dir\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639794 kubelet[2623]: I0813 00:18:35.639776 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-cni-net-dir\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.639991 kubelet[2623]: I0813 00:18:35.639976 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-var-lib-calico\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.640139 kubelet[2623]: I0813 00:18:35.640087 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-xtables-lock\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.640270 kubelet[2623]: I0813 00:18:35.640220 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfwv\" (UniqueName: \"kubernetes.io/projected/6abd2108-1584-4790-8dfc-fcd236c4b476-kube-api-access-wkfwv\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.641221 kubelet[2623]: I0813 00:18:35.641188 2623 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6abd2108-1584-4790-8dfc-fcd236c4b476-lib-modules\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.641619 kubelet[2623]: I0813 00:18:35.641511 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abd2108-1584-4790-8dfc-fcd236c4b476-tigera-ca-bundle\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.641619 kubelet[2623]: I0813 00:18:35.641542 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6abd2108-1584-4790-8dfc-fcd236c4b476-node-certs\") pod \"calico-node-qdf9l\" (UID: \"6abd2108-1584-4790-8dfc-fcd236c4b476\") " pod="calico-system/calico-node-qdf9l" Aug 13 00:18:35.741257 containerd[1477]: time="2025-08-13T00:18:35.740719942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cf796b88d-8dmbb,Uid:6b771d2d-eec8-4195-80da-8537f4fd2df4,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:35.750982 kubelet[2623]: E0813 00:18:35.749910 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.750982 kubelet[2623]: W0813 00:18:35.750154 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.750982 kubelet[2623]: E0813 00:18:35.750187 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.750982 kubelet[2623]: E0813 00:18:35.750833 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.750982 kubelet[2623]: W0813 00:18:35.750849 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.750982 kubelet[2623]: E0813 00:18:35.750867 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.754464 kubelet[2623]: E0813 00:18:35.753056 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.754464 kubelet[2623]: W0813 00:18:35.753256 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.754464 kubelet[2623]: E0813 00:18:35.753281 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.756237 kubelet[2623]: E0813 00:18:35.755729 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.756237 kubelet[2623]: W0813 00:18:35.755748 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.756237 kubelet[2623]: E0813 00:18:35.755775 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.767436 kubelet[2623]: E0813 00:18:35.767404 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.768114 kubelet[2623]: W0813 00:18:35.767980 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.768702 kubelet[2623]: E0813 00:18:35.768487 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.771358 kubelet[2623]: E0813 00:18:35.771277 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.771358 kubelet[2623]: W0813 00:18:35.771298 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.771358 kubelet[2623]: E0813 00:18:35.771319 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.791134 containerd[1477]: time="2025-08-13T00:18:35.790685081Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:35.791134 containerd[1477]: time="2025-08-13T00:18:35.790859840Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:35.791134 containerd[1477]: time="2025-08-13T00:18:35.790881800Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:35.792688 kubelet[2623]: E0813 00:18:35.792652 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.792688 kubelet[2623]: W0813 00:18:35.792677 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.792921 kubelet[2623]: E0813 00:18:35.792702 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.795422 containerd[1477]: time="2025-08-13T00:18:35.792942829Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:35.834691 systemd[1]: Started cri-containerd-53b22b9a90b4d42dae5bdf921efc76bb20af661b02199c00fafd3685adfe1714.scope - libcontainer container 53b22b9a90b4d42dae5bdf921efc76bb20af661b02199c00fafd3685adfe1714. Aug 13 00:18:35.870566 kubelet[2623]: E0813 00:18:35.870367 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:35.927832 containerd[1477]: time="2025-08-13T00:18:35.927690844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cf796b88d-8dmbb,Uid:6b771d2d-eec8-4195-80da-8537f4fd2df4,Namespace:calico-system,Attempt:0,} returns sandbox id \"53b22b9a90b4d42dae5bdf921efc76bb20af661b02199c00fafd3685adfe1714\"" Aug 13 00:18:35.929999 kubelet[2623]: E0813 00:18:35.929751 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.929999 kubelet[2623]: W0813 00:18:35.929865 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.929999 kubelet[2623]: E0813 00:18:35.929889 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.931155 kubelet[2623]: E0813 00:18:35.931123 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.931799 kubelet[2623]: W0813 00:18:35.931468 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.931799 kubelet[2623]: E0813 00:18:35.931558 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.932524 kubelet[2623]: E0813 00:18:35.932329 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.932821 kubelet[2623]: W0813 00:18:35.932344 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.932821 kubelet[2623]: E0813 00:18:35.932658 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.933421 kubelet[2623]: E0813 00:18:35.933403 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.933662 kubelet[2623]: W0813 00:18:35.933545 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.933662 kubelet[2623]: E0813 00:18:35.933566 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.934153 kubelet[2623]: E0813 00:18:35.934017 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.934153 kubelet[2623]: W0813 00:18:35.934044 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.934153 kubelet[2623]: E0813 00:18:35.934059 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.934990 containerd[1477]: time="2025-08-13T00:18:35.934479128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 13 00:18:35.935616 kubelet[2623]: E0813 00:18:35.935268 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.935616 kubelet[2623]: W0813 00:18:35.935302 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.935616 kubelet[2623]: E0813 00:18:35.935333 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.936669 kubelet[2623]: E0813 00:18:35.936394 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.936669 kubelet[2623]: W0813 00:18:35.936422 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.936669 kubelet[2623]: E0813 00:18:35.936474 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.938057 kubelet[2623]: E0813 00:18:35.937917 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.938057 kubelet[2623]: W0813 00:18:35.937945 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.938057 kubelet[2623]: E0813 00:18:35.937996 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.939986 containerd[1477]: time="2025-08-13T00:18:35.939382302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qdf9l,Uid:6abd2108-1584-4790-8dfc-fcd236c4b476,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:35.941128 kubelet[2623]: E0813 00:18:35.940976 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.941128 kubelet[2623]: W0813 00:18:35.941124 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.941259 kubelet[2623]: E0813 00:18:35.941143 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.943000 kubelet[2623]: E0813 00:18:35.942939 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.943000 kubelet[2623]: W0813 00:18:35.942986 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.943000 kubelet[2623]: E0813 00:18:35.943003 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.944283 kubelet[2623]: E0813 00:18:35.944265 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.944891 kubelet[2623]: W0813 00:18:35.944726 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.944891 kubelet[2623]: E0813 00:18:35.944780 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.945392 kubelet[2623]: E0813 00:18:35.945229 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.945392 kubelet[2623]: W0813 00:18:35.945244 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.945392 kubelet[2623]: E0813 00:18:35.945257 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.946300 kubelet[2623]: E0813 00:18:35.946173 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.946300 kubelet[2623]: W0813 00:18:35.946190 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.946300 kubelet[2623]: E0813 00:18:35.946203 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.947135 kubelet[2623]: E0813 00:18:35.946724 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.947135 kubelet[2623]: W0813 00:18:35.946739 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.947135 kubelet[2623]: E0813 00:18:35.946752 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.947498 kubelet[2623]: E0813 00:18:35.947326 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.947498 kubelet[2623]: W0813 00:18:35.947340 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.947498 kubelet[2623]: E0813 00:18:35.947355 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.948452 kubelet[2623]: E0813 00:18:35.948278 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.948452 kubelet[2623]: W0813 00:18:35.948295 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.948452 kubelet[2623]: E0813 00:18:35.948309 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.948917 kubelet[2623]: E0813 00:18:35.948790 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.948917 kubelet[2623]: W0813 00:18:35.948813 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.948917 kubelet[2623]: E0813 00:18:35.948828 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.949813 kubelet[2623]: E0813 00:18:35.949679 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.949813 kubelet[2623]: W0813 00:18:35.949695 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.949813 kubelet[2623]: E0813 00:18:35.949709 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.950734 kubelet[2623]: E0813 00:18:35.950609 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.950734 kubelet[2623]: W0813 00:18:35.950626 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.950734 kubelet[2623]: E0813 00:18:35.950641 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.951559 kubelet[2623]: E0813 00:18:35.951502 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.952040 kubelet[2623]: W0813 00:18:35.951789 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.952040 kubelet[2623]: E0813 00:18:35.951814 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.952664 kubelet[2623]: E0813 00:18:35.952649 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.952753 kubelet[2623]: W0813 00:18:35.952740 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.952753 kubelet[2623]: E0813 00:18:35.952815 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.952753 kubelet[2623]: I0813 00:18:35.952846 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/549a8f55-4e42-4ab3-8e46-bed16219fe3c-socket-dir\") pod \"csi-node-driver-hgndp\" (UID: \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\") " pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:35.954815 kubelet[2623]: E0813 00:18:35.954797 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.955054 kubelet[2623]: W0813 00:18:35.954884 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.955054 kubelet[2623]: E0813 00:18:35.954905 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.955054 kubelet[2623]: I0813 00:18:35.954937 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzsln\" (UniqueName: \"kubernetes.io/projected/549a8f55-4e42-4ab3-8e46-bed16219fe3c-kube-api-access-lzsln\") pod \"csi-node-driver-hgndp\" (UID: \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\") " pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:35.955358 kubelet[2623]: E0813 00:18:35.955340 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.955560 kubelet[2623]: W0813 00:18:35.955417 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.955560 kubelet[2623]: E0813 00:18:35.955435 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.955560 kubelet[2623]: I0813 00:18:35.955500 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549a8f55-4e42-4ab3-8e46-bed16219fe3c-kubelet-dir\") pod \"csi-node-driver-hgndp\" (UID: \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\") " pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:35.955862 kubelet[2623]: E0813 00:18:35.955846 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.955998 kubelet[2623]: W0813 00:18:35.955982 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.956077 kubelet[2623]: E0813 00:18:35.956064 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.956191 kubelet[2623]: I0813 00:18:35.956176 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/549a8f55-4e42-4ab3-8e46-bed16219fe3c-registration-dir\") pod \"csi-node-driver-hgndp\" (UID: \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\") " pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:35.956771 kubelet[2623]: E0813 00:18:35.956532 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.956771 kubelet[2623]: W0813 00:18:35.956548 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.956771 kubelet[2623]: E0813 00:18:35.956561 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.956771 kubelet[2623]: I0813 00:18:35.956629 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/549a8f55-4e42-4ab3-8e46-bed16219fe3c-varrun\") pod \"csi-node-driver-hgndp\" (UID: \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\") " pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:35.957963 kubelet[2623]: E0813 00:18:35.957940 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.958122 kubelet[2623]: W0813 00:18:35.958024 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.958122 kubelet[2623]: E0813 00:18:35.958042 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.958388 kubelet[2623]: E0813 00:18:35.958292 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.958388 kubelet[2623]: W0813 00:18:35.958303 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.958388 kubelet[2623]: E0813 00:18:35.958312 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.958730 kubelet[2623]: E0813 00:18:35.958644 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.958730 kubelet[2623]: W0813 00:18:35.958656 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.958730 kubelet[2623]: E0813 00:18:35.958668 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
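Error: unexpected end of JSON input"

The reconciler_common entries above show the kubelet attaching five volumes for pod calico-system/csi-node-driver-hgndp: four kubernetes.io/host-path volumes (socket-dir, kubelet-dir, registration-dir, varrun) and one projected service-account token (kube-api-access-lzsln). As a rough sketch of what that volume set looks like when declared with the Kubernetes Go API types; every host path below is a hypothetical placeholder, since the journal records only volume names, never the directories they map to:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // csiNodeDriverVolumes reconstructs the volume list implied by the
    // VerifyControllerAttachedVolume log entries. Every Path value is a
    // placeholder guess, not a value taken from this log.
    func csiNodeDriverVolumes() []corev1.Volume {
        dir := corev1.HostPathDirectory
        hostPath := func(name, path string) corev1.Volume {
            return corev1.Volume{
                Name: name,
                VolumeSource: corev1.VolumeSource{
                    HostPath: &corev1.HostPathVolumeSource{Path: path, Type: &dir},
                },
            }
        }
        expiry := int64(3607)
        return []corev1.Volume{
            hostPath("socket-dir", "/var/lib/kubelet/plugins/example-csi"),   // CSI endpoint socket (path assumed)
            hostPath("kubelet-dir", "/var/lib/kubelet"),                      // pod volume mounts (path assumed)
            hostPath("registration-dir", "/var/lib/kubelet/plugins_registry"),// driver registration (path assumed)
            hostPath("varrun", "/var/run"),                                   // path assumed
            {
                // kube-api-access-lzsln: the projected service-account token.
                Name: "kube-api-access-lzsln",
                VolumeSource: corev1.VolumeSource{
                    Projected: &corev1.ProjectedVolumeSource{
                        Sources: []corev1.VolumeProjection{{
                            ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                                Path:              "token",
                                ExpirationSeconds: &expiry,
                            },
                        }},
                    },
                },
            },
        }
    }

    func main() {
        for _, v := range csiNodeDriverVolumes() {
            fmt.Println(v.Name)
        }
    }

The "UniqueName: kubernetes.io/host-path/<pod-uid>-<volume-name>" form in the log is how the volume manager keys these volumes internally; it confirms the host-path plugin is handling them, independent of the failing FlexVolume probe interleaved in the same lines.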
Aug 13 00:18:35.959033 kubelet[2623]: E0813 00:18:35.958923 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.959033 kubelet[2623]: W0813 00:18:35.958934 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.959033 kubelet[2623]: E0813 00:18:35.958943 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.959475 kubelet[2623]: E0813 00:18:35.959458 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.959827 kubelet[2623]: W0813 00:18:35.959631 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.959827 kubelet[2623]: E0813 00:18:35.959653 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.960318 kubelet[2623]: E0813 00:18:35.960199 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.960318 kubelet[2623]: W0813 00:18:35.960215 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.960318 kubelet[2623]: E0813 00:18:35.960227 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.960730 kubelet[2623]: E0813 00:18:35.960714 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.961339 kubelet[2623]: W0813 00:18:35.961185 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.961339 kubelet[2623]: E0813 00:18:35.961227 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.961681 kubelet[2623]: E0813 00:18:35.961666 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.962361 kubelet[2623]: W0813 00:18:35.961750 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.962361 kubelet[2623]: E0813 00:18:35.961769 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:35.962969 kubelet[2623]: E0813 00:18:35.962726 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.962969 kubelet[2623]: W0813 00:18:35.962741 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.962969 kubelet[2623]: E0813 00:18:35.962755 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:35.963219 kubelet[2623]: E0813 00:18:35.963205 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:35.963283 kubelet[2623]: W0813 00:18:35.963271 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:35.963344 kubelet[2623]: E0813 00:18:35.963332 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.006499 containerd[1477]: time="2025-08-13T00:18:36.005901995Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:36.006499 containerd[1477]: time="2025-08-13T00:18:36.005968874Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:36.006499 containerd[1477]: time="2025-08-13T00:18:36.005985714Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:36.006499 containerd[1477]: time="2025-08-13T00:18:36.006096434Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:36.043969 systemd[1]: Started cri-containerd-8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2.scope - libcontainer container 8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2. Aug 13 00:18:36.057612 kubelet[2623]: E0813 00:18:36.057562 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.057612 kubelet[2623]: W0813 00:18:36.057601 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.057794 kubelet[2623]: E0813 00:18:36.057625 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
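Error: unexpected end of JSON input"

Every triplet of kubelet errors above is one failed FlexVolume probe: plugins.go re-scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, driver-call.go execs the nodeagent~uds/uds binary with the single argument init, and then tries to unmarshal its stdout as JSON. The executable is absent, so stdout is the empty string, and an empty string is exactly where Go's json.Unmarshal reports "unexpected end of JSON input". A minimal sketch of the init handshake a driver at that path would have to implement follows; it illustrates the documented FlexVolume call convention, not the real uds driver, which is not shown in this log:

    // flexvolume-init-sketch: the `init` call convention kubelet's
    // driver-call expects from a FlexVolume driver binary. This is NOT
    // the real nodeagent~uds driver; it only shows why an empty stdout
    // produces "unexpected end of JSON input" in the journal.
    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    // driverStatus mirrors the JSON shape defined by the FlexVolume spec.
    type driverStatus struct {
        Status       string          `json:"status"` // "Success", "Failure", or "Not supported"
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        if len(os.Args) < 2 || os.Args[1] != "init" {
            // Calls the driver does not implement must still answer in JSON.
            out, _ := json.Marshal(driverStatus{Status: "Not supported"})
            fmt.Println(string(out))
            os.Exit(1)
        }
        // init reports whether the driver wants attach/detach callbacks.
        out, _ := json.Marshal(driverStatus{
            Status:       "Success",
            Capabilities: map[string]bool{"attach": false},
        })
        fmt.Println(string(out))
    }

Any valid JSON on stdout, even {"status":"Failure"}, would satisfy the unmarshal step in driver-call.go; the triplet repeats because the kubelet re-probes the plugin directory each time it needs to match a volume to a plugin, which is why the flood tracks the volume reconcile activity in the surrounding lines.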
Aug 13 00:18:36.058325 kubelet[2623]: E0813 00:18:36.058286 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.058325 kubelet[2623]: W0813 00:18:36.058312 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.058325 kubelet[2623]: E0813 00:18:36.058334 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.059786 kubelet[2623]: E0813 00:18:36.059739 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.059786 kubelet[2623]: W0813 00:18:36.059764 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.059786 kubelet[2623]: E0813 00:18:36.059781 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.060083 kubelet[2623]: E0813 00:18:36.060060 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.060130 kubelet[2623]: W0813 00:18:36.060088 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.060130 kubelet[2623]: E0813 00:18:36.060101 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.060404 kubelet[2623]: E0813 00:18:36.060379 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.060404 kubelet[2623]: W0813 00:18:36.060397 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.060404 kubelet[2623]: E0813 00:18:36.060409 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.061812 kubelet[2623]: E0813 00:18:36.061782 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.061812 kubelet[2623]: W0813 00:18:36.061807 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.061906 kubelet[2623]: E0813 00:18:36.061823 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:36.062624 kubelet[2623]: E0813 00:18:36.062543 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.062624 kubelet[2623]: W0813 00:18:36.062563 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.062624 kubelet[2623]: E0813 00:18:36.062588 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.063077 kubelet[2623]: E0813 00:18:36.063051 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.063077 kubelet[2623]: W0813 00:18:36.063066 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.063077 kubelet[2623]: E0813 00:18:36.063078 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.064748 kubelet[2623]: E0813 00:18:36.064718 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.064748 kubelet[2623]: W0813 00:18:36.064737 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.064748 kubelet[2623]: E0813 00:18:36.064752 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.066597 kubelet[2623]: E0813 00:18:36.066163 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.066597 kubelet[2623]: W0813 00:18:36.066184 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.066597 kubelet[2623]: E0813 00:18:36.066198 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.068708 kubelet[2623]: E0813 00:18:36.068247 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.068708 kubelet[2623]: W0813 00:18:36.068269 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.068708 kubelet[2623]: E0813 00:18:36.068283 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:36.069648 kubelet[2623]: E0813 00:18:36.069621 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.069648 kubelet[2623]: W0813 00:18:36.069636 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.069648 kubelet[2623]: E0813 00:18:36.069651 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.070786 kubelet[2623]: E0813 00:18:36.070752 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.070921 kubelet[2623]: W0813 00:18:36.070869 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.070921 kubelet[2623]: E0813 00:18:36.070893 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.071768 kubelet[2623]: E0813 00:18:36.071737 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.071768 kubelet[2623]: W0813 00:18:36.071758 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.071868 kubelet[2623]: E0813 00:18:36.071774 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.072602 kubelet[2623]: E0813 00:18:36.072554 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.072602 kubelet[2623]: W0813 00:18:36.072586 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.072602 kubelet[2623]: E0813 00:18:36.072601 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.073545 kubelet[2623]: E0813 00:18:36.073515 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.073545 kubelet[2623]: W0813 00:18:36.073535 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.073545 kubelet[2623]: E0813 00:18:36.073550 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:36.074314 kubelet[2623]: E0813 00:18:36.074281 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.074314 kubelet[2623]: W0813 00:18:36.074311 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.074428 kubelet[2623]: E0813 00:18:36.074326 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.074971 kubelet[2623]: E0813 00:18:36.074945 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.074971 kubelet[2623]: W0813 00:18:36.074964 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.075099 kubelet[2623]: E0813 00:18:36.074979 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.075750 kubelet[2623]: E0813 00:18:36.075557 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.075750 kubelet[2623]: W0813 00:18:36.075608 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.075750 kubelet[2623]: E0813 00:18:36.075623 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.076280 kubelet[2623]: E0813 00:18:36.076238 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.076280 kubelet[2623]: W0813 00:18:36.076253 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.076280 kubelet[2623]: E0813 00:18:36.076266 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.076972 kubelet[2623]: E0813 00:18:36.076938 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.076972 kubelet[2623]: W0813 00:18:36.076964 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.077111 kubelet[2623]: E0813 00:18:36.076981 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:36.077623 kubelet[2623]: E0813 00:18:36.077591 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.077623 kubelet[2623]: W0813 00:18:36.077613 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.077623 kubelet[2623]: E0813 00:18:36.077627 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.078073 kubelet[2623]: E0813 00:18:36.078039 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.078073 kubelet[2623]: W0813 00:18:36.078056 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.078073 kubelet[2623]: E0813 00:18:36.078067 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.078729 kubelet[2623]: E0813 00:18:36.078700 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.078729 kubelet[2623]: W0813 00:18:36.078718 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.078729 kubelet[2623]: E0813 00:18:36.078732 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.080159 kubelet[2623]: E0813 00:18:36.080135 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.080159 kubelet[2623]: W0813 00:18:36.080153 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.080281 kubelet[2623]: E0813 00:18:36.080166 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:36.101210 kubelet[2623]: E0813 00:18:36.101174 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:36.101210 kubelet[2623]: W0813 00:18:36.101198 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:36.101381 kubelet[2623]: E0813 00:18:36.101233 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:36.132975 containerd[1477]: time="2025-08-13T00:18:36.132923780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qdf9l,Uid:6abd2108-1584-4790-8dfc-fcd236c4b476,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\"" Aug 13 00:18:37.259798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3549772524.mount: Deactivated successfully. Aug 13 00:18:37.901107 containerd[1477]: time="2025-08-13T00:18:37.901041863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:37.905180 containerd[1477]: time="2025-08-13T00:18:37.905096962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Aug 13 00:18:37.905902 containerd[1477]: time="2025-08-13T00:18:37.905857519Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:37.916029 containerd[1477]: time="2025-08-13T00:18:37.915979347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:37.917188 containerd[1477]: time="2025-08-13T00:18:37.916778943Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 1.981833857s" Aug 13 00:18:37.917188 containerd[1477]: time="2025-08-13T00:18:37.916832303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Aug 13 00:18:37.920945 containerd[1477]: time="2025-08-13T00:18:37.920311165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 13 00:18:37.939888 containerd[1477]: time="2025-08-13T00:18:37.939804026Z" level=info msg="CreateContainer within sandbox \"53b22b9a90b4d42dae5bdf921efc76bb20af661b02199c00fafd3685adfe1714\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 13 00:18:37.962266 containerd[1477]: time="2025-08-13T00:18:37.962133433Z" level=info msg="CreateContainer within sandbox \"53b22b9a90b4d42dae5bdf921efc76bb20af661b02199c00fafd3685adfe1714\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1c1cb42fec39a86aac9f705a4d5d85a23d064e4e20c0305248c57001b305945d\"" Aug 13 00:18:37.964607 containerd[1477]: time="2025-08-13T00:18:37.964556141Z" level=info msg="StartContainer for \"1c1cb42fec39a86aac9f705a4d5d85a23d064e4e20c0305248c57001b305945d\"" Aug 13 00:18:37.988747 kubelet[2623]: E0813 00:18:37.984786 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:38.012995 systemd[1]: Started 
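cri-containerd-1c1cb42fec39a86aac9f705a4d5d85a23d064e4e20c0305248c57001b305945d.scope - libcontainer container 1c1cb42fec39a86aac9f705a4d5d85a23d064e4e20c0305248c57001b305945d.

The containerd lines above trace the CRI sequence the kubelet drives for each pod: RunPodSandbox returns a sandbox id, PullImage resolves the tag to a digest and reports the pull duration, CreateContainer within a sandbox returns a container id, and StartContainer launches it, at which point systemd reports the matching cri-containerd-<id>.scope unit. A compressed sketch of those calls against the v1 CRI gRPC API follows; the socket path is containerd's conventional default and the metadata is borrowed loosely from the log lines for flavor (in the log the calico-typha container actually lands in a different sandbox, 53b22b9a...), so treat both as assumptions:

    package main

    import (
        "context"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        ctx := context.Background()
        // Conventional containerd CRI endpoint; an assumption, not read from the log.
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)
        img := runtimeapi.NewImageServiceClient(conn)

        // 1. RunPodSandbox -> "RunPodSandbox ... returns sandbox id" in the log.
        sandboxCfg := &runtimeapi.PodSandboxConfig{
            Metadata: &runtimeapi.PodSandboxMetadata{
                Name:      "calico-node-qdf9l",
                Namespace: "calico-system",
                Uid:       "6abd2108-1584-4790-8dfc-fcd236c4b476",
            },
        }
        sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
        if err != nil {
            log.Fatal(err)
        }

        // 2. PullImage -> "PullImage ... returns image reference".
        ref := &runtimeapi.ImageSpec{Image: "ghcr.io/flatcar/calico/typha:v3.30.2"}
        if _, err := img.PullImage(ctx, &runtimeapi.PullImageRequest{Image: ref}); err != nil {
            log.Fatal(err)
        }

        // 3. CreateContainer within the sandbox -> "returns container id".
        ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
            PodSandboxId: sb.PodSandboxId,
            Config: &runtimeapi.ContainerConfig{
                Metadata: &runtimeapi.ContainerMetadata{Name: "calico-typha"},
                Image:    ref,
            },
            SandboxConfig: sandboxCfg,
        })
        if err != nil {
            log.Fatal(err)
        }

        // 4. StartContainer -> "StartContainer ... returns successfully".
        if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: ctr.ContainerId}); err != nil {
            log.Fatal(err)
        }
        log.Printf("sandbox %s container %s running", sb.PodSandboxId, ctr.ContainerId)
    }

The interleaved pod_workers error for csi-node-driver-hgndp ("network is not ready: ... cni plugin not initialized") is the expected state at this point: the sandbox for calico-node has only just been created, so the CNI plugin it installs is not yet available for other pods.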
Aug 13 00:18:38.066955 containerd[1477]: time="2025-08-13T00:18:38.066863067Z" level=info msg="StartContainer for \"1c1cb42fec39a86aac9f705a4d5d85a23d064e4e20c0305248c57001b305945d\" returns successfully" Aug 13 00:18:38.170930 kubelet[2623]: E0813 00:18:38.170746 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.170930 kubelet[2623]: W0813 00:18:38.170799 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.170930 kubelet[2623]: E0813 00:18:38.170838 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.173030 kubelet[2623]: E0813 00:18:38.172976 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.173175 kubelet[2623]: W0813 00:18:38.173028 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.173175 kubelet[2623]: E0813 00:18:38.173123 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.174892 kubelet[2623]: E0813 00:18:38.173879 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.174892 kubelet[2623]: W0813 00:18:38.173917 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.174892 kubelet[2623]: E0813 00:18:38.173947 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.176468 kubelet[2623]: E0813 00:18:38.175667 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.176468 kubelet[2623]: W0813 00:18:38.175704 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.176468 kubelet[2623]: E0813 00:18:38.175738 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.177150 kubelet[2623]: E0813 00:18:38.177126 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.177379 kubelet[2623]: W0813 00:18:38.177265 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.177379 kubelet[2623]: E0813 00:18:38.177289 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.178178 kubelet[2623]: E0813 00:18:38.178001 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.178178 kubelet[2623]: W0813 00:18:38.178016 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.178178 kubelet[2623]: E0813 00:18:38.178034 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.179199 kubelet[2623]: E0813 00:18:38.178806 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.179199 kubelet[2623]: W0813 00:18:38.178822 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.179199 kubelet[2623]: E0813 00:18:38.178839 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.179199 kubelet[2623]: E0813 00:18:38.179380 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.179199 kubelet[2623]: W0813 00:18:38.179395 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.179199 kubelet[2623]: E0813 00:18:38.179416 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.180815 kubelet[2623]: E0813 00:18:38.180716 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.180815 kubelet[2623]: W0813 00:18:38.180735 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.180815 kubelet[2623]: E0813 00:18:38.180753 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.181678 kubelet[2623]: E0813 00:18:38.181495 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.181678 kubelet[2623]: W0813 00:18:38.181511 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.181678 kubelet[2623]: E0813 00:18:38.181525 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.182185 kubelet[2623]: E0813 00:18:38.182093 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.182185 kubelet[2623]: W0813 00:18:38.182110 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.182185 kubelet[2623]: E0813 00:18:38.182128 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.182642 kubelet[2623]: E0813 00:18:38.182626 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.182795 kubelet[2623]: W0813 00:18:38.182738 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.182795 kubelet[2623]: E0813 00:18:38.182758 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.183130 kubelet[2623]: E0813 00:18:38.183115 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.183341 kubelet[2623]: W0813 00:18:38.183238 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.183341 kubelet[2623]: E0813 00:18:38.183257 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.183666 kubelet[2623]: E0813 00:18:38.183649 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.183882 kubelet[2623]: W0813 00:18:38.183740 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.183882 kubelet[2623]: E0813 00:18:38.183759 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.184107 kubelet[2623]: E0813 00:18:38.184092 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.184286 kubelet[2623]: W0813 00:18:38.184184 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.184286 kubelet[2623]: E0813 00:18:38.184204 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.185623 kubelet[2623]: E0813 00:18:38.185567 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.185623 kubelet[2623]: W0813 00:18:38.185584 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.185623 kubelet[2623]: E0813 00:18:38.185598 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.186748 kubelet[2623]: E0813 00:18:38.186619 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.186748 kubelet[2623]: W0813 00:18:38.186641 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.186748 kubelet[2623]: E0813 00:18:38.186657 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.187363 kubelet[2623]: E0813 00:18:38.187222 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.187363 kubelet[2623]: W0813 00:18:38.187236 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.187363 kubelet[2623]: E0813 00:18:38.187250 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.188583 kubelet[2623]: E0813 00:18:38.188349 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.188583 kubelet[2623]: W0813 00:18:38.188367 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.188583 kubelet[2623]: E0813 00:18:38.188478 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.189820 kubelet[2623]: E0813 00:18:38.189094 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.189820 kubelet[2623]: W0813 00:18:38.189111 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.189820 kubelet[2623]: E0813 00:18:38.189142 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.190590 kubelet[2623]: E0813 00:18:38.190571 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.190763 kubelet[2623]: W0813 00:18:38.190656 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.190763 kubelet[2623]: E0813 00:18:38.190676 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.191613 kubelet[2623]: E0813 00:18:38.191409 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.191613 kubelet[2623]: W0813 00:18:38.191425 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.191613 kubelet[2623]: E0813 00:18:38.191474 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.192121 kubelet[2623]: E0813 00:18:38.191954 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.192121 kubelet[2623]: W0813 00:18:38.191968 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.192121 kubelet[2623]: E0813 00:18:38.192050 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.193710 kubelet[2623]: E0813 00:18:38.193604 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.193710 kubelet[2623]: W0813 00:18:38.193624 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.193710 kubelet[2623]: E0813 00:18:38.193642 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.195008 kubelet[2623]: E0813 00:18:38.194850 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.195008 kubelet[2623]: W0813 00:18:38.194868 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.195008 kubelet[2623]: E0813 00:18:38.194885 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.195217 kubelet[2623]: E0813 00:18:38.195204 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.195342 kubelet[2623]: W0813 00:18:38.195262 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.195342 kubelet[2623]: E0813 00:18:38.195281 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.195840 kubelet[2623]: E0813 00:18:38.195727 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.195840 kubelet[2623]: W0813 00:18:38.195744 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.195840 kubelet[2623]: E0813 00:18:38.195759 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.198004 kubelet[2623]: E0813 00:18:38.197615 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.198004 kubelet[2623]: W0813 00:18:38.197632 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.198004 kubelet[2623]: E0813 00:18:38.197646 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:38.198335 kubelet[2623]: E0813 00:18:38.198310 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.198335 kubelet[2623]: W0813 00:18:38.198333 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.198412 kubelet[2623]: E0813 00:18:38.198350 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:18:38.198916 kubelet[2623]: E0813 00:18:38.198805 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:38.198916 kubelet[2623]: W0813 00:18:38.198821 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:38.198916 kubelet[2623]: E0813 00:18:38.198834 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" [... the same three-entry FlexVolume failure (driver-call.go:262, driver-call.go:149, plugins.go:703) repeats verbatim on every plugin probe from 00:18:38.199 through 00:18:39.206; only the final occurrence is kept below ...] Aug 13 00:18:39.117247 kubelet[2623]: I0813 00:18:39.116585 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:18:39.207966 kubelet[2623]: E0813 00:18:39.207927 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:18:39.207966 kubelet[2623]: W0813 00:18:39.207948 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:18:39.207966 kubelet[2623]: E0813 00:18:39.207963 2623 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:18:39.627935 containerd[1477]: time="2025-08-13T00:18:39.627887303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:39.629253 containerd[1477]: time="2025-08-13T00:18:39.629168377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Aug 13 00:18:39.631162 containerd[1477]: time="2025-08-13T00:18:39.631046768Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:39.634475 containerd[1477]: time="2025-08-13T00:18:39.634379591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:39.635987 containerd[1477]: time="2025-08-13T00:18:39.635932464Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.715574019s" Aug 13 00:18:39.635987 containerd[1477]: time="2025-08-13T00:18:39.635984143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Aug 13 00:18:39.641895 containerd[1477]: time="2025-08-13T00:18:39.641848675Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 00:18:39.664407 containerd[1477]: time="2025-08-13T00:18:39.664307444Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5\""
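The repeated kubelet errors above are the FlexVolume plugin prober walking /opt/libexec/kubernetes/kubelet-plugins/volume/exec/: the nodeagent~uds directory exists but its uds driver binary does not, so every driver call captures empty output, and decoding "" as JSON fails. A minimal Go sketch of both halves of the contract (the DriverStatus shape follows the documented FlexVolume init reply and is an assumption here, not something visible in this log):

package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus approximates the JSON a FlexVolume driver prints in reply
// to "init" (assumed from the FlexVolume spec, not taken from this log).
type DriverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The uds executable was not found, so the captured output is empty;
	// unmarshalling it reproduces the exact error string in the log.
	var st DriverStatus
	fmt.Println(json.Unmarshal([]byte(""), &st)) // unexpected end of JSON input

	// What a healthy driver would have printed for "init":
	reply, _ := json.Marshal(DriverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
	fmt.Println(string(reply)) // {"status":"Success","capabilities":{"attach":false}}
}

The flexvol-driver container created just above comes from Calico's pod2daemon-flexvol image, which is the component that installs that uds binary on the host, so the probe errors stop once it has run.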
\"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5\"" Aug 13 00:18:39.665328 containerd[1477]: time="2025-08-13T00:18:39.665264999Z" level=info msg="StartContainer for \"6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5\"" Aug 13 00:18:39.698505 systemd[1]: run-containerd-runc-k8s.io-6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5-runc.Bine4p.mount: Deactivated successfully. Aug 13 00:18:39.705680 systemd[1]: Started cri-containerd-6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5.scope - libcontainer container 6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5. Aug 13 00:18:39.742291 containerd[1477]: time="2025-08-13T00:18:39.742126460Z" level=info msg="StartContainer for \"6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5\" returns successfully" Aug 13 00:18:39.767751 systemd[1]: cri-containerd-6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5.scope: Deactivated successfully. Aug 13 00:18:39.807651 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5-rootfs.mount: Deactivated successfully. Aug 13 00:18:39.901892 containerd[1477]: time="2025-08-13T00:18:39.901546474Z" level=info msg="shim disconnected" id=6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5 namespace=k8s.io Aug 13 00:18:39.901892 containerd[1477]: time="2025-08-13T00:18:39.901645433Z" level=warning msg="cleaning up after shim disconnected" id=6dce108ebd7482b14bd82e7330049f8455be0d45017752de4b16c1c7a2ffefe5 namespace=k8s.io Aug 13 00:18:39.901892 containerd[1477]: time="2025-08-13T00:18:39.901657513Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 13 00:18:39.985417 kubelet[2623]: E0813 00:18:39.984965 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:40.128200 containerd[1477]: time="2025-08-13T00:18:40.128128324Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 13 00:18:40.156819 kubelet[2623]: I0813 00:18:40.156595 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5cf796b88d-8dmbb" podStartSLOduration=3.169895554 podStartE2EDuration="5.156574946s" podCreationTimestamp="2025-08-13 00:18:35 +0000 UTC" firstStartedPulling="2025-08-13 00:18:35.9322531 +0000 UTC m=+26.098627907" lastFinishedPulling="2025-08-13 00:18:37.918932412 +0000 UTC m=+28.085307299" observedRunningTime="2025-08-13 00:18:38.133730332 +0000 UTC m=+28.300105139" watchObservedRunningTime="2025-08-13 00:18:40.156574946 +0000 UTC m=+30.322949753" Aug 13 00:18:41.986915 kubelet[2623]: E0813 00:18:41.986212 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:42.852402 containerd[1477]: time="2025-08-13T00:18:42.850939387Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:42.852402 containerd[1477]: time="2025-08-13T00:18:42.852342580Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Aug 13 00:18:42.853137 containerd[1477]: time="2025-08-13T00:18:42.853076857Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:42.856777 containerd[1477]: time="2025-08-13T00:18:42.856716359Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:42.858873 containerd[1477]: time="2025-08-13T00:18:42.858001273Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 2.729795909s" Aug 13 00:18:42.859321 containerd[1477]: time="2025-08-13T00:18:42.859201308Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Aug 13 00:18:42.867400 containerd[1477]: time="2025-08-13T00:18:42.867328989Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 13 00:18:42.891045 containerd[1477]: time="2025-08-13T00:18:42.890994636Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c\"" Aug 13 00:18:42.893785 containerd[1477]: time="2025-08-13T00:18:42.893724584Z" level=info msg="StartContainer for \"24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c\"" Aug 13 00:18:42.933646 systemd[1]: Started cri-containerd-24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c.scope - libcontainer container 24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c. Aug 13 00:18:42.971293 containerd[1477]: time="2025-08-13T00:18:42.971226375Z" level=info msg="StartContainer for \"24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c\" returns successfully" Aug 13 00:18:43.572388 containerd[1477]: time="2025-08-13T00:18:43.572327469Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:18:43.575684 systemd[1]: cri-containerd-24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c.scope: Deactivated successfully. Aug 13 00:18:43.605251 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c-rootfs.mount: Deactivated successfully. 
Aug 13 00:18:43.642136 kubelet[2623]: I0813 00:18:43.638963 2623 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 13 00:18:43.683607 containerd[1477]: time="2025-08-13T00:18:43.683533266Z" level=info msg="shim disconnected" id=24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c namespace=k8s.io Aug 13 00:18:43.683607 containerd[1477]: time="2025-08-13T00:18:43.683590986Z" level=warning msg="cleaning up after shim disconnected" id=24f5cbf8e1c7162d6d0d0d6bd9f32c1b2017a034551f411d0af48fb6f1ebc51c namespace=k8s.io Aug 13 00:18:43.683607 containerd[1477]: time="2025-08-13T00:18:43.683599946Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 13 00:18:43.708556 systemd[1]: Created slice kubepods-besteffort-podb5166162_63e4_49bd_85b9_e1c018b456ba.slice - libcontainer container kubepods-besteffort-podb5166162_63e4_49bd_85b9_e1c018b456ba.slice. Aug 13 00:18:43.724513 containerd[1477]: time="2025-08-13T00:18:43.722638282Z" level=warning msg="cleanup warnings time=\"2025-08-13T00:18:43Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Aug 13 00:18:43.727751 systemd[1]: Created slice kubepods-besteffort-pod44cb98ea_961e_4490_ac08_a70d6d62cf40.slice - libcontainer container kubepods-besteffort-pod44cb98ea_961e_4490_ac08_a70d6d62cf40.slice. Aug 13 00:18:43.734713 kubelet[2623]: I0813 00:18:43.734648 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f6fff45d-a329-484b-9031-9dfa784fb836-calico-apiserver-certs\") pod \"calico-apiserver-776d55f88-sw2d2\" (UID: \"f6fff45d-a329-484b-9031-9dfa784fb836\") " pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" Aug 13 00:18:43.734713 kubelet[2623]: I0813 00:18:43.734694 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sz9f\" (UniqueName: \"kubernetes.io/projected/f6fff45d-a329-484b-9031-9dfa784fb836-kube-api-access-9sz9f\") pod \"calico-apiserver-776d55f88-sw2d2\" (UID: \"f6fff45d-a329-484b-9031-9dfa784fb836\") " pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" Aug 13 00:18:43.734898 kubelet[2623]: I0813 00:18:43.734716 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b87d22-e12e-4c68-948f-91d75e756004-config-volume\") pod \"coredns-674b8bbfcf-hrjxb\" (UID: \"28b87d22-e12e-4c68-948f-91d75e756004\") " pod="kube-system/coredns-674b8bbfcf-hrjxb" Aug 13 00:18:43.734898 kubelet[2623]: I0813 00:18:43.734754 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44cb98ea-961e-4490-ac08-a70d6d62cf40-tigera-ca-bundle\") pod \"calico-kube-controllers-66459bfc84-krdnk\" (UID: \"44cb98ea-961e-4490-ac08-a70d6d62cf40\") " pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" Aug 13 00:18:43.734898 kubelet[2623]: I0813 00:18:43.734769 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxfc\" (UniqueName: \"kubernetes.io/projected/28b87d22-e12e-4c68-948f-91d75e756004-kube-api-access-grxfc\") pod \"coredns-674b8bbfcf-hrjxb\" (UID: \"28b87d22-e12e-4c68-948f-91d75e756004\") " pod="kube-system/coredns-674b8bbfcf-hrjxb" Aug 13 
00:18:43.734898 kubelet[2623]: I0813 00:18:43.734788 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985px\" (UniqueName: \"kubernetes.io/projected/44cb98ea-961e-4490-ac08-a70d6d62cf40-kube-api-access-985px\") pod \"calico-kube-controllers-66459bfc84-krdnk\" (UID: \"44cb98ea-961e-4490-ac08-a70d6d62cf40\") " pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" Aug 13 00:18:43.734898 kubelet[2623]: I0813 00:18:43.734804 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b5166162-63e4-49bd-85b9-e1c018b456ba-calico-apiserver-certs\") pod \"calico-apiserver-776d55f88-4lbfd\" (UID: \"b5166162-63e4-49bd-85b9-e1c018b456ba\") " pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" Aug 13 00:18:43.735044 kubelet[2623]: I0813 00:18:43.734821 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ltpc\" (UniqueName: \"kubernetes.io/projected/b5166162-63e4-49bd-85b9-e1c018b456ba-kube-api-access-7ltpc\") pod \"calico-apiserver-776d55f88-4lbfd\" (UID: \"b5166162-63e4-49bd-85b9-e1c018b456ba\") " pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" Aug 13 00:18:43.744762 systemd[1]: Created slice kubepods-besteffort-podf6fff45d_a329_484b_9031_9dfa784fb836.slice - libcontainer container kubepods-besteffort-podf6fff45d_a329_484b_9031_9dfa784fb836.slice. Aug 13 00:18:43.763046 systemd[1]: Created slice kubepods-besteffort-pod3a23147b_52c7_4db5_9f87_1a85015feabd.slice - libcontainer container kubepods-besteffort-pod3a23147b_52c7_4db5_9f87_1a85015feabd.slice. Aug 13 00:18:43.785174 systemd[1]: Created slice kubepods-burstable-pod28b87d22_e12e_4c68_948f_91d75e756004.slice - libcontainer container kubepods-burstable-pod28b87d22_e12e_4c68_948f_91d75e756004.slice. Aug 13 00:18:43.794892 systemd[1]: Created slice kubepods-burstable-pod4bba616f_1733_479f_98eb_88700a24fca2.slice - libcontainer container kubepods-burstable-pod4bba616f_1733_479f_98eb_88700a24fca2.slice. Aug 13 00:18:43.805411 systemd[1]: Created slice kubepods-besteffort-podf7c4b773_00db_432a_98a5_27a94e2aa827.slice - libcontainer container kubepods-besteffort-podf7c4b773_00db_432a_98a5_27a94e2aa827.slice. 
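As an aside on the Created slice entries: the names follow kubelet's systemd cgroup convention, the QoS class plus the pod UID with its dashes mapped to underscores (dashes act as hierarchy separators in slice unit names). A sketch that reproduces the names above from the UIDs in the volume entries, derived from this log rather than from kubelet source:

package main

import (
	"fmt"
	"strings"
)

// podSlice builds the systemd slice name kubelet uses for a pod cgroup:
// kubepods-<qos>-pod<uid>.slice, with '-' in the UID rewritten to '_'.
func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	// UID of calico-apiserver-776d55f88-4lbfd, taken from the entries above.
	fmt.Println(podSlice("besteffort", "b5166162-63e4-49bd-85b9-e1c018b456ba"))
	// kubepods-besteffort-podb5166162_63e4_49bd_85b9_e1c018b456ba.slice
}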
Aug 13 00:18:43.836901 kubelet[2623]: I0813 00:18:43.836724 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c4b773-00db-432a-98a5-27a94e2aa827-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-tpx4h\" (UID: \"f7c4b773-00db-432a-98a5-27a94e2aa827\") " pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:43.836901 kubelet[2623]: I0813 00:18:43.836818 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267bm\" (UniqueName: \"kubernetes.io/projected/3a23147b-52c7-4db5-9f87-1a85015feabd-kube-api-access-267bm\") pod \"whisker-85f6f6cf67-7zt6q\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " pod="calico-system/whisker-85f6f6cf67-7zt6q" Aug 13 00:18:43.836901 kubelet[2623]: I0813 00:18:43.836892 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-backend-key-pair\") pod \"whisker-85f6f6cf67-7zt6q\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " pod="calico-system/whisker-85f6f6cf67-7zt6q" Aug 13 00:18:43.837110 kubelet[2623]: I0813 00:18:43.837004 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f7c4b773-00db-432a-98a5-27a94e2aa827-goldmane-key-pair\") pod \"goldmane-768f4c5c69-tpx4h\" (UID: \"f7c4b773-00db-432a-98a5-27a94e2aa827\") " pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:43.837110 kubelet[2623]: I0813 00:18:43.837036 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bba616f-1733-479f-98eb-88700a24fca2-config-volume\") pod \"coredns-674b8bbfcf-dbkv2\" (UID: \"4bba616f-1733-479f-98eb-88700a24fca2\") " pod="kube-system/coredns-674b8bbfcf-dbkv2" Aug 13 00:18:43.837110 kubelet[2623]: I0813 00:18:43.837068 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c4b773-00db-432a-98a5-27a94e2aa827-config\") pod \"goldmane-768f4c5c69-tpx4h\" (UID: \"f7c4b773-00db-432a-98a5-27a94e2aa827\") " pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:43.837203 kubelet[2623]: I0813 00:18:43.837116 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwjjg\" (UniqueName: \"kubernetes.io/projected/f7c4b773-00db-432a-98a5-27a94e2aa827-kube-api-access-bwjjg\") pod \"goldmane-768f4c5c69-tpx4h\" (UID: \"f7c4b773-00db-432a-98a5-27a94e2aa827\") " pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:43.837203 kubelet[2623]: I0813 00:18:43.837142 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-ca-bundle\") pod \"whisker-85f6f6cf67-7zt6q\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " pod="calico-system/whisker-85f6f6cf67-7zt6q" Aug 13 00:18:43.837203 kubelet[2623]: I0813 00:18:43.837176 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtqv\" (UniqueName: \"kubernetes.io/projected/4bba616f-1733-479f-98eb-88700a24fca2-kube-api-access-gdtqv\") pod 
\"coredns-674b8bbfcf-dbkv2\" (UID: \"4bba616f-1733-479f-98eb-88700a24fca2\") " pod="kube-system/coredns-674b8bbfcf-dbkv2" Aug 13 00:18:43.997044 systemd[1]: Created slice kubepods-besteffort-pod549a8f55_4e42_4ab3_8e46_bed16219fe3c.slice - libcontainer container kubepods-besteffort-pod549a8f55_4e42_4ab3_8e46_bed16219fe3c.slice. Aug 13 00:18:44.002965 containerd[1477]: time="2025-08-13T00:18:44.002510887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hgndp,Uid:549a8f55-4e42-4ab3-8e46-bed16219fe3c,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:44.030669 containerd[1477]: time="2025-08-13T00:18:44.030146638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-4lbfd,Uid:b5166162-63e4-49bd-85b9-e1c018b456ba,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:18:44.035152 containerd[1477]: time="2025-08-13T00:18:44.034799377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66459bfc84-krdnk,Uid:44cb98ea-961e-4490-ac08-a70d6d62cf40,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:44.060068 containerd[1477]: time="2025-08-13T00:18:44.060019220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-sw2d2,Uid:f6fff45d-a329-484b-9031-9dfa784fb836,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:18:44.077628 containerd[1477]: time="2025-08-13T00:18:44.077403579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85f6f6cf67-7zt6q,Uid:3a23147b-52c7-4db5-9f87-1a85015feabd,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:44.095788 containerd[1477]: time="2025-08-13T00:18:44.095599134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hrjxb,Uid:28b87d22-e12e-4c68-948f-91d75e756004,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:44.104964 containerd[1477]: time="2025-08-13T00:18:44.104238934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbkv2,Uid:4bba616f-1733-479f-98eb-88700a24fca2,Namespace:kube-system,Attempt:0,}" Aug 13 00:18:44.115048 containerd[1477]: time="2025-08-13T00:18:44.115010964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tpx4h,Uid:f7c4b773-00db-432a-98a5-27a94e2aa827,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:44.150343 containerd[1477]: time="2025-08-13T00:18:44.149918361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 00:18:44.290720 containerd[1477]: time="2025-08-13T00:18:44.290646907Z" level=error msg="Failed to destroy network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.291237 containerd[1477]: time="2025-08-13T00:18:44.291187865Z" level=error msg="encountered an error cleaning up failed sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.291503 containerd[1477]: time="2025-08-13T00:18:44.291408544Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hgndp,Uid:549a8f55-4e42-4ab3-8e46-bed16219fe3c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network 
for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.292146 kubelet[2623]: E0813 00:18:44.292093 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.292245 kubelet[2623]: E0813 00:18:44.292165 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:44.292245 kubelet[2623]: E0813 00:18:44.292192 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hgndp" Aug 13 00:18:44.292301 kubelet[2623]: E0813 00:18:44.292241 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hgndp_calico-system(549a8f55-4e42-4ab3-8e46-bed16219fe3c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hgndp_calico-system(549a8f55-4e42-4ab3-8e46-bed16219fe3c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:44.320681 containerd[1477]: time="2025-08-13T00:18:44.320605688Z" level=error msg="Failed to destroy network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.324687 containerd[1477]: time="2025-08-13T00:18:44.324526590Z" level=error msg="encountered an error cleaning up failed sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.325018 containerd[1477]: time="2025-08-13T00:18:44.324868788Z" level=error msg="RunPodSandbox for 
Aug 13 00:18:44.320681 containerd[1477]: time="2025-08-13T00:18:44.320605688Z" level=error msg="Failed to destroy network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.324687 containerd[1477]: time="2025-08-13T00:18:44.324526590Z" level=error msg="encountered an error cleaning up failed sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.325018 containerd[1477]: time="2025-08-13T00:18:44.324868788Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66459bfc84-krdnk,Uid:44cb98ea-961e-4490-ac08-a70d6d62cf40,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.335605 kubelet[2623]: E0813 00:18:44.335548 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.335605 kubelet[2623]: E0813 00:18:44.335611 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" Aug 13 00:18:44.335780 kubelet[2623]: E0813 00:18:44.335632 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" Aug 13 00:18:44.335780 kubelet[2623]: E0813 00:18:44.335680 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66459bfc84-krdnk_calico-system(44cb98ea-961e-4490-ac08-a70d6d62cf40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66459bfc84-krdnk_calico-system(44cb98ea-961e-4490-ac08-a70d6d62cf40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" podUID="44cb98ea-961e-4490-ac08-a70d6d62cf40" Aug 13 00:18:44.363288 containerd[1477]: time="2025-08-13T00:18:44.362981771Z" level=error msg="Failed to destroy network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.365498 containerd[1477]: time="2025-08-13T00:18:44.365319960Z" level=error msg="encountered an error cleaning up failed sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Aug 13 00:18:44.365838 containerd[1477]: time="2025-08-13T00:18:44.365416239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-4lbfd,Uid:b5166162-63e4-49bd-85b9-e1c018b456ba,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.366869 kubelet[2623]: E0813 00:18:44.366136 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.367025 kubelet[2623]: E0813 00:18:44.366922 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" Aug 13 00:18:44.367025 kubelet[2623]: E0813 00:18:44.366971 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" Aug 13 00:18:44.367080 kubelet[2623]: E0813 00:18:44.367031 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-776d55f88-4lbfd_calico-apiserver(b5166162-63e4-49bd-85b9-e1c018b456ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-776d55f88-4lbfd_calico-apiserver(b5166162-63e4-49bd-85b9-e1c018b456ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" podUID="b5166162-63e4-49bd-85b9-e1c018b456ba" Aug 13 00:18:44.370837 containerd[1477]: time="2025-08-13T00:18:44.370545536Z" level=error msg="Failed to destroy network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.371691 containerd[1477]: time="2025-08-13T00:18:44.371608491Z" level=error msg="encountered an error cleaning up failed sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.372048 containerd[1477]: time="2025-08-13T00:18:44.371974489Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-sw2d2,Uid:f6fff45d-a329-484b-9031-9dfa784fb836,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.372561 kubelet[2623]: E0813 00:18:44.372455 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.372561 kubelet[2623]: E0813 00:18:44.372521 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" Aug 13 00:18:44.372561 kubelet[2623]: E0813 00:18:44.372545 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" Aug 13 00:18:44.373361 kubelet[2623]: E0813 00:18:44.372590 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-776d55f88-sw2d2_calico-apiserver(f6fff45d-a329-484b-9031-9dfa784fb836)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-776d55f88-sw2d2_calico-apiserver(f6fff45d-a329-484b-9031-9dfa784fb836)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" podUID="f6fff45d-a329-484b-9031-9dfa784fb836" Aug 13 00:18:44.391621 containerd[1477]: time="2025-08-13T00:18:44.391471718Z" level=error msg="Failed to destroy network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.392106 containerd[1477]: time="2025-08-13T00:18:44.392052156Z" level=error msg="encountered an error cleaning up failed sandbox 
\"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.392313 containerd[1477]: time="2025-08-13T00:18:44.392239635Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hrjxb,Uid:28b87d22-e12e-4c68-948f-91d75e756004,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.392669 kubelet[2623]: E0813 00:18:44.392621 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.392757 kubelet[2623]: E0813 00:18:44.392686 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hrjxb" Aug 13 00:18:44.392757 kubelet[2623]: E0813 00:18:44.392712 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hrjxb" Aug 13 00:18:44.392818 kubelet[2623]: E0813 00:18:44.392780 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-hrjxb_kube-system(28b87d22-e12e-4c68-948f-91d75e756004)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-hrjxb_kube-system(28b87d22-e12e-4c68-948f-91d75e756004)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hrjxb" podUID="28b87d22-e12e-4c68-948f-91d75e756004" Aug 13 00:18:44.395973 containerd[1477]: time="2025-08-13T00:18:44.395526499Z" level=error msg="Failed to destroy network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.395973 containerd[1477]: time="2025-08-13T00:18:44.395853258Z" level=error 
msg="encountered an error cleaning up failed sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.395973 containerd[1477]: time="2025-08-13T00:18:44.395943897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85f6f6cf67-7zt6q,Uid:3a23147b-52c7-4db5-9f87-1a85015feabd,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.396227 kubelet[2623]: E0813 00:18:44.396156 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.396294 kubelet[2623]: E0813 00:18:44.396242 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-85f6f6cf67-7zt6q" Aug 13 00:18:44.396294 kubelet[2623]: E0813 00:18:44.396271 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-85f6f6cf67-7zt6q" Aug 13 00:18:44.396352 kubelet[2623]: E0813 00:18:44.396320 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-85f6f6cf67-7zt6q_calico-system(3a23147b-52c7-4db5-9f87-1a85015feabd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-85f6f6cf67-7zt6q_calico-system(3a23147b-52c7-4db5-9f87-1a85015feabd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-85f6f6cf67-7zt6q" podUID="3a23147b-52c7-4db5-9f87-1a85015feabd" Aug 13 00:18:44.412078 containerd[1477]: time="2025-08-13T00:18:44.411999743Z" level=error msg="Failed to destroy network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.413210 
containerd[1477]: time="2025-08-13T00:18:44.413060138Z" level=error msg="encountered an error cleaning up failed sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.413210 containerd[1477]: time="2025-08-13T00:18:44.413151017Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbkv2,Uid:4bba616f-1733-479f-98eb-88700a24fca2,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.413758 kubelet[2623]: E0813 00:18:44.413688 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.413978 kubelet[2623]: E0813 00:18:44.413862 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dbkv2" Aug 13 00:18:44.413978 kubelet[2623]: E0813 00:18:44.413888 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dbkv2" Aug 13 00:18:44.414277 kubelet[2623]: E0813 00:18:44.414134 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dbkv2_kube-system(4bba616f-1733-479f-98eb-88700a24fca2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dbkv2_kube-system(4bba616f-1733-479f-98eb-88700a24fca2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dbkv2" podUID="4bba616f-1733-479f-98eb-88700a24fca2" Aug 13 00:18:44.427153 containerd[1477]: time="2025-08-13T00:18:44.427028433Z" level=error msg="Failed to destroy network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Aug 13 00:18:44.427893 containerd[1477]: time="2025-08-13T00:18:44.427823549Z" level=error msg="encountered an error cleaning up failed sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.428036 containerd[1477]: time="2025-08-13T00:18:44.427919349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tpx4h,Uid:f7c4b773-00db-432a-98a5-27a94e2aa827,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.429553 kubelet[2623]: E0813 00:18:44.428203 2623 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:44.429553 kubelet[2623]: E0813 00:18:44.428292 2623 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:44.429553 kubelet[2623]: E0813 00:18:44.428327 2623 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tpx4h" Aug 13 00:18:44.429782 kubelet[2623]: E0813 00:18:44.428421 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-tpx4h_calico-system(f7c4b773-00db-432a-98a5-27a94e2aa827)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-tpx4h_calico-system(f7c4b773-00db-432a-98a5-27a94e2aa827)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-tpx4h" podUID="f7c4b773-00db-432a-98a5-27a94e2aa827" Aug 13 00:18:45.150232 kubelet[2623]: I0813 00:18:45.150177 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:45.152816 containerd[1477]: time="2025-08-13T00:18:45.151417151Z" 
level=info msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" Aug 13 00:18:45.152816 containerd[1477]: time="2025-08-13T00:18:45.152160908Z" level=info msg="Ensure that sandbox 01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962 in task-service has been cleanup successfully" Aug 13 00:18:45.155257 kubelet[2623]: I0813 00:18:45.154273 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:18:45.155752 containerd[1477]: time="2025-08-13T00:18:45.155721211Z" level=info msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" Aug 13 00:18:45.156391 containerd[1477]: time="2025-08-13T00:18:45.156359688Z" level=info msg="Ensure that sandbox ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3 in task-service has been cleanup successfully" Aug 13 00:18:45.158190 kubelet[2623]: I0813 00:18:45.158158 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:45.161557 containerd[1477]: time="2025-08-13T00:18:45.161511624Z" level=info msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" Aug 13 00:18:45.161833 kubelet[2623]: I0813 00:18:45.161767 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:18:45.162326 containerd[1477]: time="2025-08-13T00:18:45.162050862Z" level=info msg="Ensure that sandbox b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b in task-service has been cleanup successfully" Aug 13 00:18:45.164716 containerd[1477]: time="2025-08-13T00:18:45.164641730Z" level=info msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" Aug 13 00:18:45.167865 containerd[1477]: time="2025-08-13T00:18:45.167680236Z" level=info msg="Ensure that sandbox 51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae in task-service has been cleanup successfully" Aug 13 00:18:45.169217 kubelet[2623]: I0813 00:18:45.169178 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:45.170613 containerd[1477]: time="2025-08-13T00:18:45.170560663Z" level=info msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" Aug 13 00:18:45.170752 containerd[1477]: time="2025-08-13T00:18:45.170727622Z" level=info msg="Ensure that sandbox 2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6 in task-service has been cleanup successfully" Aug 13 00:18:45.177072 kubelet[2623]: I0813 00:18:45.177041 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:45.179169 containerd[1477]: time="2025-08-13T00:18:45.179093104Z" level=info msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" Aug 13 00:18:45.180362 containerd[1477]: time="2025-08-13T00:18:45.180326698Z" level=info msg="Ensure that sandbox 0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d in task-service has been cleanup successfully" Aug 13 00:18:45.184162 kubelet[2623]: I0813 00:18:45.184126 2623 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:18:45.186924 containerd[1477]: time="2025-08-13T00:18:45.186592589Z" level=info msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" Aug 13 00:18:45.191507 containerd[1477]: time="2025-08-13T00:18:45.191391287Z" level=info msg="Ensure that sandbox ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565 in task-service has been cleanup successfully" Aug 13 00:18:45.198768 kubelet[2623]: I0813 00:18:45.198686 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:45.199905 containerd[1477]: time="2025-08-13T00:18:45.199765048Z" level=info msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\"" Aug 13 00:18:45.200560 containerd[1477]: time="2025-08-13T00:18:45.200211686Z" level=info msg="Ensure that sandbox e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2 in task-service has been cleanup successfully" Aug 13 00:18:45.291102 containerd[1477]: time="2025-08-13T00:18:45.290354151Z" level=error msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" failed" error="failed to destroy network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.291318 kubelet[2623]: E0813 00:18:45.291049 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:18:45.291318 kubelet[2623]: E0813 00:18:45.291122 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3"} Aug 13 00:18:45.291318 kubelet[2623]: E0813 00:18:45.291193 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.291318 kubelet[2623]: E0813 00:18:45.291214 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"549a8f55-4e42-4ab3-8e46-bed16219fe3c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-hgndp" podUID="549a8f55-4e42-4ab3-8e46-bed16219fe3c" Aug 13 00:18:45.310033 containerd[1477]: time="2025-08-13T00:18:45.309242504Z" level=error msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" failed" error="failed to destroy network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.310198 kubelet[2623]: E0813 00:18:45.309600 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:45.310198 kubelet[2623]: E0813 00:18:45.309652 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962"} Aug 13 00:18:45.310198 kubelet[2623]: E0813 00:18:45.309693 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3a23147b-52c7-4db5-9f87-1a85015feabd\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.310198 kubelet[2623]: E0813 00:18:45.309715 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3a23147b-52c7-4db5-9f87-1a85015feabd\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-85f6f6cf67-7zt6q" podUID="3a23147b-52c7-4db5-9f87-1a85015feabd" Aug 13 00:18:45.317854 containerd[1477]: time="2025-08-13T00:18:45.317583546Z" level=error msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" failed" error="failed to destroy network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.318045 kubelet[2623]: E0813 00:18:45.317930 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 
13 00:18:45.318045 kubelet[2623]: E0813 00:18:45.317977 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565"} Aug 13 00:18:45.318045 kubelet[2623]: E0813 00:18:45.318013 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"44cb98ea-961e-4490-ac08-a70d6d62cf40\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.318618 kubelet[2623]: E0813 00:18:45.318036 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"44cb98ea-961e-4490-ac08-a70d6d62cf40\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" podUID="44cb98ea-961e-4490-ac08-a70d6d62cf40" Aug 13 00:18:45.318846 containerd[1477]: time="2025-08-13T00:18:45.318807100Z" level=error msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" failed" error="failed to destroy network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.319526 kubelet[2623]: E0813 00:18:45.319411 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:45.319662 kubelet[2623]: E0813 00:18:45.319543 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d"} Aug 13 00:18:45.319662 kubelet[2623]: E0813 00:18:45.319584 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b5166162-63e4-49bd-85b9-e1c018b456ba\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.319662 kubelet[2623]: E0813 00:18:45.319604 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b5166162-63e4-49bd-85b9-e1c018b456ba\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" podUID="b5166162-63e4-49bd-85b9-e1c018b456ba" Aug 13 00:18:45.327306 containerd[1477]: time="2025-08-13T00:18:45.327253421Z" level=error msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" failed" error="failed to destroy network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.327636 containerd[1477]: time="2025-08-13T00:18:45.327261461Z" level=error msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" failed" error="failed to destroy network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.327825 kubelet[2623]: E0813 00:18:45.327780 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:45.327825 kubelet[2623]: E0813 00:18:45.327838 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b"} Aug 13 00:18:45.328106 kubelet[2623]: E0813 00:18:45.327873 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f7c4b773-00db-432a-98a5-27a94e2aa827\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.328106 kubelet[2623]: E0813 00:18:45.327900 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f7c4b773-00db-432a-98a5-27a94e2aa827\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-tpx4h" podUID="f7c4b773-00db-432a-98a5-27a94e2aa827" Aug 13 00:18:45.328106 kubelet[2623]: E0813 00:18:45.327983 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:18:45.328106 kubelet[2623]: E0813 00:18:45.328004 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae"} Aug 13 00:18:45.328290 kubelet[2623]: E0813 00:18:45.328035 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4bba616f-1733-479f-98eb-88700a24fca2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.328290 kubelet[2623]: E0813 00:18:45.328055 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4bba616f-1733-479f-98eb-88700a24fca2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dbkv2" podUID="4bba616f-1733-479f-98eb-88700a24fca2" Aug 13 00:18:45.329222 containerd[1477]: time="2025-08-13T00:18:45.329098853Z" level=error msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" failed" error="failed to destroy network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.329375 kubelet[2623]: E0813 00:18:45.329338 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:45.329519 kubelet[2623]: E0813 00:18:45.329384 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6"} Aug 13 00:18:45.329519 kubelet[2623]: E0813 00:18:45.329414 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f6fff45d-a329-484b-9031-9dfa784fb836\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Aug 13 00:18:45.329775 kubelet[2623]: E0813 00:18:45.329702 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f6fff45d-a329-484b-9031-9dfa784fb836\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" podUID="f6fff45d-a329-484b-9031-9dfa784fb836" Aug 13 00:18:45.333179 containerd[1477]: time="2025-08-13T00:18:45.333056235Z" level=error msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" failed" error="failed to destroy network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:18:45.333602 kubelet[2623]: E0813 00:18:45.333546 2623 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:45.333677 kubelet[2623]: E0813 00:18:45.333606 2623 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"} Aug 13 00:18:45.333677 kubelet[2623]: E0813 00:18:45.333657 2623 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"28b87d22-e12e-4c68-948f-91d75e756004\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Aug 13 00:18:45.333794 kubelet[2623]: E0813 00:18:45.333688 2623 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"28b87d22-e12e-4c68-948f-91d75e756004\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hrjxb" podUID="28b87d22-e12e-4c68-948f-91d75e756004" Aug 13 00:18:47.805354 kubelet[2623]: I0813 00:18:47.803048 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:18:48.938795 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3619004217.mount: Deactivated successfully. 
Aug 13 00:18:48.995138 containerd[1477]: time="2025-08-13T00:18:48.994015268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:48.996521 containerd[1477]: time="2025-08-13T00:18:48.996482977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Aug 13 00:18:48.997998 containerd[1477]: time="2025-08-13T00:18:48.997958371Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:49.000916 containerd[1477]: time="2025-08-13T00:18:49.000862398Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:49.002871 containerd[1477]: time="2025-08-13T00:18:49.002813269Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 4.85224015s" Aug 13 00:18:49.003077 containerd[1477]: time="2025-08-13T00:18:49.003040708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Aug 13 00:18:49.023830 containerd[1477]: time="2025-08-13T00:18:49.023794856Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 13 00:18:49.043215 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1399217645.mount: Deactivated successfully. Aug 13 00:18:49.044815 containerd[1477]: time="2025-08-13T00:18:49.044686083Z" level=info msg="CreateContainer within sandbox \"8d1f06e8f285eaef09e584defbe6fb1a0436fa2a3c5bfc4511c165466fdb33d2\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d5b2bf1f5736e513e62817895c1514b853e58c086f37a66adcf1d222cbc00b60\"" Aug 13 00:18:49.047681 containerd[1477]: time="2025-08-13T00:18:49.046698834Z" level=info msg="StartContainer for \"d5b2bf1f5736e513e62817895c1514b853e58c086f37a66adcf1d222cbc00b60\"" Aug 13 00:18:49.081878 systemd[1]: Started cri-containerd-d5b2bf1f5736e513e62817895c1514b853e58c086f37a66adcf1d222cbc00b60.scope - libcontainer container d5b2bf1f5736e513e62817895c1514b853e58c086f37a66adcf1d222cbc00b60. 
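The sequence above (PullImage, then CreateContainer within the sandbox, then StartContainer, with systemd opening a cri-containerd-<id>.scope unit for the new task) is containerd's ordinary client lifecycle. A hedged sketch of the same three steps using the containerd Go client; the socket path and the k8s.io namespace are conventional defaults rather than values from this log, and the sketch illustrates the API flow only (calico-node itself needs host privileges a bare image-derived spec does not grant):

```go
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	// The CRI plugin keeps Kubernetes containers in the "k8s.io"
	// namespace; the socket is containerd's default address.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// PullImage: fetch and unpack, the step the log times at ~4.85s.
	image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.2", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	// CreateContainer: a fresh snapshot plus an OCI spec derived from
	// the image config.
	container, err := client.NewContainer(ctx, "calico-node-demo",
		containerd.WithNewSnapshot("calico-node-demo-snap", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	// StartContainer: the task is the live process; starting it is the
	// point where systemd reports the new libcontainer scope.
	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)
	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Printf("started task, pid %d", task.Pid())
}
```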
Aug 13 00:18:49.118902 containerd[1477]: time="2025-08-13T00:18:49.118753954Z" level=info msg="StartContainer for \"d5b2bf1f5736e513e62817895c1514b853e58c086f37a66adcf1d222cbc00b60\" returns successfully" Aug 13 00:18:49.247270 kubelet[2623]: I0813 00:18:49.247066 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-qdf9l" podStartSLOduration=1.376281977 podStartE2EDuration="14.245967308s" podCreationTimestamp="2025-08-13 00:18:35 +0000 UTC" firstStartedPulling="2025-08-13 00:18:36.134561372 +0000 UTC m=+26.300936179" lastFinishedPulling="2025-08-13 00:18:49.004246703 +0000 UTC m=+39.170621510" observedRunningTime="2025-08-13 00:18:49.242658643 +0000 UTC m=+39.409033450" watchObservedRunningTime="2025-08-13 00:18:49.245967308 +0000 UTC m=+39.412342115" Aug 13 00:18:49.313978 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 13 00:18:49.314085 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Aug 13 00:18:49.482462 containerd[1477]: time="2025-08-13T00:18:49.481907339Z" level=info msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.603 [INFO][3877] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.603 [INFO][3877] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" iface="eth0" netns="/var/run/netns/cni-1e4d17b2-c990-f1ac-4987-463aed84f02a" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.604 [INFO][3877] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" iface="eth0" netns="/var/run/netns/cni-1e4d17b2-c990-f1ac-4987-463aed84f02a" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.606 [INFO][3877] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" iface="eth0" netns="/var/run/netns/cni-1e4d17b2-c990-f1ac-4987-463aed84f02a" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.606 [INFO][3877] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.606 [INFO][3877] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.671 [INFO][3891] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.671 [INFO][3891] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.671 [INFO][3891] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.686 [WARNING][3891] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.687 [INFO][3891] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.689 [INFO][3891] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:49.695154 containerd[1477]: 2025-08-13 00:18:49.692 [INFO][3877] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:18:49.695802 containerd[1477]: time="2025-08-13T00:18:49.695584709Z" level=info msg="TearDown network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" successfully" Aug 13 00:18:49.695802 containerd[1477]: time="2025-08-13T00:18:49.695621309Z" level=info msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" returns successfully" Aug 13 00:18:49.783674 kubelet[2623]: I0813 00:18:49.783612 2623 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-backend-key-pair\") pod \"3a23147b-52c7-4db5-9f87-1a85015feabd\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " Aug 13 00:18:49.783674 kubelet[2623]: I0813 00:18:49.783666 2623 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-ca-bundle\") pod \"3a23147b-52c7-4db5-9f87-1a85015feabd\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " Aug 13 00:18:49.783879 kubelet[2623]: I0813 00:18:49.783697 2623 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-267bm\" (UniqueName: \"kubernetes.io/projected/3a23147b-52c7-4db5-9f87-1a85015feabd-kube-api-access-267bm\") pod \"3a23147b-52c7-4db5-9f87-1a85015feabd\" (UID: \"3a23147b-52c7-4db5-9f87-1a85015feabd\") " Aug 13 00:18:49.791480 kubelet[2623]: I0813 00:18:49.791189 2623 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "3a23147b-52c7-4db5-9f87-1a85015feabd" (UID: "3a23147b-52c7-4db5-9f87-1a85015feabd"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 13 00:18:49.792420 kubelet[2623]: I0813 00:18:49.792370 2623 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a23147b-52c7-4db5-9f87-1a85015feabd-kube-api-access-267bm" (OuterVolumeSpecName: "kube-api-access-267bm") pod "3a23147b-52c7-4db5-9f87-1a85015feabd" (UID: "3a23147b-52c7-4db5-9f87-1a85015feabd"). InnerVolumeSpecName "kube-api-access-267bm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 13 00:18:49.792755 kubelet[2623]: I0813 00:18:49.792571 2623 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "3a23147b-52c7-4db5-9f87-1a85015feabd" (UID: "3a23147b-52c7-4db5-9f87-1a85015feabd"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 13 00:18:49.885159 kubelet[2623]: I0813 00:18:49.885086 2623 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-267bm\" (UniqueName: \"kubernetes.io/projected/3a23147b-52c7-4db5-9f87-1a85015feabd-kube-api-access-267bm\") on node \"ci-4081-3-5-8-f2ca23fedd\" DevicePath \"\"" Aug 13 00:18:49.885889 kubelet[2623]: I0813 00:18:49.885759 2623 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-backend-key-pair\") on node \"ci-4081-3-5-8-f2ca23fedd\" DevicePath \"\"" Aug 13 00:18:49.885889 kubelet[2623]: I0813 00:18:49.885833 2623 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a23147b-52c7-4db5-9f87-1a85015feabd-whisker-ca-bundle\") on node \"ci-4081-3-5-8-f2ca23fedd\" DevicePath \"\"" Aug 13 00:18:49.935531 systemd[1]: run-netns-cni\x2d1e4d17b2\x2dc990\x2df1ac\x2d4987\x2d463aed84f02a.mount: Deactivated successfully. Aug 13 00:18:49.935817 systemd[1]: var-lib-kubelet-pods-3a23147b\x2d52c7\x2d4db5\x2d9f87\x2d1a85015feabd-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d267bm.mount: Deactivated successfully. Aug 13 00:18:49.935987 systemd[1]: var-lib-kubelet-pods-3a23147b\x2d52c7\x2d4db5\x2d9f87\x2d1a85015feabd-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 13 00:18:49.995594 systemd[1]: Removed slice kubepods-besteffort-pod3a23147b_52c7_4db5_9f87_1a85015feabd.slice - libcontainer container kubepods-besteffort-pod3a23147b_52c7_4db5_9f87_1a85015feabd.slice. Aug 13 00:18:50.324430 systemd[1]: Created slice kubepods-besteffort-pod986dd22b_ad89_4e2b_bd47_6ac70ad1f0ed.slice - libcontainer container kubepods-besteffort-pod986dd22b_ad89_4e2b_bd47_6ac70ad1f0ed.slice. 
Aug 13 00:18:50.389186 kubelet[2623]: I0813 00:18:50.389042 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed-whisker-backend-key-pair\") pod \"whisker-65c4f4585c-p6j6w\" (UID: \"986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed\") " pod="calico-system/whisker-65c4f4585c-p6j6w" Aug 13 00:18:50.389186 kubelet[2623]: I0813 00:18:50.389102 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed-whisker-ca-bundle\") pod \"whisker-65c4f4585c-p6j6w\" (UID: \"986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed\") " pod="calico-system/whisker-65c4f4585c-p6j6w" Aug 13 00:18:50.389186 kubelet[2623]: I0813 00:18:50.389120 2623 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmxk\" (UniqueName: \"kubernetes.io/projected/986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed-kube-api-access-qxmxk\") pod \"whisker-65c4f4585c-p6j6w\" (UID: \"986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed\") " pod="calico-system/whisker-65c4f4585c-p6j6w" Aug 13 00:18:50.631332 containerd[1477]: time="2025-08-13T00:18:50.631133091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65c4f4585c-p6j6w,Uid:986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed,Namespace:calico-system,Attempt:0,}" Aug 13 00:18:50.799424 systemd-networkd[1377]: cali79e420b4b53: Link UP Aug 13 00:18:50.800494 systemd-networkd[1377]: cali79e420b4b53: Gained carrier Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.671 [INFO][3936] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.689 [INFO][3936] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0 whisker-65c4f4585c- calico-system 986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed 932 0 2025-08-13 00:18:50 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:65c4f4585c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd whisker-65c4f4585c-p6j6w eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali79e420b4b53 [] [] }} ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.689 [INFO][3936] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.724 [INFO][3947] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" HandleID="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.724 [INFO][3947] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" HandleID="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b210), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"whisker-65c4f4585c-p6j6w", "timestamp":"2025-08-13 00:18:50.724073139 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.724 [INFO][3947] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.724 [INFO][3947] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.724 [INFO][3947] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.740 [INFO][3947] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.748 [INFO][3947] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.755 [INFO][3947] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.759 [INFO][3947] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.763 [INFO][3947] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.763 [INFO][3947] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.766 [INFO][3947] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.776 [INFO][3947] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.785 [INFO][3947] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.1/26] block=192.168.98.0/26 handle="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.785 [INFO][3947] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.1/26] handle="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 
00:18:50.785 [INFO][3947] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:50.828757 containerd[1477]: 2025-08-13 00:18:50.785 [INFO][3947] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.1/26] IPv6=[] ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" HandleID="k8s-pod-network.41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.788 [INFO][3936] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0", GenerateName:"whisker-65c4f4585c-", Namespace:"calico-system", SelfLink:"", UID:"986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"65c4f4585c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"whisker-65c4f4585c-p6j6w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali79e420b4b53", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.788 [INFO][3936] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.1/32] ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.789 [INFO][3936] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79e420b4b53 ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.801 [INFO][3936] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.802 [INFO][3936] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0", GenerateName:"whisker-65c4f4585c-", Namespace:"calico-system", SelfLink:"", UID:"986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"65c4f4585c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f", Pod:"whisker-65c4f4585c-p6j6w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali79e420b4b53", MAC:"ea:98:de:49:e1:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:50.829478 containerd[1477]: 2025-08-13 00:18:50.824 [INFO][3936] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f" Namespace="calico-system" Pod="whisker-65c4f4585c-p6j6w" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--65c4f4585c--p6j6w-eth0" Aug 13 00:18:50.868314 containerd[1477]: time="2025-08-13T00:18:50.864603853Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:50.868314 containerd[1477]: time="2025-08-13T00:18:50.868050105Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:50.868314 containerd[1477]: time="2025-08-13T00:18:50.868068146Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:50.869776 containerd[1477]: time="2025-08-13T00:18:50.869571683Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:50.918731 systemd[1]: Started cri-containerd-41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f.scope - libcontainer container 41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f. 
Aug 13 00:18:51.014941 containerd[1477]: time="2025-08-13T00:18:51.014884165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65c4f4585c-p6j6w,Uid:986dd22b-ad89-4e2b-bd47-6ac70ad1f0ed,Namespace:calico-system,Attempt:0,} returns sandbox id \"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f\"" Aug 13 00:18:51.034043 containerd[1477]: time="2025-08-13T00:18:51.033978596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 00:18:51.368563 kernel: bpftool[4142]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Aug 13 00:18:51.593812 systemd-networkd[1377]: vxlan.calico: Link UP Aug 13 00:18:51.593820 systemd-networkd[1377]: vxlan.calico: Gained carrier Aug 13 00:18:51.995850 kubelet[2623]: I0813 00:18:51.995614 2623 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a23147b-52c7-4db5-9f87-1a85015feabd" path="/var/lib/kubelet/pods/3a23147b-52c7-4db5-9f87-1a85015feabd/volumes" Aug 13 00:18:52.359781 containerd[1477]: time="2025-08-13T00:18:52.359637195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:52.361944 containerd[1477]: time="2025-08-13T00:18:52.361694469Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Aug 13 00:18:52.363888 containerd[1477]: time="2025-08-13T00:18:52.363834066Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:52.368845 containerd[1477]: time="2025-08-13T00:18:52.368775524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:52.370707 containerd[1477]: time="2025-08-13T00:18:52.370530028Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.33650127s" Aug 13 00:18:52.370707 containerd[1477]: time="2025-08-13T00:18:52.370586750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Aug 13 00:18:52.377672 containerd[1477]: time="2025-08-13T00:18:52.377576762Z" level=info msg="CreateContainer within sandbox \"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 00:18:52.396972 containerd[1477]: time="2025-08-13T00:18:52.396868498Z" level=info msg="CreateContainer within sandbox \"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"be86bd6564e1198e4c635387890e34e91b040b35d5ae19635f144e06aabb1a09\"" Aug 13 00:18:52.399042 containerd[1477]: time="2025-08-13T00:18:52.397789251Z" level=info msg="StartContainer for \"be86bd6564e1198e4c635387890e34e91b040b35d5ae19635f144e06aabb1a09\"" Aug 13 00:18:52.449655 systemd[1]: Started cri-containerd-be86bd6564e1198e4c635387890e34e91b040b35d5ae19635f144e06aabb1a09.scope - libcontainer container 
be86bd6564e1198e4c635387890e34e91b040b35d5ae19635f144e06aabb1a09. Aug 13 00:18:52.494053 containerd[1477]: time="2025-08-13T00:18:52.493982001Z" level=info msg="StartContainer for \"be86bd6564e1198e4c635387890e34e91b040b35d5ae19635f144e06aabb1a09\" returns successfully" Aug 13 00:18:52.498425 containerd[1477]: time="2025-08-13T00:18:52.498239554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 00:18:52.719718 systemd-networkd[1377]: cali79e420b4b53: Gained IPv6LL Aug 13 00:18:53.424601 systemd-networkd[1377]: vxlan.calico: Gained IPv6LL Aug 13 00:18:55.913636 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3280396027.mount: Deactivated successfully. Aug 13 00:18:55.936202 containerd[1477]: time="2025-08-13T00:18:55.936137218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:55.937465 containerd[1477]: time="2025-08-13T00:18:55.937359458Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Aug 13 00:18:55.939546 containerd[1477]: time="2025-08-13T00:18:55.938294529Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:55.942167 containerd[1477]: time="2025-08-13T00:18:55.942101894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:18:55.943600 containerd[1477]: time="2025-08-13T00:18:55.943550621Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 3.445250585s" Aug 13 00:18:55.943846 containerd[1477]: time="2025-08-13T00:18:55.943750428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Aug 13 00:18:55.948942 containerd[1477]: time="2025-08-13T00:18:55.948905717Z" level=info msg="CreateContainer within sandbox \"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 13 00:18:55.975226 containerd[1477]: time="2025-08-13T00:18:55.975152180Z" level=info msg="CreateContainer within sandbox \"41e99e6a1b8a454f0e3d5d88e06e81b91d34e67a6d2874260006ecb73e76007f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"cd2e89b22772c05d4c6f13e85d10ffd5314afb5bed19056b0de119f254e310bb\"" Aug 13 00:18:55.977000 containerd[1477]: time="2025-08-13T00:18:55.976935119Z" level=info msg="StartContainer for \"cd2e89b22772c05d4c6f13e85d10ffd5314afb5bed19056b0de119f254e310bb\"" Aug 13 00:18:56.019753 systemd[1]: Started cri-containerd-cd2e89b22772c05d4c6f13e85d10ffd5314afb5bed19056b0de119f254e310bb.scope - libcontainer container cd2e89b22772c05d4c6f13e85d10ffd5314afb5bed19056b0de119f254e310bb. 
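[Annotation] systemd-networkd reports the vxlan.calico overlay device and the pod's host-side veth (cali79e420b4b53) gaining carrier and IPv6 link-local addresses above. As a small companion sketch, the same links can be looked up with the vishvananda/netlink package, which is the library the Calico dataplane itself drives; the interface names are copied from the entries above and exist only on this node, so this is illustrative rather than authoritative.

package main

import (
	"fmt"

	"github.com/vishvananda/netlink"
)

func main() {
	// Look up the devices the log reports coming up ("Link UP",
	// "Gained carrier", "Gained IPv6LL") and print their kind and
	// operational state.
	for _, name := range []string{"vxlan.calico", "cali79e420b4b53"} {
		link, err := netlink.LinkByName(name)
		if err != nil {
			fmt.Printf("%s: %v\n", name, err)
			continue
		}
		fmt.Printf("%s: type=%s, oper state=%s\n",
			name, link.Type(), link.Attrs().OperState)
	}
}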
Aug 13 00:18:56.078084 containerd[1477]: time="2025-08-13T00:18:56.075858055Z" level=info msg="StartContainer for \"cd2e89b22772c05d4c6f13e85d10ffd5314afb5bed19056b0de119f254e310bb\" returns successfully" Aug 13 00:18:57.989090 containerd[1477]: time="2025-08-13T00:18:57.989000976Z" level=info msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" Aug 13 00:18:57.992871 containerd[1477]: time="2025-08-13T00:18:57.991233725Z" level=info msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" Aug 13 00:18:57.993201 containerd[1477]: time="2025-08-13T00:18:57.991355569Z" level=info msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\"" Aug 13 00:18:57.996305 containerd[1477]: time="2025-08-13T00:18:57.991376889Z" level=info msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" Aug 13 00:18:58.108228 kubelet[2623]: I0813 00:18:58.108151 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-65c4f4585c-p6j6w" podStartSLOduration=3.19380592 podStartE2EDuration="8.108131036s" podCreationTimestamp="2025-08-13 00:18:50 +0000 UTC" firstStartedPulling="2025-08-13 00:18:51.030533028 +0000 UTC m=+41.196907835" lastFinishedPulling="2025-08-13 00:18:55.944858144 +0000 UTC m=+46.111232951" observedRunningTime="2025-08-13 00:18:56.269116813 +0000 UTC m=+46.435491660" watchObservedRunningTime="2025-08-13 00:18:58.108131036 +0000 UTC m=+48.274505803" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.113 [INFO][4339] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.114 [INFO][4339] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" iface="eth0" netns="/var/run/netns/cni-7194ae7d-d9fb-7ba9-7997-18b2d92b2d4c" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.115 [INFO][4339] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" iface="eth0" netns="/var/run/netns/cni-7194ae7d-d9fb-7ba9-7997-18b2d92b2d4c" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.115 [INFO][4339] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" iface="eth0" netns="/var/run/netns/cni-7194ae7d-d9fb-7ba9-7997-18b2d92b2d4c" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.115 [INFO][4339] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.115 [INFO][4339] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.174 [INFO][4375] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.175 [INFO][4375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.175 [INFO][4375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.186 [WARNING][4375] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.187 [INFO][4375] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.190 [INFO][4375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:58.200656 containerd[1477]: 2025-08-13 00:18:58.195 [INFO][4339] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:18:58.205961 systemd[1]: run-netns-cni\x2d7194ae7d\x2dd9fb\x2d7ba9\x2d7997\x2d18b2d92b2d4c.mount: Deactivated successfully. Aug 13 00:18:58.208315 containerd[1477]: time="2025-08-13T00:18:58.208168032Z" level=info msg="TearDown network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" successfully" Aug 13 00:18:58.208315 containerd[1477]: time="2025-08-13T00:18:58.208208913Z" level=info msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" returns successfully" Aug 13 00:18:58.210483 containerd[1477]: time="2025-08-13T00:18:58.210390499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-4lbfd,Uid:b5166162-63e4-49bd-85b9-e1c018b456ba,Namespace:calico-apiserver,Attempt:1,}" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.145 [INFO][4351] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.145 [INFO][4351] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" iface="eth0" netns="/var/run/netns/cni-91e994c3-4971-b764-6041-bb95eab0ef30" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.147 [INFO][4351] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" iface="eth0" netns="/var/run/netns/cni-91e994c3-4971-b764-6041-bb95eab0ef30" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.147 [INFO][4351] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" iface="eth0" netns="/var/run/netns/cni-91e994c3-4971-b764-6041-bb95eab0ef30" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.147 [INFO][4351] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.147 [INFO][4351] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.237 [INFO][4383] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.237 [INFO][4383] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.239 [INFO][4383] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.285 [WARNING][4383] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.286 [INFO][4383] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.291 [INFO][4383] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:58.310815 containerd[1477]: 2025-08-13 00:18:58.303 [INFO][4351] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:18:58.313535 containerd[1477]: time="2025-08-13T00:18:58.312992972Z" level=info msg="TearDown network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" successfully" Aug 13 00:18:58.313777 containerd[1477]: time="2025-08-13T00:18:58.313695193Z" level=info msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" returns successfully" Aug 13 00:18:58.316937 systemd[1]: run-netns-cni\x2d91e994c3\x2d4971\x2db764\x2d6041\x2dbb95eab0ef30.mount: Deactivated successfully. Aug 13 00:18:58.322163 containerd[1477]: time="2025-08-13T00:18:58.321169257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-sw2d2,Uid:f6fff45d-a329-484b-9031-9dfa784fb836,Namespace:calico-apiserver,Attempt:1,}" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.143 [INFO][4347] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.143 [INFO][4347] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" iface="eth0" netns="/var/run/netns/cni-22fa6f15-1eab-7125-79b0-9c2872b3c94c" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.145 [INFO][4347] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" iface="eth0" netns="/var/run/netns/cni-22fa6f15-1eab-7125-79b0-9c2872b3c94c" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.146 [INFO][4347] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" iface="eth0" netns="/var/run/netns/cni-22fa6f15-1eab-7125-79b0-9c2872b3c94c" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.146 [INFO][4347] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.146 [INFO][4347] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.293 [INFO][4382] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.293 [INFO][4382] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.293 [INFO][4382] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.320 [WARNING][4382] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.320 [INFO][4382] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.324 [INFO][4382] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:58.337555 containerd[1477]: 2025-08-13 00:18:58.332 [INFO][4347] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:18:58.338734 containerd[1477]: time="2025-08-13T00:18:58.338425134Z" level=info msg="TearDown network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" successfully" Aug 13 00:18:58.338734 containerd[1477]: time="2025-08-13T00:18:58.338482015Z" level=info msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" returns successfully" Aug 13 00:18:58.344781 containerd[1477]: time="2025-08-13T00:18:58.343020231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hrjxb,Uid:28b87d22-e12e-4c68-948f-91d75e756004,Namespace:kube-system,Attempt:1,}" Aug 13 00:18:58.344273 systemd[1]: run-netns-cni\x2d22fa6f15\x2d1eab\x2d7125\x2d79b0\x2d9c2872b3c94c.mount: Deactivated successfully. Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.162 [INFO][4358] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.163 [INFO][4358] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" iface="eth0" netns="/var/run/netns/cni-dba0a094-d0db-55d7-3b96-f0a585c44ddd" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.163 [INFO][4358] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" iface="eth0" netns="/var/run/netns/cni-dba0a094-d0db-55d7-3b96-f0a585c44ddd" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.164 [INFO][4358] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" iface="eth0" netns="/var/run/netns/cni-dba0a094-d0db-55d7-3b96-f0a585c44ddd" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.164 [INFO][4358] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.164 [INFO][4358] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.330 [INFO][4391] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.331 [INFO][4391] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.331 [INFO][4391] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.358 [WARNING][4391] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.358 [INFO][4391] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.362 [INFO][4391] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:18:58.370285 containerd[1477]: 2025-08-13 00:18:58.366 [INFO][4358] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:18:58.370285 containerd[1477]: time="2025-08-13T00:18:58.369953398Z" level=info msg="TearDown network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" successfully" Aug 13 00:18:58.370285 containerd[1477]: time="2025-08-13T00:18:58.369985319Z" level=info msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" returns successfully" Aug 13 00:18:58.373212 containerd[1477]: time="2025-08-13T00:18:58.372800923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tpx4h,Uid:f7c4b773-00db-432a-98a5-27a94e2aa827,Namespace:calico-system,Attempt:1,}" Aug 13 00:18:58.636581 systemd-networkd[1377]: calib78ad38f13a: Link UP Aug 13 00:18:58.640280 systemd-networkd[1377]: calib78ad38f13a: Gained carrier Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.435 [INFO][4404] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0 calico-apiserver-776d55f88- calico-apiserver b5166162-63e4-49bd-85b9-e1c018b456ba 970 0 2025-08-13 00:18:33 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776d55f88 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd calico-apiserver-776d55f88-4lbfd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib78ad38f13a [] [] }} ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.435 [INFO][4404] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.547 [INFO][4447] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" HandleID="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.548 [INFO][4447] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" HandleID="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"calico-apiserver-776d55f88-4lbfd", "timestamp":"2025-08-13 00:18:58.547824846 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.548 [INFO][4447] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.548 [INFO][4447] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.548 [INFO][4447] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.567 [INFO][4447] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.575 [INFO][4447] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.583 [INFO][4447] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.590 [INFO][4447] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.594 [INFO][4447] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.596 [INFO][4447] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.603 [INFO][4447] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352 Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.612 [INFO][4447] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.622 [INFO][4447] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.2/26] block=192.168.98.0/26 handle="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.623 [INFO][4447] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.2/26] handle="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.623 [INFO][4447] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:18:58.662510 containerd[1477]: 2025-08-13 00:18:58.623 [INFO][4447] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.2/26] IPv6=[] ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" HandleID="k8s-pod-network.d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.629 [INFO][4404] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"b5166162-63e4-49bd-85b9-e1c018b456ba", ResourceVersion:"970", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"calico-apiserver-776d55f88-4lbfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib78ad38f13a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.630 [INFO][4404] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.2/32] ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.630 [INFO][4404] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib78ad38f13a ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.639 [INFO][4404] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.639 [INFO][4404] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"b5166162-63e4-49bd-85b9-e1c018b456ba", ResourceVersion:"970", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352", Pod:"calico-apiserver-776d55f88-4lbfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib78ad38f13a", MAC:"8e:43:84:19:c5:82", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.663094 containerd[1477]: 2025-08-13 00:18:58.652 [INFO][4404] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-4lbfd" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:18:58.701179 containerd[1477]: time="2025-08-13T00:18:58.700162248Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:58.701179 containerd[1477]: time="2025-08-13T00:18:58.700229450Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:58.701179 containerd[1477]: time="2025-08-13T00:18:58.700241771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.701179 containerd[1477]: time="2025-08-13T00:18:58.700349214Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.742865 systemd[1]: Started cri-containerd-d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352.scope - libcontainer container d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352. 
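[Annotation] The StopPodSandbox teardowns earlier in this stretch (cni-plugin/k8s.go 640–653) all follow the same delete path: the workload's veth is already gone, so the plugin only releases the IPAM reservation — first by handle ID, then, when the handle no longer exists ("Asked to release address but it doesn't exist. Ignoring"), by workload ID. A one-function sketch of the handle-based release follows, using the same clientv3 interface as the earlier snippet; the method name matches libcalico-go's IPAM interface as I understand it, but treat the exact signature as an assumption.

package teardown

import (
	"context"

	"github.com/projectcalico/calico/libcalico-go/lib/clientv3"
)

// releasePodIPs mirrors the teardown path logged above: everything that was
// reserved under the sandbox's handle ("k8s-pod-network.<containerID>") is
// released in one call. A missing handle is the WARNING case in the log,
// after which the plugin falls back to releasing by workload ID.
func releasePodIPs(ctx context.Context, c clientv3.Interface, containerID string) error {
	return c.IPAM().ReleaseByHandle(ctx, "k8s-pod-network."+containerID)
}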
Aug 13 00:18:58.747980 systemd-networkd[1377]: calid4492d0302f: Link UP Aug 13 00:18:58.752346 systemd-networkd[1377]: calid4492d0302f: Gained carrier Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.479 [INFO][4417] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0 calico-apiserver-776d55f88- calico-apiserver f6fff45d-a329-484b-9031-9dfa784fb836 973 0 2025-08-13 00:18:33 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776d55f88 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd calico-apiserver-776d55f88-sw2d2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid4492d0302f [] [] }} ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.479 [INFO][4417] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.560 [INFO][4455] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" HandleID="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.562 [INFO][4455] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" HandleID="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d930), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"calico-apiserver-776d55f88-sw2d2", "timestamp":"2025-08-13 00:18:58.560385342 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.562 [INFO][4455] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.624 [INFO][4455] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.624 [INFO][4455] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.671 [INFO][4455] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.679 [INFO][4455] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.687 [INFO][4455] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.694 [INFO][4455] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.700 [INFO][4455] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.701 [INFO][4455] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.703 [INFO][4455] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.713 [INFO][4455] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4455] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.3/26] block=192.168.98.0/26 handle="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4455] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.3/26] handle="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4455] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:18:58.774742 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4455] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.3/26] IPv6=[] ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" HandleID="k8s-pod-network.e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.734 [INFO][4417] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"f6fff45d-a329-484b-9031-9dfa784fb836", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"calico-apiserver-776d55f88-sw2d2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4492d0302f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.735 [INFO][4417] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.3/32] ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.735 [INFO][4417] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid4492d0302f ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.749 [INFO][4417] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.750 [INFO][4417] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"f6fff45d-a329-484b-9031-9dfa784fb836", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e", Pod:"calico-apiserver-776d55f88-sw2d2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4492d0302f", MAC:"fa:fa:05:fc:5b:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.775289 containerd[1477]: 2025-08-13 00:18:58.768 [INFO][4417] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e" Namespace="calico-apiserver" Pod="calico-apiserver-776d55f88-sw2d2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:18:58.822471 containerd[1477]: time="2025-08-13T00:18:58.821524003Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:58.822632 containerd[1477]: time="2025-08-13T00:18:58.821708409Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:58.822632 containerd[1477]: time="2025-08-13T00:18:58.821731410Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.822632 containerd[1477]: time="2025-08-13T00:18:58.821908495Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.851064 systemd-networkd[1377]: caliabb883014ef: Link UP Aug 13 00:18:58.851882 systemd-networkd[1377]: caliabb883014ef: Gained carrier Aug 13 00:18:58.867747 systemd[1]: Started cri-containerd-e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e.scope - libcontainer container e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e. 
Aug 13 00:18:58.888915 containerd[1477]: time="2025-08-13T00:18:58.887166809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-4lbfd,Uid:b5166162-63e4-49bd-85b9-e1c018b456ba,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352\"" Aug 13 00:18:58.902063 containerd[1477]: time="2025-08-13T00:18:58.901456597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.483 [INFO][4416] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0 coredns-674b8bbfcf- kube-system 28b87d22-e12e-4c68-948f-91d75e756004 974 0 2025-08-13 00:18:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd coredns-674b8bbfcf-hrjxb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliabb883014ef [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.484 [INFO][4416] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.562 [INFO][4457] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" HandleID="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.562 [INFO][4457] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" HandleID="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d790), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"coredns-674b8bbfcf-hrjxb", "timestamp":"2025-08-13 00:18:58.562429483 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.563 [INFO][4457] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4457] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.725 [INFO][4457] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.767 [INFO][4457] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.781 [INFO][4457] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.793 [INFO][4457] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.800 [INFO][4457] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.806 [INFO][4457] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.806 [INFO][4457] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.810 [INFO][4457] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.818 [INFO][4457] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4457] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.4/26] block=192.168.98.0/26 handle="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4457] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.4/26] handle="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4457] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:18:58.905087 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4457] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.4/26] IPv6=[] ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" HandleID="k8s-pod-network.6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.842 [INFO][4416] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"28b87d22-e12e-4c68-948f-91d75e756004", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"coredns-674b8bbfcf-hrjxb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliabb883014ef", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.842 [INFO][4416] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.4/32] ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.842 [INFO][4416] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliabb883014ef ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.847 [INFO][4416] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.850 [INFO][4416] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"28b87d22-e12e-4c68-948f-91d75e756004", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e", Pod:"coredns-674b8bbfcf-hrjxb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliabb883014ef", MAC:"d6:2a:be:72:4c:33", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.905866 containerd[1477]: 2025-08-13 00:18:58.897 [INFO][4416] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e" Namespace="kube-system" Pod="coredns-674b8bbfcf-hrjxb" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:18:58.962896 systemd-networkd[1377]: calia8d5098e7b4: Link UP Aug 13 00:18:58.963134 systemd-networkd[1377]: calia8d5098e7b4: Gained carrier Aug 13 00:18:58.977480 containerd[1477]: time="2025-08-13T00:18:58.972652690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776d55f88-sw2d2,Uid:f6fff45d-a329-484b-9031-9dfa784fb836,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e\"" Aug 13 00:18:58.980360 containerd[1477]: time="2025-08-13T00:18:58.978664350Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:58.980360 containerd[1477]: time="2025-08-13T00:18:58.978738792Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:58.980360 containerd[1477]: time="2025-08-13T00:18:58.978754913Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.980360 containerd[1477]: time="2025-08-13T00:18:58.978892997Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.542 [INFO][4435] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0 goldmane-768f4c5c69- calico-system f7c4b773-00db-432a-98a5-27a94e2aa827 976 0 2025-08-13 00:18:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd goldmane-768f4c5c69-tpx4h eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia8d5098e7b4 [] [] }} ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.543 [INFO][4435] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.625 [INFO][4473] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" HandleID="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.625 [INFO][4473] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" HandleID="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b210), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"goldmane-768f4c5c69-tpx4h", "timestamp":"2025-08-13 00:18:58.624999957 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.625 [INFO][4473] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4473] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.832 [INFO][4473] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.875 [INFO][4473] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.887 [INFO][4473] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.910 [INFO][4473] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.914 [INFO][4473] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.923 [INFO][4473] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.923 [INFO][4473] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.926 [INFO][4473] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13 Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.934 [INFO][4473] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.950 [INFO][4473] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.5/26] block=192.168.98.0/26 handle="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.951 [INFO][4473] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.5/26] handle="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.951 [INFO][4473] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
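
Each allocation ends with an ipam_plugin.go 283 summary entry pairing the claimed address with its workload endpoint (the coredns pod got 192.168.98.4 above; the goldmane pod's 192.168.98.5 summary follows immediately below). A small sketch for pulling those pairs out of a log like this one; the regex targets the exact fields visible in these entries, and the truncated IDs in the sample line are placeholders:

    import re

    # Matches the "ipam_plugin.go 283: Calico CNI IPAM assigned addresses" entries.
    ASSIGN_RE = re.compile(
        r'Calico CNI IPAM assigned addresses IPv4=\[([^\]]*)\]'
        r'.*?Workload="([^"]+)"'
    )

    def assignments(lines):
        """Yield (workload endpoint, [ipv4/len, ...]) pairs from containerd/Calico logs."""
        for line in lines:
            m = ASSIGN_RE.search(line)
            if m:
                yield m.group(2), m.group(1).split(",")

    sample = ('2025-08-13 00:18:58.951 [INFO][4473] ipam/ipam_plugin.go 283: '
              'Calico CNI IPAM assigned addresses IPv4=[192.168.98.5/26] IPv6=[] '
              'ContainerID="80b95f09..." HandleID="k8s-pod-network.80b95f09..." '
              'Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0"')
    for workload, ips in assignments([sample]):
        print(workload, ips)   # -> ...goldmane--768f4c5c69--tpx4h-eth0 ['192.168.98.5/26']
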
Aug 13 00:18:58.997242 containerd[1477]: 2025-08-13 00:18:58.951 [INFO][4473] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.5/26] IPv6=[] ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" HandleID="k8s-pod-network.80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.956 [INFO][4435] cni-plugin/k8s.go 418: Populated endpoint ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"f7c4b773-00db-432a-98a5-27a94e2aa827", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"goldmane-768f4c5c69-tpx4h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8d5098e7b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.957 [INFO][4435] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.5/32] ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.957 [INFO][4435] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8d5098e7b4 ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.964 [INFO][4435] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.965 [INFO][4435] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" 
Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"f7c4b773-00db-432a-98a5-27a94e2aa827", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13", Pod:"goldmane-768f4c5c69-tpx4h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8d5098e7b4", MAC:"26:0c:49:8b:d1:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:18:58.998260 containerd[1477]: 2025-08-13 00:18:58.993 [INFO][4435] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13" Namespace="calico-system" Pod="goldmane-768f4c5c69-tpx4h" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:18:59.014312 systemd[1]: Started cri-containerd-6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e.scope - libcontainer container 6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e. Aug 13 00:18:59.036331 containerd[1477]: time="2025-08-13T00:18:59.036188601Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:18:59.036331 containerd[1477]: time="2025-08-13T00:18:59.036258563Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:18:59.036331 containerd[1477]: time="2025-08-13T00:18:59.036287084Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:59.038675 containerd[1477]: time="2025-08-13T00:18:59.038377385Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:18:59.081669 systemd[1]: Started cri-containerd-80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13.scope - libcontainer container 80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13. 
Aug 13 00:18:59.087000 containerd[1477]: time="2025-08-13T00:18:59.086872873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hrjxb,Uid:28b87d22-e12e-4c68-948f-91d75e756004,Namespace:kube-system,Attempt:1,} returns sandbox id \"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e\"" Aug 13 00:18:59.095399 containerd[1477]: time="2025-08-13T00:18:59.094802063Z" level=info msg="CreateContainer within sandbox \"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:18:59.114750 containerd[1477]: time="2025-08-13T00:18:59.114693041Z" level=info msg="CreateContainer within sandbox \"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dccbf53b0e20331194f4a39127c3d2fb65e34d1d1215624e5f4d7ee48aff7050\"" Aug 13 00:18:59.118024 containerd[1477]: time="2025-08-13T00:18:59.116764501Z" level=info msg="StartContainer for \"dccbf53b0e20331194f4a39127c3d2fb65e34d1d1215624e5f4d7ee48aff7050\"" Aug 13 00:18:59.154978 systemd[1]: Started cri-containerd-dccbf53b0e20331194f4a39127c3d2fb65e34d1d1215624e5f4d7ee48aff7050.scope - libcontainer container dccbf53b0e20331194f4a39127c3d2fb65e34d1d1215624e5f4d7ee48aff7050. Aug 13 00:18:59.160687 containerd[1477]: time="2025-08-13T00:18:59.160648976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tpx4h,Uid:f7c4b773-00db-432a-98a5-27a94e2aa827,Namespace:calico-system,Attempt:1,} returns sandbox id \"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13\"" Aug 13 00:18:59.197412 containerd[1477]: time="2025-08-13T00:18:59.196407014Z" level=info msg="StartContainer for \"dccbf53b0e20331194f4a39127c3d2fb65e34d1d1215624e5f4d7ee48aff7050\" returns successfully" Aug 13 00:18:59.218668 systemd[1]: run-netns-cni\x2ddba0a094\x2dd0db\x2d55d7\x2d3b96\x2df0a585c44ddd.mount: Deactivated successfully. Aug 13 00:18:59.958527 systemd-networkd[1377]: calib78ad38f13a: Gained IPv6LL Aug 13 00:18:59.987452 containerd[1477]: time="2025-08-13T00:18:59.987362584Z" level=info msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" Aug 13 00:18:59.987864 containerd[1477]: time="2025-08-13T00:18:59.987811837Z" level=info msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" Aug 13 00:19:00.017097 systemd-networkd[1377]: caliabb883014ef: Gained IPv6LL Aug 13 00:19:00.083356 kubelet[2623]: I0813 00:19:00.083239 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-hrjxb" podStartSLOduration=44.083218016000004 podStartE2EDuration="44.083218016s" podCreationTimestamp="2025-08-13 00:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:18:59.310562049 +0000 UTC m=+49.476936856" watchObservedRunningTime="2025-08-13 00:19:00.083218016 +0000 UTC m=+50.249592823" Aug 13 00:19:00.144944 systemd-networkd[1377]: calia8d5098e7b4: Gained IPv6LL Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.083 [INFO][4741] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.084 [INFO][4741] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" iface="eth0" netns="/var/run/netns/cni-2a249505-942c-3428-dabf-63a8e7b90835" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.085 [INFO][4741] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" iface="eth0" netns="/var/run/netns/cni-2a249505-942c-3428-dabf-63a8e7b90835" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.085 [INFO][4741] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" iface="eth0" netns="/var/run/netns/cni-2a249505-942c-3428-dabf-63a8e7b90835" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.085 [INFO][4741] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.085 [INFO][4741] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.132 [INFO][4753] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.132 [INFO][4753] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.132 [INFO][4753] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.144 [WARNING][4753] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.144 [INFO][4753] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.147 [INFO][4753] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:00.156743 containerd[1477]: 2025-08-13 00:19:00.152 [INFO][4741] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:00.157722 containerd[1477]: time="2025-08-13T00:19:00.157533788Z" level=info msg="TearDown network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" successfully" Aug 13 00:19:00.157722 containerd[1477]: time="2025-08-13T00:19:00.157607671Z" level=info msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" returns successfully" Aug 13 00:19:00.161726 systemd[1]: run-netns-cni\x2d2a249505\x2d942c\x2d3428\x2ddabf\x2d63a8e7b90835.mount: Deactivated successfully. Aug 13 00:19:00.166104 containerd[1477]: time="2025-08-13T00:19:00.166064709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66459bfc84-krdnk,Uid:44cb98ea-961e-4490-ac08-a70d6d62cf40,Namespace:calico-system,Attempt:1,}" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.095 [INFO][4740] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.096 [INFO][4740] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" iface="eth0" netns="/var/run/netns/cni-4446df02-1b6b-f485-97df-2b31c2d807a7" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.098 [INFO][4740] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" iface="eth0" netns="/var/run/netns/cni-4446df02-1b6b-f485-97df-2b31c2d807a7" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.099 [INFO][4740] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" iface="eth0" netns="/var/run/netns/cni-4446df02-1b6b-f485-97df-2b31c2d807a7" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.099 [INFO][4740] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.099 [INFO][4740] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.134 [INFO][4758] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.134 [INFO][4758] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.148 [INFO][4758] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.173 [WARNING][4758] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.173 [INFO][4758] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.178 [INFO][4758] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:00.186069 containerd[1477]: 2025-08-13 00:19:00.181 [INFO][4740] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:00.193335 containerd[1477]: time="2025-08-13T00:19:00.186370440Z" level=info msg="TearDown network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" successfully" Aug 13 00:19:00.193335 containerd[1477]: time="2025-08-13T00:19:00.186405121Z" level=info msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" returns successfully" Aug 13 00:19:00.193335 containerd[1477]: time="2025-08-13T00:19:00.191320380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hgndp,Uid:549a8f55-4e42-4ab3-8e46-bed16219fe3c,Namespace:calico-system,Attempt:1,}" Aug 13 00:19:00.191432 systemd[1]: run-netns-cni\x2d4446df02\x2d1b6b\x2df485\x2d97df\x2d2b31c2d807a7.mount: Deactivated successfully. Aug 13 00:19:00.570537 systemd-networkd[1377]: cali1f825409371: Link UP Aug 13 00:19:00.574274 systemd-networkd[1377]: cali1f825409371: Gained carrier Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.378 [INFO][4766] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0 calico-kube-controllers-66459bfc84- calico-system 44cb98ea-961e-4490-ac08-a70d6d62cf40 1006 0 2025-08-13 00:18:35 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:66459bfc84 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd calico-kube-controllers-66459bfc84-krdnk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1f825409371 [] [] }} ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.379 [INFO][4766] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.467 [INFO][4794] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" HandleID="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.467 [INFO][4794] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" HandleID="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000393b00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"calico-kube-controllers-66459bfc84-krdnk", "timestamp":"2025-08-13 00:19:00.467098945 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.467 [INFO][4794] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.467 [INFO][4794] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.467 [INFO][4794] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.484 [INFO][4794] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.496 [INFO][4794] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.510 [INFO][4794] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.515 [INFO][4794] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.522 [INFO][4794] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.522 [INFO][4794] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.526 [INFO][4794] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.535 [INFO][4794] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.548 [INFO][4794] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.6/26] block=192.168.98.0/26 
handle="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.548 [INFO][4794] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.6/26] handle="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.551 [INFO][4794] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:00.615558 containerd[1477]: 2025-08-13 00:19:00.554 [INFO][4794] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.6/26] IPv6=[] ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" HandleID="k8s-pod-network.8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.559 [INFO][4766] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0", GenerateName:"calico-kube-controllers-66459bfc84-", Namespace:"calico-system", SelfLink:"", UID:"44cb98ea-961e-4490-ac08-a70d6d62cf40", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66459bfc84", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"calico-kube-controllers-66459bfc84-krdnk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1f825409371", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.559 [INFO][4766] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.6/32] ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.559 [INFO][4766] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f825409371 ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" 
Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.575 [INFO][4766] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.576 [INFO][4766] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0", GenerateName:"calico-kube-controllers-66459bfc84-", Namespace:"calico-system", SelfLink:"", UID:"44cb98ea-961e-4490-ac08-a70d6d62cf40", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66459bfc84", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a", Pod:"calico-kube-controllers-66459bfc84-krdnk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1f825409371", MAC:"7a:9a:9e:e1:7d:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:00.616204 containerd[1477]: 2025-08-13 00:19:00.596 [INFO][4766] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a" Namespace="calico-system" Pod="calico-kube-controllers-66459bfc84-krdnk" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:00.672224 containerd[1477]: time="2025-08-13T00:19:00.670682598Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:19:00.672224 containerd[1477]: time="2025-08-13T00:19:00.670773440Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:19:00.672224 containerd[1477]: time="2025-08-13T00:19:00.670785720Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:00.672224 containerd[1477]: time="2025-08-13T00:19:00.670959485Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:00.692432 systemd-networkd[1377]: calidf4a76dc10d: Link UP Aug 13 00:19:00.692796 systemd-networkd[1377]: calidf4a76dc10d: Gained carrier Aug 13 00:19:00.719772 systemd[1]: Started cri-containerd-8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a.scope - libcontainer container 8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a. Aug 13 00:19:00.720349 systemd-networkd[1377]: calid4492d0302f: Gained IPv6LL Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.401 [INFO][4775] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0 csi-node-driver- calico-system 549a8f55-4e42-4ab3-8e46-bed16219fe3c 1007 0 2025-08-13 00:18:35 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd csi-node-driver-hgndp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidf4a76dc10d [] [] }} ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.401 [INFO][4775] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.473 [INFO][4799] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" HandleID="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.478 [INFO][4799] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" HandleID="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3930), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"csi-node-driver-hgndp", "timestamp":"2025-08-13 00:19:00.473869616 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 
00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.478 [INFO][4799] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.548 [INFO][4799] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.548 [INFO][4799] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.587 [INFO][4799] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.608 [INFO][4799] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.623 [INFO][4799] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.630 [INFO][4799] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.637 [INFO][4799] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.637 [INFO][4799] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.642 [INFO][4799] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614 Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.651 [INFO][4799] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.666 [INFO][4799] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.7/26] block=192.168.98.0/26 handle="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.666 [INFO][4799] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.7/26] handle="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.666 [INFO][4799] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
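
With this claim the node's affine block has handed out 192.168.98.4 through .7 in strict sequence (.4 coredns-hrjxb, .5 goldmane, .6 calico-kube-controllers, .7 csi-node-driver), and .8 follows a few entries below. A quick sanity check on the block geometry with Python's ipaddress module:

    import ipaddress

    block = ipaddress.ip_network("192.168.98.0/26")
    print(block.num_addresses)        # 64 addresses, .0 through .63
    print(list(block.hosts())[3:8])   # .4 .. .8, the five claimed in this section
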
Aug 13 00:19:00.735284 containerd[1477]: 2025-08-13 00:19:00.666 [INFO][4799] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.7/26] IPv6=[] ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" HandleID="k8s-pod-network.a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.683 [INFO][4775] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"549a8f55-4e42-4ab3-8e46-bed16219fe3c", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"csi-node-driver-hgndp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf4a76dc10d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.684 [INFO][4775] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.7/32] ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.685 [INFO][4775] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidf4a76dc10d ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.691 [INFO][4775] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.694 [INFO][4775] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"549a8f55-4e42-4ab3-8e46-bed16219fe3c", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614", Pod:"csi-node-driver-hgndp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf4a76dc10d", MAC:"be:c1:b4:67:f6:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:00.736482 containerd[1477]: 2025-08-13 00:19:00.726 [INFO][4775] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614" Namespace="calico-system" Pod="csi-node-driver-hgndp" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:00.788871 containerd[1477]: time="2025-08-13T00:19:00.786895670Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:19:00.789149 containerd[1477]: time="2025-08-13T00:19:00.788953408Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:19:00.789149 containerd[1477]: time="2025-08-13T00:19:00.788978129Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:00.789654 containerd[1477]: time="2025-08-13T00:19:00.789508823Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:00.807660 containerd[1477]: time="2025-08-13T00:19:00.807603413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66459bfc84-krdnk,Uid:44cb98ea-961e-4490-ac08-a70d6d62cf40,Namespace:calico-system,Attempt:1,} returns sandbox id \"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a\"" Aug 13 00:19:00.835699 systemd[1]: Started cri-containerd-a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614.scope - libcontainer container a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614. Aug 13 00:19:00.878820 containerd[1477]: time="2025-08-13T00:19:00.878776817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hgndp,Uid:549a8f55-4e42-4ab3-8e46-bed16219fe3c,Namespace:calico-system,Attempt:1,} returns sandbox id \"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614\"" Aug 13 00:19:00.989718 containerd[1477]: time="2025-08-13T00:19:00.988428785Z" level=info msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.072 [INFO][4919] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.073 [INFO][4919] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" iface="eth0" netns="/var/run/netns/cni-6d86ee5b-be89-5b31-da56-7c7359bb13c0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.074 [INFO][4919] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" iface="eth0" netns="/var/run/netns/cni-6d86ee5b-be89-5b31-da56-7c7359bb13c0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.074 [INFO][4919] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" iface="eth0" netns="/var/run/netns/cni-6d86ee5b-be89-5b31-da56-7c7359bb13c0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.074 [INFO][4919] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.074 [INFO][4919] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.103 [INFO][4926] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.103 [INFO][4926] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.103 [INFO][4926] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.114 [WARNING][4926] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.114 [INFO][4926] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.117 [INFO][4926] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:01.122671 containerd[1477]: 2025-08-13 00:19:01.120 [INFO][4919] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:01.123770 containerd[1477]: time="2025-08-13T00:19:01.123733169Z" level=info msg="TearDown network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" successfully" Aug 13 00:19:01.123849 containerd[1477]: time="2025-08-13T00:19:01.123834332Z" level=info msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" returns successfully" Aug 13 00:19:01.125222 containerd[1477]: time="2025-08-13T00:19:01.124835159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbkv2,Uid:4bba616f-1733-479f-98eb-88700a24fca2,Namespace:kube-system,Attempt:1,}" Aug 13 00:19:01.214980 systemd[1]: run-netns-cni\x2d6d86ee5b\x2dbe89\x2d5b31\x2dda56\x2d7c7359bb13c0.mount: Deactivated successfully. Aug 13 00:19:01.361112 systemd-networkd[1377]: cali56926639e40: Link UP Aug 13 00:19:01.363302 systemd-networkd[1377]: cali56926639e40: Gained carrier Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.217 [INFO][4933] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0 coredns-674b8bbfcf- kube-system 4bba616f-1733-479f-98eb-88700a24fca2 1025 0 2025-08-13 00:18:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-8-f2ca23fedd coredns-674b8bbfcf-dbkv2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali56926639e40 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.218 [INFO][4933] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.267 [INFO][4945] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" HandleID="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" 
Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.267 [INFO][4945] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" HandleID="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b0a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-8-f2ca23fedd", "pod":"coredns-674b8bbfcf-dbkv2", "timestamp":"2025-08-13 00:19:01.267736981 +0000 UTC"}, Hostname:"ci-4081-3-5-8-f2ca23fedd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.268 [INFO][4945] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.268 [INFO][4945] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.268 [INFO][4945] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-8-f2ca23fedd' Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.284 [INFO][4945] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.292 [INFO][4945] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.303 [INFO][4945] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.306 [INFO][4945] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.310 [INFO][4945] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.310 [INFO][4945] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.313 [INFO][4945] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6 Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.321 [INFO][4945] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.338 [INFO][4945] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.8/26] block=192.168.98.0/26 handle="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.338 [INFO][4945] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.8/26] 
handle="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" host="ci-4081-3-5-8-f2ca23fedd" Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.338 [INFO][4945] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:01.409771 containerd[1477]: 2025-08-13 00:19:01.338 [INFO][4945] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.8/26] IPv6=[] ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" HandleID="k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.345 [INFO][4933] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4bba616f-1733-479f-98eb-88700a24fca2", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"", Pod:"coredns-674b8bbfcf-dbkv2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56926639e40", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.346 [INFO][4933] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.8/32] ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.346 [INFO][4933] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali56926639e40 ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 
13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.364 [INFO][4933] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.369 [INFO][4933] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4bba616f-1733-479f-98eb-88700a24fca2", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6", Pod:"coredns-674b8bbfcf-dbkv2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56926639e40", MAC:"2e:d1:64:12:45:c2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:01.410751 containerd[1477]: 2025-08-13 00:19:01.403 [INFO][4933] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbkv2" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:01.473889 containerd[1477]: time="2025-08-13T00:19:01.473780486Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 13 00:19:01.473889 containerd[1477]: time="2025-08-13T00:19:01.473847208Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 13 00:19:01.473889 containerd[1477]: time="2025-08-13T00:19:01.473862808Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:01.474113 containerd[1477]: time="2025-08-13T00:19:01.473955651Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 13 00:19:01.516419 systemd[1]: Started cri-containerd-06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6.scope - libcontainer container 06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6. Aug 13 00:19:01.600094 containerd[1477]: time="2025-08-13T00:19:01.600002572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbkv2,Uid:4bba616f-1733-479f-98eb-88700a24fca2,Namespace:kube-system,Attempt:1,} returns sandbox id \"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6\"" Aug 13 00:19:01.613041 containerd[1477]: time="2025-08-13T00:19:01.612994047Z" level=info msg="CreateContainer within sandbox \"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:19:01.649038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4063538736.mount: Deactivated successfully. Aug 13 00:19:01.650403 containerd[1477]: time="2025-08-13T00:19:01.650269985Z" level=info msg="CreateContainer within sandbox \"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7d6ee8d7cbe1cc41175f00a34af8a789423f6515eb1d7aabd287c7ad915664b7\"" Aug 13 00:19:01.652027 containerd[1477]: time="2025-08-13T00:19:01.651502778Z" level=info msg="StartContainer for \"7d6ee8d7cbe1cc41175f00a34af8a789423f6515eb1d7aabd287c7ad915664b7\"" Aug 13 00:19:01.685673 systemd[1]: Started cri-containerd-7d6ee8d7cbe1cc41175f00a34af8a789423f6515eb1d7aabd287c7ad915664b7.scope - libcontainer container 7d6ee8d7cbe1cc41175f00a34af8a789423f6515eb1d7aabd287c7ad915664b7. 
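The IPAM entries above trace the whole assignment path for the new coredns endpoint: look up the host's block affinities, confirm affinity for 192.168.98.0/26, load the block under the host-wide IPAM lock, create the handle k8s-pod-network.06e53108..., claim 192.168.98.8, and release the lock. Below is a minimal, self-contained sketch of that pattern in Go. It is not Calico's implementation; the block type, the assign method, and the single in-process mutex standing in for the host-wide lock are illustrative assumptions.

```go
// Hypothetical sketch of per-block IP assignment under a host-wide lock,
// mirroring the ipam/ipam.go entries above (affinity block 192.168.98.0/26,
// one address per request, first free address wins).
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type block struct {
	mu     sync.Mutex            // stands in for the "host-wide IPAM lock"
	prefix netip.Prefix          // the affine block, e.g. 192.168.98.0/26
	used   map[netip.Addr]string // address -> handle ID
}

func newBlock(cidr string) *block {
	return &block{prefix: netip.MustParsePrefix(cidr), used: map[netip.Addr]string{}}
}

// assign claims the first free address in the block for the given handle.
func (b *block) assign(handle string) (netip.Addr, bool) {
	b.mu.Lock() // "About to acquire host-wide IPAM lock." / "Acquired host-wide IPAM lock."
	defer b.mu.Unlock()
	for a := b.prefix.Addr(); b.prefix.Contains(a); a = a.Next() {
		if _, taken := b.used[a]; !taken {
			b.used[a] = handle
			return a, true // "Auto-assigned 1 out of 1 IPv4s"
		}
	}
	return netip.Addr{}, false // block exhausted; real IPAM would try another block
}

func main() {
	blk := newBlock("192.168.98.0/26")
	// Pre-claim .0 through .7 so the next free address is .8, as in the log.
	for i := 0; i < 8; i++ {
		blk.assign(fmt.Sprintf("existing-%d", i))
	}
	ip, _ := blk.assign("k8s-pod-network.06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6")
	fmt.Println("assigned:", ip) // assigned: 192.168.98.8
}
```

Two details of the endpoint dumps above are easy to misread: IPAM reports the claimed address with the block's prefix length (192.168.98.8/26) while the WorkloadEndpoint stores it as a host route (192.168.98.8/32), and the ports are printed in hex, so Port:0x35 is 53 (dns, dns-tcp) and Port:0x23c1 is 9153 (the CoreDNS metrics port).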
Aug 13 00:19:01.740632 containerd[1477]: time="2025-08-13T00:19:01.740393125Z" level=info msg="StartContainer for \"7d6ee8d7cbe1cc41175f00a34af8a789423f6515eb1d7aabd287c7ad915664b7\" returns successfully" Aug 13 00:19:01.910398 containerd[1477]: time="2025-08-13T00:19:01.910208802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:01.911819 containerd[1477]: time="2025-08-13T00:19:01.911575039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Aug 13 00:19:01.913335 containerd[1477]: time="2025-08-13T00:19:01.913228364Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:01.917815 containerd[1477]: time="2025-08-13T00:19:01.917722647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:01.919084 containerd[1477]: time="2025-08-13T00:19:01.918944000Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 3.017441122s" Aug 13 00:19:01.919084 containerd[1477]: time="2025-08-13T00:19:01.918991282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 13 00:19:01.921546 containerd[1477]: time="2025-08-13T00:19:01.921223822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:19:01.927245 containerd[1477]: time="2025-08-13T00:19:01.926997460Z" level=info msg="CreateContainer within sandbox \"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:19:01.948499 containerd[1477]: time="2025-08-13T00:19:01.948213679Z" level=info msg="CreateContainer within sandbox \"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"570fadbb464cf92e1e399130c28dd4b2d87609f1543b09b81d84009e6a803d16\"" Aug 13 00:19:01.950786 containerd[1477]: time="2025-08-13T00:19:01.949321390Z" level=info msg="StartContainer for \"570fadbb464cf92e1e399130c28dd4b2d87609f1543b09b81d84009e6a803d16\"" Aug 13 00:19:01.990752 systemd[1]: Started cri-containerd-570fadbb464cf92e1e399130c28dd4b2d87609f1543b09b81d84009e6a803d16.scope - libcontainer container 570fadbb464cf92e1e399130c28dd4b2d87609f1543b09b81d84009e6a803d16. 
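The transient mount units above, run-netns-cni\x2d6d86ee5b\x2dbe89\x2d5b31\x2dda56\x2d7c7359bb13c0.mount and var-lib-containerd-tmpmounts-containerd\x2dmount4063538736.mount, are systemd's escaped spellings of filesystem paths: "/" becomes "-", and bytes outside [A-Za-z0-9:_.], including literal dashes, become \xNN escapes. A simplified sketch of that escaping follows; real systemd-escape(1) covers more edge cases (leading dots, empty strings), so treat this as an approximation.

```go
// Approximate systemd path-to-unit-name escaping, enough to reproduce the
// netns mount unit named in the log above.
package main

import (
	"fmt"
	"strings"
)

func escapePath(path string) string {
	var b strings.Builder
	for i, comp := range strings.Split(strings.Trim(path, "/"), "/") {
		if i > 0 {
			b.WriteByte('-') // the path separator maps to a plain dash
		}
		for j := 0; j < len(comp); j++ {
			switch c := comp[j]; {
			case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z',
				c >= '0' && c <= '9', c == '_', c == ':', c == '.':
				b.WriteByte(c)
			default:
				fmt.Fprintf(&b, `\x%02x`, c) // e.g. a literal '-' becomes \x2d
			}
		}
	}
	return b.String()
}

func main() {
	fmt.Println(escapePath("/run/netns/cni-6d86ee5b-be89-5b31-da56-7c7359bb13c0") + ".mount")
	// run-netns-cni\x2d6d86ee5b\x2dbe89\x2d5b31\x2dda56\x2d7c7359bb13c0.mount
}
```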
Aug 13 00:19:02.043818 containerd[1477]: time="2025-08-13T00:19:02.043075874Z" level=info msg="StartContainer for \"570fadbb464cf92e1e399130c28dd4b2d87609f1543b09b81d84009e6a803d16\" returns successfully" Aug 13 00:19:02.191792 systemd-networkd[1377]: cali1f825409371: Gained IPv6LL Aug 13 00:19:02.339155 kubelet[2623]: I0813 00:19:02.338669 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dbkv2" podStartSLOduration=46.338648659 podStartE2EDuration="46.338648659s" podCreationTimestamp="2025-08-13 00:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:19:02.334813517 +0000 UTC m=+52.501188284" watchObservedRunningTime="2025-08-13 00:19:02.338648659 +0000 UTC m=+52.505023426" Aug 13 00:19:02.361914 containerd[1477]: time="2025-08-13T00:19:02.361858113Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:02.366148 containerd[1477]: time="2025-08-13T00:19:02.366101946Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 13 00:19:02.371172 containerd[1477]: time="2025-08-13T00:19:02.371076998Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 449.809774ms" Aug 13 00:19:02.371330 containerd[1477]: time="2025-08-13T00:19:02.371169840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 13 00:19:02.374187 containerd[1477]: time="2025-08-13T00:19:02.374140679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 13 00:19:02.381721 containerd[1477]: time="2025-08-13T00:19:02.381673638Z" level=info msg="CreateContainer within sandbox \"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:19:02.404343 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount148903803.mount: Deactivated successfully. Aug 13 00:19:02.424907 containerd[1477]: time="2025-08-13T00:19:02.424837701Z" level=info msg="CreateContainer within sandbox \"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"45ef4a2df1290f41d11958d3870bde7a1cac7027ffa5f087c34959803ee5e849\"" Aug 13 00:19:02.427312 containerd[1477]: time="2025-08-13T00:19:02.427268085Z" level=info msg="StartContainer for \"45ef4a2df1290f41d11958d3870bde7a1cac7027ffa5f087c34959803ee5e849\"" Aug 13 00:19:02.480840 systemd[1]: Started cri-containerd-45ef4a2df1290f41d11958d3870bde7a1cac7027ffa5f087c34959803ee5e849.scope - libcontainer container 45ef4a2df1290f41d11958d3870bde7a1cac7027ffa5f087c34959803ee5e849. 
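The pull entries above are worth reading side by side: the first pull of ghcr.io/flatcar/calico/apiserver:v3.30.2 reads 44517149 bytes and takes 3.017441122s, while the second pull of the same image reads only 77 bytes (a manifest check, with every blob already in the content store) and completes in 449.809774ms. The kubelet's pod_startup_latency_tracker lines fold such pull windows into their figures: podStartE2EDuration spans podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration subtracts the firstStartedPulling-to-lastFinishedPulling window. A stdlib check of both, using only numbers quoted from the entries above:

```go
// Verifies the pull-speed ratio and the coredns startup figures from the log
// with nothing but stdlib time arithmetic.
package main

import (
	"fmt"
	"time"
)

// layout matches Go's default time.Time formatting used in the kubelet lines.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	cold, _ := time.ParseDuration("3.017441122s") // cold pull, 44517149 bytes read
	warm, _ := time.ParseDuration("449.809774ms") // warm re-pull, 77 bytes read
	fmt.Printf("warm pull is %.1fx faster\n", cold.Seconds()/warm.Seconds())

	// coredns-674b8bbfcf-dbkv2 never pulled an image (zero-value pull timestamps),
	// so its SLO and E2E durations coincide.
	created := mustParse("2025-08-13 00:18:16 +0000 UTC")
	running := mustParse("2025-08-13 00:19:02.338648659 +0000 UTC")
	fmt.Println(running.Sub(created)) // 46.338648659s, matching both figures
}
```

The apiserver pods reported further down follow the same arithmetic with a non-empty pull window: 30.361505008s end to end, minus the 3.019929192s between firstStartedPulling and lastFinishedPulling, gives exactly the logged podStartSLOduration of 27.341575816s.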
Aug 13 00:19:02.540507 containerd[1477]: time="2025-08-13T00:19:02.539897907Z" level=info msg="StartContainer for \"45ef4a2df1290f41d11958d3870bde7a1cac7027ffa5f087c34959803ee5e849\" returns successfully" Aug 13 00:19:02.703651 systemd-networkd[1377]: calidf4a76dc10d: Gained IPv6LL Aug 13 00:19:03.361831 kubelet[2623]: I0813 00:19:03.361527 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776d55f88-4lbfd" podStartSLOduration=27.341575816 podStartE2EDuration="30.361505008s" podCreationTimestamp="2025-08-13 00:18:33 +0000 UTC" firstStartedPulling="2025-08-13 00:18:58.900649333 +0000 UTC m=+49.067024140" lastFinishedPulling="2025-08-13 00:19:01.920578525 +0000 UTC m=+52.086953332" observedRunningTime="2025-08-13 00:19:02.419773967 +0000 UTC m=+52.586148774" watchObservedRunningTime="2025-08-13 00:19:03.361505008 +0000 UTC m=+53.527879815" Aug 13 00:19:03.407789 systemd-networkd[1377]: cali56926639e40: Gained IPv6LL Aug 13 00:19:04.345005 kubelet[2623]: I0813 00:19:04.344942 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:19:04.345435 kubelet[2623]: I0813 00:19:04.345397 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:19:04.615748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1323304321.mount: Deactivated successfully. Aug 13 00:19:04.651357 kubelet[2623]: I0813 00:19:04.651223 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776d55f88-sw2d2" podStartSLOduration=28.254053671 podStartE2EDuration="31.651206611s" podCreationTimestamp="2025-08-13 00:18:33 +0000 UTC" firstStartedPulling="2025-08-13 00:18:58.976629849 +0000 UTC m=+49.143004656" lastFinishedPulling="2025-08-13 00:19:02.373782829 +0000 UTC m=+52.540157596" observedRunningTime="2025-08-13 00:19:03.365714636 +0000 UTC m=+53.532089483" watchObservedRunningTime="2025-08-13 00:19:04.651206611 +0000 UTC m=+54.817581418" Aug 13 00:19:05.358476 containerd[1477]: time="2025-08-13T00:19:05.357435683Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:05.359557 containerd[1477]: time="2025-08-13T00:19:05.359509253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Aug 13 00:19:05.360212 containerd[1477]: time="2025-08-13T00:19:05.360178029Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:05.365950 containerd[1477]: time="2025-08-13T00:19:05.365910207Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:05.367053 containerd[1477]: time="2025-08-13T00:19:05.367006434Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 2.992821074s" Aug 13 00:19:05.367053 containerd[1477]: time="2025-08-13T00:19:05.367054075Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Aug 13 00:19:05.371090 containerd[1477]: time="2025-08-13T00:19:05.370871527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 00:19:05.373995 containerd[1477]: time="2025-08-13T00:19:05.373905160Z" level=info msg="CreateContainer within sandbox \"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 00:19:05.395097 containerd[1477]: time="2025-08-13T00:19:05.394937228Z" level=info msg="CreateContainer within sandbox \"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32\"" Aug 13 00:19:05.396351 containerd[1477]: time="2025-08-13T00:19:05.396222299Z" level=info msg="StartContainer for \"8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32\"" Aug 13 00:19:05.480015 systemd[1]: Started cri-containerd-8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32.scope - libcontainer container 8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32. Aug 13 00:19:05.524820 containerd[1477]: time="2025-08-13T00:19:05.524743401Z" level=info msg="StartContainer for \"8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32\" returns successfully" Aug 13 00:19:06.383537 kubelet[2623]: I0813 00:19:06.380298 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-tpx4h" podStartSLOduration=25.175892185 podStartE2EDuration="31.380282177s" podCreationTimestamp="2025-08-13 00:18:35 +0000 UTC" firstStartedPulling="2025-08-13 00:18:59.164344243 +0000 UTC m=+49.330719010" lastFinishedPulling="2025-08-13 00:19:05.368734195 +0000 UTC m=+55.535109002" observedRunningTime="2025-08-13 00:19:06.379191712 +0000 UTC m=+56.545566519" watchObservedRunningTime="2025-08-13 00:19:06.380282177 +0000 UTC m=+56.546656944" Aug 13 00:19:08.460072 containerd[1477]: time="2025-08-13T00:19:08.459721993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:08.462139 containerd[1477]: time="2025-08-13T00:19:08.461116984Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Aug 13 00:19:08.462408 containerd[1477]: time="2025-08-13T00:19:08.462370891Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:08.466472 containerd[1477]: time="2025-08-13T00:19:08.466383820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:08.468401 containerd[1477]: time="2025-08-13T00:19:08.467400282Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size 
\"49497545\" in 3.096478834s" Aug 13 00:19:08.468401 containerd[1477]: time="2025-08-13T00:19:08.467524005Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Aug 13 00:19:08.471331 containerd[1477]: time="2025-08-13T00:19:08.471256207Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 13 00:19:08.491296 containerd[1477]: time="2025-08-13T00:19:08.491089684Z" level=info msg="CreateContainer within sandbox \"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 00:19:08.514365 containerd[1477]: time="2025-08-13T00:19:08.514302875Z" level=info msg="CreateContainer within sandbox \"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d\"" Aug 13 00:19:08.515000 containerd[1477]: time="2025-08-13T00:19:08.514922128Z" level=info msg="StartContainer for \"9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d\"" Aug 13 00:19:08.567663 systemd[1]: Started cri-containerd-9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d.scope - libcontainer container 9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d. Aug 13 00:19:08.618951 containerd[1477]: time="2025-08-13T00:19:08.618857217Z" level=info msg="StartContainer for \"9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d\" returns successfully" Aug 13 00:19:09.403815 kubelet[2623]: I0813 00:19:09.403559 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-66459bfc84-krdnk" podStartSLOduration=26.742545467 podStartE2EDuration="34.403525067s" podCreationTimestamp="2025-08-13 00:18:35 +0000 UTC" firstStartedPulling="2025-08-13 00:19:00.810049202 +0000 UTC m=+50.976424009" lastFinishedPulling="2025-08-13 00:19:08.471028802 +0000 UTC m=+58.637403609" observedRunningTime="2025-08-13 00:19:09.394822841 +0000 UTC m=+59.561197608" watchObservedRunningTime="2025-08-13 00:19:09.403525067 +0000 UTC m=+59.569899874" Aug 13 00:19:09.932922 containerd[1477]: time="2025-08-13T00:19:09.930635524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:09.932922 containerd[1477]: time="2025-08-13T00:19:09.931413101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Aug 13 00:19:09.932922 containerd[1477]: time="2025-08-13T00:19:09.932713529Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:09.936811 containerd[1477]: time="2025-08-13T00:19:09.936743335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:19:09.937407 containerd[1477]: time="2025-08-13T00:19:09.937361348Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.46605398s" Aug 13 00:19:09.937407 containerd[1477]: time="2025-08-13T00:19:09.937403389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Aug 13 00:19:09.945288 containerd[1477]: time="2025-08-13T00:19:09.945221396Z" level=info msg="CreateContainer within sandbox \"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 13 00:19:09.975249 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1419212142.mount: Deactivated successfully. Aug 13 00:19:09.985933 containerd[1477]: time="2025-08-13T00:19:09.985839463Z" level=info msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" Aug 13 00:19:09.988234 containerd[1477]: time="2025-08-13T00:19:09.988151392Z" level=info msg="CreateContainer within sandbox \"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"26493fe1f517a33f9865dbd86b20e1a8a347de7b86f837cfa2d90f113004c31b\"" Aug 13 00:19:09.994742 containerd[1477]: time="2025-08-13T00:19:09.994555889Z" level=info msg="StartContainer for \"26493fe1f517a33f9865dbd86b20e1a8a347de7b86f837cfa2d90f113004c31b\"" Aug 13 00:19:10.063662 systemd[1]: Started cri-containerd-26493fe1f517a33f9865dbd86b20e1a8a347de7b86f837cfa2d90f113004c31b.scope - libcontainer container 26493fe1f517a33f9865dbd86b20e1a8a347de7b86f837cfa2d90f113004c31b. Aug 13 00:19:10.117182 containerd[1477]: time="2025-08-13T00:19:10.116926828Z" level=info msg="StartContainer for \"26493fe1f517a33f9865dbd86b20e1a8a347de7b86f837cfa2d90f113004c31b\" returns successfully" Aug 13 00:19:10.120566 containerd[1477]: time="2025-08-13T00:19:10.120485661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.067 [WARNING][5348] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0", GenerateName:"calico-kube-controllers-66459bfc84-", Namespace:"calico-system", SelfLink:"", UID:"44cb98ea-961e-4490-ac08-a70d6d62cf40", ResourceVersion:"1092", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66459bfc84", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a", Pod:"calico-kube-controllers-66459bfc84-krdnk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1f825409371", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.068 [INFO][5348] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.068 [INFO][5348] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" iface="eth0" netns="" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.068 [INFO][5348] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.068 [INFO][5348] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.129 [INFO][5380] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.132 [INFO][5380] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.132 [INFO][5380] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.145 [WARNING][5380] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.145 [INFO][5380] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.148 [INFO][5380] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.155547 containerd[1477]: 2025-08-13 00:19:10.151 [INFO][5348] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.155547 containerd[1477]: time="2025-08-13T00:19:10.155154740Z" level=info msg="TearDown network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" successfully" Aug 13 00:19:10.155547 containerd[1477]: time="2025-08-13T00:19:10.155186380Z" level=info msg="StopPodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" returns successfully" Aug 13 00:19:10.157636 containerd[1477]: time="2025-08-13T00:19:10.157148941Z" level=info msg="RemovePodSandbox for \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" Aug 13 00:19:10.157636 containerd[1477]: time="2025-08-13T00:19:10.157209582Z" level=info msg="Forcibly stopping sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\"" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.219 [WARNING][5402] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0", GenerateName:"calico-kube-controllers-66459bfc84-", Namespace:"calico-system", SelfLink:"", UID:"44cb98ea-961e-4490-ac08-a70d6d62cf40", ResourceVersion:"1092", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66459bfc84", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"8fb066f7edba20aea8520d4291a98de399aef69f218600d47def695de985084a", Pod:"calico-kube-controllers-66459bfc84-krdnk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1f825409371", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.220 [INFO][5402] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.220 [INFO][5402] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" iface="eth0" netns="" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.220 [INFO][5402] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.220 [INFO][5402] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.256 [INFO][5409] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.256 [INFO][5409] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.257 [INFO][5409] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.269 [WARNING][5409] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.269 [INFO][5409] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" HandleID="k8s-pod-network.ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--kube--controllers--66459bfc84--krdnk-eth0" Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.272 [INFO][5409] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.278706 containerd[1477]: 2025-08-13 00:19:10.274 [INFO][5402] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565" Aug 13 00:19:10.278706 containerd[1477]: time="2025-08-13T00:19:10.278889982Z" level=info msg="TearDown network for sandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" successfully" Aug 13 00:19:10.286815 containerd[1477]: time="2025-08-13T00:19:10.286738865Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:10.286972 containerd[1477]: time="2025-08-13T00:19:10.286830787Z" level=info msg="RemovePodSandbox \"ffb0f09dc0316926d491844e67f1881e9e98154b81a416c5f3afb5a063559565\" returns successfully" Aug 13 00:19:10.288153 containerd[1477]: time="2025-08-13T00:19:10.288085293Z" level=info msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.380 [WARNING][5423] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4bba616f-1733-479f-98eb-88700a24fca2", ResourceVersion:"1039", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6", Pod:"coredns-674b8bbfcf-dbkv2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56926639e40", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.380 [INFO][5423] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.380 [INFO][5423] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" iface="eth0" netns="" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.380 [INFO][5423] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.380 [INFO][5423] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.413 [INFO][5437] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.413 [INFO][5437] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.413 [INFO][5437] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.426 [WARNING][5437] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.426 [INFO][5437] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.430 [INFO][5437] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.437356 containerd[1477]: 2025-08-13 00:19:10.433 [INFO][5423] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.439054 containerd[1477]: time="2025-08-13T00:19:10.437455707Z" level=info msg="TearDown network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" successfully" Aug 13 00:19:10.439054 containerd[1477]: time="2025-08-13T00:19:10.437487348Z" level=info msg="StopPodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" returns successfully" Aug 13 00:19:10.439213 containerd[1477]: time="2025-08-13T00:19:10.439182183Z" level=info msg="RemovePodSandbox for \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" Aug 13 00:19:10.439341 containerd[1477]: time="2025-08-13T00:19:10.439223383Z" level=info msg="Forcibly stopping sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\"" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.502 [WARNING][5451] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4bba616f-1733-479f-98eb-88700a24fca2", ResourceVersion:"1039", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"06e53108c481c2d5bdddc335ef8964454c4f0c67d017ed8662505348fdc6dcd6", Pod:"coredns-674b8bbfcf-dbkv2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56926639e40", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.502 [INFO][5451] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.502 [INFO][5451] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" iface="eth0" netns="" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.502 [INFO][5451] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.502 [INFO][5451] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.534 [INFO][5458] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.534 [INFO][5458] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.534 [INFO][5458] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.545 [WARNING][5458] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.545 [INFO][5458] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" HandleID="k8s-pod-network.51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--dbkv2-eth0" Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.548 [INFO][5458] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.556409 containerd[1477]: 2025-08-13 00:19:10.552 [INFO][5451] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae" Aug 13 00:19:10.556409 containerd[1477]: time="2025-08-13T00:19:10.555815838Z" level=info msg="TearDown network for sandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" successfully" Aug 13 00:19:10.567125 containerd[1477]: time="2025-08-13T00:19:10.567051391Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:10.567318 containerd[1477]: time="2025-08-13T00:19:10.567167154Z" level=info msg="RemovePodSandbox \"51dcc02fab435517f72f66125728ebad0239f5d5cbc00b333c8960389c1965ae\" returns successfully" Aug 13 00:19:10.568180 containerd[1477]: time="2025-08-13T00:19:10.568058412Z" level=info msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.616 [WARNING][5472] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"b5166162-63e4-49bd-85b9-e1c018b456ba", ResourceVersion:"1057", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352", Pod:"calico-apiserver-776d55f88-4lbfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib78ad38f13a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.616 [INFO][5472] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.616 [INFO][5472] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" iface="eth0" netns="" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.617 [INFO][5472] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.617 [INFO][5472] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.643 [INFO][5479] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.643 [INFO][5479] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.643 [INFO][5479] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.655 [WARNING][5479] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.655 [INFO][5479] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.658 [INFO][5479] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.662818 containerd[1477]: 2025-08-13 00:19:10.660 [INFO][5472] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.663922 containerd[1477]: time="2025-08-13T00:19:10.663565030Z" level=info msg="TearDown network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" successfully" Aug 13 00:19:10.663922 containerd[1477]: time="2025-08-13T00:19:10.663608751Z" level=info msg="StopPodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" returns successfully" Aug 13 00:19:10.664547 containerd[1477]: time="2025-08-13T00:19:10.664360007Z" level=info msg="RemovePodSandbox for \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" Aug 13 00:19:10.665001 containerd[1477]: time="2025-08-13T00:19:10.664417248Z" level=info msg="Forcibly stopping sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\"" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.713 [WARNING][5493] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"b5166162-63e4-49bd-85b9-e1c018b456ba", ResourceVersion:"1057", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"d35bf9fbd0216e706155936246be1e49ba196898195d34b0981a10b5d7f75352", Pod:"calico-apiserver-776d55f88-4lbfd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib78ad38f13a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.713 [INFO][5493] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.713 [INFO][5493] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" iface="eth0" netns="" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.713 [INFO][5493] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.713 [INFO][5493] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.745 [INFO][5500] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.746 [INFO][5500] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.746 [INFO][5500] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.757 [WARNING][5500] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.757 [INFO][5500] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" HandleID="k8s-pod-network.0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--4lbfd-eth0" Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.760 [INFO][5500] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.765086 containerd[1477]: 2025-08-13 00:19:10.761 [INFO][5493] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d" Aug 13 00:19:10.765756 containerd[1477]: time="2025-08-13T00:19:10.765159695Z" level=info msg="TearDown network for sandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" successfully" Aug 13 00:19:10.771459 containerd[1477]: time="2025-08-13T00:19:10.770892373Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:10.771459 containerd[1477]: time="2025-08-13T00:19:10.771010736Z" level=info msg="RemovePodSandbox \"0533dd2b1fa3e8333fc155148bfe43f5697263ba8261473fdc9db67bc8538b1d\" returns successfully" Aug 13 00:19:10.773504 containerd[1477]: time="2025-08-13T00:19:10.772975217Z" level=info msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.822 [WARNING][5514] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"f7c4b773-00db-432a-98a5-27a94e2aa827", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13", Pod:"goldmane-768f4c5c69-tpx4h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8d5098e7b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.823 [INFO][5514] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.823 [INFO][5514] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" iface="eth0" netns="" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.823 [INFO][5514] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.823 [INFO][5514] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.851 [INFO][5521] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.851 [INFO][5521] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.852 [INFO][5521] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.862 [WARNING][5521] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.862 [INFO][5521] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.865 [INFO][5521] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.869038 containerd[1477]: 2025-08-13 00:19:10.866 [INFO][5514] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.869038 containerd[1477]: time="2025-08-13T00:19:10.869015246Z" level=info msg="TearDown network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" successfully" Aug 13 00:19:10.870669 containerd[1477]: time="2025-08-13T00:19:10.869043406Z" level=info msg="StopPodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" returns successfully" Aug 13 00:19:10.870669 containerd[1477]: time="2025-08-13T00:19:10.869563857Z" level=info msg="RemovePodSandbox for \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" Aug 13 00:19:10.870669 containerd[1477]: time="2025-08-13T00:19:10.869614938Z" level=info msg="Forcibly stopping sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\"" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.921 [WARNING][5535] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"f7c4b773-00db-432a-98a5-27a94e2aa827", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"80b95f09f5ab8e2a4720b76d39305bc7c6f0c50d760a91f62c3e235cc3fb5f13", Pod:"goldmane-768f4c5c69-tpx4h", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8d5098e7b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.922 [INFO][5535] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.923 [INFO][5535] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" iface="eth0" netns="" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.923 [INFO][5535] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.923 [INFO][5535] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.953 [INFO][5543] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.953 [INFO][5543] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.953 [INFO][5543] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.967 [WARNING][5543] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.967 [INFO][5543] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" HandleID="k8s-pod-network.b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-goldmane--768f4c5c69--tpx4h-eth0" Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.970 [INFO][5543] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:10.976867 containerd[1477]: 2025-08-13 00:19:10.973 [INFO][5535] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b" Aug 13 00:19:10.977650 containerd[1477]: time="2025-08-13T00:19:10.976915961Z" level=info msg="TearDown network for sandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" successfully" Aug 13 00:19:10.985181 containerd[1477]: time="2025-08-13T00:19:10.985125171Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:10.985332 containerd[1477]: time="2025-08-13T00:19:10.985224533Z" level=info msg="RemovePodSandbox \"b2d7f74eb1bd0c32413bd3363d1d9eb0f7b8886bab775d412aad279dde587a0b\" returns successfully" Aug 13 00:19:10.986092 containerd[1477]: time="2025-08-13T00:19:10.985927987Z" level=info msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.038 [WARNING][5557] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"549a8f55-4e42-4ab3-8e46-bed16219fe3c", ResourceVersion:"1019", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614", Pod:"csi-node-driver-hgndp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf4a76dc10d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.038 [INFO][5557] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.038 [INFO][5557] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" iface="eth0" netns="" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.038 [INFO][5557] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.038 [INFO][5557] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.073 [INFO][5564] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.073 [INFO][5564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.073 [INFO][5564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.086 [WARNING][5564] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.086 [INFO][5564] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.089 [INFO][5564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.094572 containerd[1477]: 2025-08-13 00:19:11.091 [INFO][5557] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.098107 containerd[1477]: time="2025-08-13T00:19:11.094617740Z" level=info msg="TearDown network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" successfully" Aug 13 00:19:11.098107 containerd[1477]: time="2025-08-13T00:19:11.094652101Z" level=info msg="StopPodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" returns successfully" Aug 13 00:19:11.098107 containerd[1477]: time="2025-08-13T00:19:11.095332154Z" level=info msg="RemovePodSandbox for \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" Aug 13 00:19:11.098107 containerd[1477]: time="2025-08-13T00:19:11.095381875Z" level=info msg="Forcibly stopping sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\"" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.143 [WARNING][5578] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"549a8f55-4e42-4ab3-8e46-bed16219fe3c", ResourceVersion:"1019", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614", Pod:"csi-node-driver-hgndp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf4a76dc10d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.143 [INFO][5578] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.143 [INFO][5578] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" iface="eth0" netns="" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.144 [INFO][5578] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.144 [INFO][5578] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.167 [INFO][5585] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.168 [INFO][5585] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.168 [INFO][5585] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.184 [WARNING][5585] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.184 [INFO][5585] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" HandleID="k8s-pod-network.ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-csi--node--driver--hgndp-eth0" Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.188 [INFO][5585] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.192287 containerd[1477]: 2025-08-13 00:19:11.190 [INFO][5578] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3" Aug 13 00:19:11.192287 containerd[1477]: time="2025-08-13T00:19:11.192229541Z" level=info msg="TearDown network for sandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" successfully" Aug 13 00:19:11.197616 containerd[1477]: time="2025-08-13T00:19:11.197534328Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:11.197800 containerd[1477]: time="2025-08-13T00:19:11.197765012Z" level=info msg="RemovePodSandbox \"ba07ad4f4bb0b0886c1fb111651047e0c359f171ae97361dd914eb83cd73f0e3\" returns successfully" Aug 13 00:19:11.198651 containerd[1477]: time="2025-08-13T00:19:11.198588349Z" level=info msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.251 [WARNING][5599] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"f6fff45d-a329-484b-9031-9dfa784fb836", ResourceVersion:"1052", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e", Pod:"calico-apiserver-776d55f88-sw2d2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4492d0302f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.252 [INFO][5599] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.252 [INFO][5599] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" iface="eth0" netns="" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.252 [INFO][5599] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.252 [INFO][5599] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.281 [INFO][5607] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.282 [INFO][5607] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.282 [INFO][5607] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.298 [WARNING][5607] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.298 [INFO][5607] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.301 [INFO][5607] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.305502 containerd[1477]: 2025-08-13 00:19:11.303 [INFO][5599] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.305502 containerd[1477]: time="2025-08-13T00:19:11.305498057Z" level=info msg="TearDown network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" successfully" Aug 13 00:19:11.306334 containerd[1477]: time="2025-08-13T00:19:11.305531418Z" level=info msg="StopPodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" returns successfully" Aug 13 00:19:11.306572 containerd[1477]: time="2025-08-13T00:19:11.306528318Z" level=info msg="RemovePodSandbox for \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" Aug 13 00:19:11.306671 containerd[1477]: time="2025-08-13T00:19:11.306589399Z" level=info msg="Forcibly stopping sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\"" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.362 [WARNING][5621] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0", GenerateName:"calico-apiserver-776d55f88-", Namespace:"calico-apiserver", SelfLink:"", UID:"f6fff45d-a329-484b-9031-9dfa784fb836", ResourceVersion:"1052", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776d55f88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"e4e3cba2a856cb7d2d54696dde7c111da882d34477edba2ac19f8a3fcbca581e", Pod:"calico-apiserver-776d55f88-sw2d2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4492d0302f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.362 [INFO][5621] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.362 [INFO][5621] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" iface="eth0" netns="" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.362 [INFO][5621] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.362 [INFO][5621] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.402 [INFO][5628] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.403 [INFO][5628] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.403 [INFO][5628] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.427 [WARNING][5628] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.428 [INFO][5628] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" HandleID="k8s-pod-network.2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-calico--apiserver--776d55f88--sw2d2-eth0" Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.430 [INFO][5628] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.435837 containerd[1477]: 2025-08-13 00:19:11.433 [INFO][5621] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6" Aug 13 00:19:11.437251 containerd[1477]: time="2025-08-13T00:19:11.435892477Z" level=info msg="TearDown network for sandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" successfully" Aug 13 00:19:11.442406 containerd[1477]: time="2025-08-13T00:19:11.442259084Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:11.442406 containerd[1477]: time="2025-08-13T00:19:11.442349566Z" level=info msg="RemovePodSandbox \"2f4167d30fa99d0341f177560b5ee006a1b829387b72044e60eebf2c9957cab6\" returns successfully" Aug 13 00:19:11.443554 containerd[1477]: time="2025-08-13T00:19:11.443080621Z" level=info msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.491 [WARNING][5642] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.492 [INFO][5642] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.492 [INFO][5642] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" iface="eth0" netns="" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.492 [INFO][5642] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.492 [INFO][5642] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.516 [INFO][5649] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.516 [INFO][5649] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.516 [INFO][5649] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.529 [WARNING][5649] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.529 [INFO][5649] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.534 [INFO][5649] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.541481 containerd[1477]: 2025-08-13 00:19:11.537 [INFO][5642] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.542197 containerd[1477]: time="2025-08-13T00:19:11.541424157Z" level=info msg="TearDown network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" successfully" Aug 13 00:19:11.542197 containerd[1477]: time="2025-08-13T00:19:11.541565640Z" level=info msg="StopPodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" returns successfully" Aug 13 00:19:11.543746 containerd[1477]: time="2025-08-13T00:19:11.543671042Z" level=info msg="RemovePodSandbox for \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" Aug 13 00:19:11.543746 containerd[1477]: time="2025-08-13T00:19:11.543754444Z" level=info msg="Forcibly stopping sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\"" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.620 [WARNING][5663] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" WorkloadEndpoint="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.620 [INFO][5663] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.620 [INFO][5663] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" iface="eth0" netns="" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.620 [INFO][5663] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.620 [INFO][5663] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.654 [INFO][5673] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.654 [INFO][5673] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.654 [INFO][5673] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.671 [WARNING][5673] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.672 [INFO][5673] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" HandleID="k8s-pod-network.01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-whisker--85f6f6cf67--7zt6q-eth0" Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.676 [INFO][5673] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:19:11.684976 containerd[1477]: 2025-08-13 00:19:11.679 [INFO][5663] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962" Aug 13 00:19:11.685835 containerd[1477]: time="2025-08-13T00:19:11.685573853Z" level=info msg="TearDown network for sandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" successfully" Aug 13 00:19:11.690407 containerd[1477]: time="2025-08-13T00:19:11.690126784Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Aug 13 00:19:11.690995 containerd[1477]: time="2025-08-13T00:19:11.690727716Z" level=info msg="RemovePodSandbox \"01d41d326dfa3cb98a20cc63c6472afb06b89af2099ac2eb179f5d24b41c0962\" returns successfully" Aug 13 00:19:11.692019 containerd[1477]: time="2025-08-13T00:19:11.691923020Z" level=info msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\"" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.761 [WARNING][5691] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"28b87d22-e12e-4c68-948f-91d75e756004", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e", Pod:"coredns-674b8bbfcf-hrjxb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliabb883014ef", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.762 [INFO][5691] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.762 [INFO][5691] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" iface="eth0" netns="" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.762 [INFO][5691] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.762 [INFO][5691] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.804 [INFO][5699] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0" Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.804 [INFO][5699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.804 [INFO][5699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.817 [WARNING][5699] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0"
Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.817 [INFO][5699] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0"
Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.820 [INFO][5699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 13 00:19:11.826745 containerd[1477]: 2025-08-13 00:19:11.821 [INFO][5691] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"
Aug 13 00:19:11.828087 containerd[1477]: time="2025-08-13T00:19:11.827610306Z" level=info msg="TearDown network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" successfully"
Aug 13 00:19:11.828087 containerd[1477]: time="2025-08-13T00:19:11.827645787Z" level=info msg="StopPodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" returns successfully"
Aug 13 00:19:11.828666 containerd[1477]: time="2025-08-13T00:19:11.828629847Z" level=info msg="RemovePodSandbox for \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\""
Aug 13 00:19:11.828851 containerd[1477]: time="2025-08-13T00:19:11.828676008Z" level=info msg="Forcibly stopping sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\""
Aug 13 00:19:11.972548 containerd[1477]: time="2025-08-13T00:19:11.972477697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:19:11.974470 containerd[1477]: time="2025-08-13T00:19:11.974248932Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366"
Aug 13 00:19:11.976152 containerd[1477]: time="2025-08-13T00:19:11.976093849Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:19:11.996941 containerd[1477]: time="2025-08-13T00:19:11.996568701Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:19:11.997604 containerd[1477]: time="2025-08-13T00:19:11.997425758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.876866895s"
Aug 13 00:19:11.997604 containerd[1477]: time="2025-08-13T00:19:11.997500440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\""
Aug 13 00:19:12.003857 containerd[1477]: time="2025-08-13T00:19:12.003598560Z" level=info msg="CreateContainer within sandbox \"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.927 [WARNING][5714] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"28b87d22-e12e-4c68-948f-91d75e756004", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 18, 16, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-8-f2ca23fedd", ContainerID:"6d39a3641b9a2742cc904031980cf176d4b3964208fc8d336a0902a515f1128e", Pod:"coredns-674b8bbfcf-hrjxb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliabb883014ef", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.928 [INFO][5714] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.928 [INFO][5714] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" iface="eth0" netns=""
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.928 [INFO][5714] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.928 [INFO][5714] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.986 [INFO][5726] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.986 [INFO][5726] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:11.986 [INFO][5726] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:12.002 [WARNING][5726] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:12.002 [INFO][5726] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" HandleID="k8s-pod-network.e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2" Workload="ci--4081--3--5--8--f2ca23fedd-k8s-coredns--674b8bbfcf--hrjxb-eth0"
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:12.008 [INFO][5726] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 13 00:19:12.013399 containerd[1477]: 2025-08-13 00:19:12.009 [INFO][5714] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2"
Aug 13 00:19:12.013399 containerd[1477]: time="2025-08-13T00:19:12.012557935Z" level=info msg="TearDown network for sandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" successfully"
Aug 13 00:19:12.026410 containerd[1477]: time="2025-08-13T00:19:12.026241801Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Aug 13 00:19:12.026410 containerd[1477]: time="2025-08-13T00:19:12.026386164Z" level=info msg="RemovePodSandbox \"e190bb86f25192a95b79f1176689825fc679b0498d572a49c7fff78b2290f3c2\" returns successfully"
Aug 13 00:19:12.036626 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3197089447.mount: Deactivated successfully.
Aug 13 00:19:12.060083 containerd[1477]: time="2025-08-13T00:19:12.059569851Z" level=info msg="CreateContainer within sandbox \"a66e3710845ace0091831404571c0755449821c8c7dc2e1e26d19c6a29911614\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8777660186568cf9eb8232b023db4238088bfb2f7df461be02d25df7aed9db0a\""
Aug 13 00:19:12.067477 containerd[1477]: time="2025-08-13T00:19:12.066593468Z" level=info msg="StartContainer for \"8777660186568cf9eb8232b023db4238088bfb2f7df461be02d25df7aed9db0a\""
Aug 13 00:19:12.114724 systemd[1]: Started cri-containerd-8777660186568cf9eb8232b023db4238088bfb2f7df461be02d25df7aed9db0a.scope - libcontainer container 8777660186568cf9eb8232b023db4238088bfb2f7df461be02d25df7aed9db0a.
Aug 13 00:19:12.154408 containerd[1477]: time="2025-08-13T00:19:12.154341338Z" level=info msg="StartContainer for \"8777660186568cf9eb8232b023db4238088bfb2f7df461be02d25df7aed9db0a\" returns successfully"
Aug 13 00:19:13.111369 kubelet[2623]: I0813 00:19:13.111197 2623 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Aug 13 00:19:13.117221 kubelet[2623]: I0813 00:19:13.117183 2623 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Aug 13 00:19:21.353996 kubelet[2623]: I0813 00:19:21.352458 2623 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hgndp" podStartSLOduration=35.235688581 podStartE2EDuration="46.352421378s" podCreationTimestamp="2025-08-13 00:18:35 +0000 UTC" firstStartedPulling="2025-08-13 00:19:00.881935906 +0000 UTC m=+51.048310713" lastFinishedPulling="2025-08-13 00:19:11.998668703 +0000 UTC m=+62.165043510" observedRunningTime="2025-08-13 00:19:12.446411269 +0000 UTC m=+62.612786076" watchObservedRunningTime="2025-08-13 00:19:21.352421378 +0000 UTC m=+71.518796185"
Aug 13 00:19:30.530376 kubelet[2623]: I0813 00:19:30.529918 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 13 00:19:38.392090 systemd[1]: run-containerd-runc-k8s.io-8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32-runc.b0vOm6.mount: Deactivated successfully.
Aug 13 00:19:58.103431 systemd[1]: run-containerd-runc-k8s.io-9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d-runc.gfqdbf.mount: Deactivated successfully.
Aug 13 00:20:09.407307 systemd[1]: run-containerd-runc-k8s.io-9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d-runc.QD1XV2.mount: Deactivated successfully.
Aug 13 00:20:38.386507 systemd[1]: run-containerd-runc-k8s.io-8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32-runc.9Fzeqx.mount: Deactivated successfully.
Aug 13 00:20:39.397716 systemd[1]: run-containerd-runc-k8s.io-9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d-runc.VfuorR.mount: Deactivated successfully.
Aug 13 00:20:51.635964 systemd[1]: Started sshd@7-91.99.89.242:22-139.178.89.65:42936.service - OpenSSH per-connection server daemon (139.178.89.65:42936).
Aug 13 00:20:52.702272 sshd[6095]: Accepted publickey for core from 139.178.89.65 port 42936 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:20:52.705007 sshd[6095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:20:52.711356 systemd-logind[1461]: New session 8 of user core.
Aug 13 00:20:52.719766 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 13 00:20:53.560966 sshd[6095]: pam_unix(sshd:session): session closed for user core
Aug 13 00:20:53.570785 systemd-logind[1461]: Session 8 logged out. Waiting for processes to exit.
Aug 13 00:20:53.571516 systemd[1]: sshd@7-91.99.89.242:22-139.178.89.65:42936.service: Deactivated successfully.
Aug 13 00:20:53.574427 systemd[1]: session-8.scope: Deactivated successfully.
Aug 13 00:20:53.577732 systemd-logind[1461]: Removed session 8.
Aug 13 00:20:58.756848 systemd[1]: Started sshd@8-91.99.89.242:22-139.178.89.65:42946.service - OpenSSH per-connection server daemon (139.178.89.65:42946).
Aug 13 00:20:59.811551 sshd[6130]: Accepted publickey for core from 139.178.89.65 port 42946 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:20:59.814617 sshd[6130]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:20:59.823550 systemd-logind[1461]: New session 9 of user core.
Aug 13 00:20:59.832778 systemd[1]: Started session-9.scope - Session 9 of User core.
Aug 13 00:21:00.637784 sshd[6130]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:00.644182 systemd[1]: sshd@8-91.99.89.242:22-139.178.89.65:42946.service: Deactivated successfully.
Aug 13 00:21:00.647415 systemd[1]: session-9.scope: Deactivated successfully.
Aug 13 00:21:00.649693 systemd-logind[1461]: Session 9 logged out. Waiting for processes to exit.
Aug 13 00:21:00.651829 systemd-logind[1461]: Removed session 9.
Aug 13 00:21:05.811859 systemd[1]: Started sshd@9-91.99.89.242:22-139.178.89.65:52598.service - OpenSSH per-connection server daemon (139.178.89.65:52598).
Aug 13 00:21:06.805566 sshd[6145]: Accepted publickey for core from 139.178.89.65 port 52598 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:06.808286 sshd[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:06.817724 systemd-logind[1461]: New session 10 of user core.
Aug 13 00:21:06.822856 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 13 00:21:07.586210 sshd[6145]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:07.591678 systemd[1]: sshd@9-91.99.89.242:22-139.178.89.65:52598.service: Deactivated successfully.
Aug 13 00:21:07.594841 systemd[1]: session-10.scope: Deactivated successfully.
Aug 13 00:21:07.596230 systemd-logind[1461]: Session 10 logged out. Waiting for processes to exit.
Aug 13 00:21:07.599625 systemd-logind[1461]: Removed session 10.
Aug 13 00:21:07.784925 systemd[1]: Started sshd@10-91.99.89.242:22-139.178.89.65:52600.service - OpenSSH per-connection server daemon (139.178.89.65:52600).
Aug 13 00:21:08.835028 sshd[6159]: Accepted publickey for core from 139.178.89.65 port 52600 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:08.837537 sshd[6159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:08.844552 systemd-logind[1461]: New session 11 of user core.
Aug 13 00:21:08.849698 systemd[1]: Started session-11.scope - Session 11 of User core.
Aug 13 00:21:09.720805 sshd[6159]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:09.727302 systemd-logind[1461]: Session 11 logged out. Waiting for processes to exit.
Aug 13 00:21:09.727524 systemd[1]: sshd@10-91.99.89.242:22-139.178.89.65:52600.service: Deactivated successfully.
Aug 13 00:21:09.731187 systemd[1]: session-11.scope: Deactivated successfully.
Aug 13 00:21:09.735516 systemd-logind[1461]: Removed session 11.
Aug 13 00:21:09.895006 systemd[1]: Started sshd@11-91.99.89.242:22-139.178.89.65:43838.service - OpenSSH per-connection server daemon (139.178.89.65:43838).
Aug 13 00:21:10.888853 sshd[6216]: Accepted publickey for core from 139.178.89.65 port 43838 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:10.890142 sshd[6216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:10.899370 systemd-logind[1461]: New session 12 of user core.
Aug 13 00:21:10.903864 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 00:21:11.659667 sshd[6216]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:11.664764 systemd[1]: sshd@11-91.99.89.242:22-139.178.89.65:43838.service: Deactivated successfully.
Aug 13 00:21:11.667375 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 00:21:11.668986 systemd-logind[1461]: Session 12 logged out. Waiting for processes to exit.
Aug 13 00:21:11.670276 systemd-logind[1461]: Removed session 12.
Aug 13 00:21:16.837859 systemd[1]: Started sshd@12-91.99.89.242:22-139.178.89.65:43844.service - OpenSSH per-connection server daemon (139.178.89.65:43844).
Aug 13 00:21:17.842614 sshd[6233]: Accepted publickey for core from 139.178.89.65 port 43844 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:17.844029 sshd[6233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:17.851335 systemd-logind[1461]: New session 13 of user core.
Aug 13 00:21:17.861806 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 00:21:18.623574 sshd[6233]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:18.629179 systemd[1]: sshd@12-91.99.89.242:22-139.178.89.65:43844.service: Deactivated successfully.
Aug 13 00:21:18.634061 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 00:21:18.637146 systemd-logind[1461]: Session 13 logged out. Waiting for processes to exit.
Aug 13 00:21:18.638223 systemd-logind[1461]: Removed session 13.
Aug 13 00:21:18.820895 systemd[1]: Started sshd@13-91.99.89.242:22-139.178.89.65:43848.service - OpenSSH per-connection server daemon (139.178.89.65:43848).
Aug 13 00:21:19.868586 sshd[6246]: Accepted publickey for core from 139.178.89.65 port 43848 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:19.871419 sshd[6246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:19.878819 systemd-logind[1461]: New session 14 of user core.
Aug 13 00:21:19.886975 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 00:21:20.836517 sshd[6246]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:20.841802 systemd[1]: sshd@13-91.99.89.242:22-139.178.89.65:43848.service: Deactivated successfully.
Aug 13 00:21:20.844694 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 00:21:20.850955 systemd-logind[1461]: Session 14 logged out. Waiting for processes to exit.
Aug 13 00:21:20.854167 systemd-logind[1461]: Removed session 14.
Aug 13 00:21:21.011302 systemd[1]: Started sshd@14-91.99.89.242:22-139.178.89.65:59096.service - OpenSSH per-connection server daemon (139.178.89.65:59096).
Aug 13 00:21:22.012857 sshd[6257]: Accepted publickey for core from 139.178.89.65 port 59096 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:22.016431 sshd[6257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:22.025468 systemd-logind[1461]: New session 15 of user core.
Aug 13 00:21:22.029633 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 00:21:23.214491 systemd[1]: run-containerd-runc-k8s.io-8043420f863a433810e16b9a959377be67d044bd4b746a4e9adc6cf13e3b0c32-runc.s8GDcU.mount: Deactivated successfully.
Aug 13 00:21:23.495468 sshd[6257]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:23.500652 systemd[1]: sshd@14-91.99.89.242:22-139.178.89.65:59096.service: Deactivated successfully.
Aug 13 00:21:23.505248 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 00:21:23.508960 systemd-logind[1461]: Session 15 logged out. Waiting for processes to exit.
Aug 13 00:21:23.510137 systemd-logind[1461]: Removed session 15.
Aug 13 00:21:23.674535 systemd[1]: Started sshd@15-91.99.89.242:22-139.178.89.65:59102.service - OpenSSH per-connection server daemon (139.178.89.65:59102).
Aug 13 00:21:24.665529 sshd[6317]: Accepted publickey for core from 139.178.89.65 port 59102 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:24.668199 sshd[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:24.677969 systemd-logind[1461]: New session 16 of user core.
Aug 13 00:21:24.682688 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 00:21:25.612710 sshd[6317]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:25.618801 systemd-logind[1461]: Session 16 logged out. Waiting for processes to exit.
Aug 13 00:21:25.621617 systemd[1]: sshd@15-91.99.89.242:22-139.178.89.65:59102.service: Deactivated successfully.
Aug 13 00:21:25.629544 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 00:21:25.633807 systemd-logind[1461]: Removed session 16.
Aug 13 00:21:25.788508 systemd[1]: Started sshd@16-91.99.89.242:22-139.178.89.65:59116.service - OpenSSH per-connection server daemon (139.178.89.65:59116).
Aug 13 00:21:26.775522 sshd[6328]: Accepted publickey for core from 139.178.89.65 port 59116 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:26.777878 sshd[6328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:26.788178 systemd-logind[1461]: New session 17 of user core.
Aug 13 00:21:26.794225 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 00:21:27.553815 sshd[6328]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:27.561279 systemd[1]: sshd@16-91.99.89.242:22-139.178.89.65:59116.service: Deactivated successfully.
Aug 13 00:21:27.564086 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 00:21:27.568333 systemd-logind[1461]: Session 17 logged out. Waiting for processes to exit.
Aug 13 00:21:27.571143 systemd-logind[1461]: Removed session 17.
Aug 13 00:21:32.733869 systemd[1]: Started sshd@17-91.99.89.242:22-139.178.89.65:42042.service - OpenSSH per-connection server daemon (139.178.89.65:42042).
Aug 13 00:21:33.724300 sshd[6349]: Accepted publickey for core from 139.178.89.65 port 42042 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:33.727824 sshd[6349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:33.736265 systemd-logind[1461]: New session 18 of user core.
Aug 13 00:21:33.743651 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 00:21:34.488766 sshd[6349]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:34.494912 systemd[1]: sshd@17-91.99.89.242:22-139.178.89.65:42042.service: Deactivated successfully.
Aug 13 00:21:34.498051 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 00:21:34.499246 systemd-logind[1461]: Session 18 logged out. Waiting for processes to exit.
Aug 13 00:21:34.500947 systemd-logind[1461]: Removed session 18.
Aug 13 00:21:39.396720 systemd[1]: run-containerd-runc-k8s.io-9f52b38df3f69ac3a0b7c9d782e30da93c0d3d51c65f5dba76d642f4b489a05d-runc.D6zpIB.mount: Deactivated successfully.
Aug 13 00:21:39.670932 systemd[1]: Started sshd@18-91.99.89.242:22-139.178.89.65:48286.service - OpenSSH per-connection server daemon (139.178.89.65:48286).
Aug 13 00:21:40.666398 sshd[6401]: Accepted publickey for core from 139.178.89.65 port 48286 ssh2: RSA SHA256:TbpwDUqnmmr/6oeFI65A/iU5DlmHGueKflwEEvdqHG0
Aug 13 00:21:40.669981 sshd[6401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:21:40.677652 systemd-logind[1461]: New session 19 of user core.
Aug 13 00:21:40.683762 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 00:21:41.442817 sshd[6401]: pam_unix(sshd:session): session closed for user core
Aug 13 00:21:41.447357 systemd[1]: sshd@18-91.99.89.242:22-139.178.89.65:48286.service: Deactivated successfully.
Aug 13 00:21:41.449485 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 00:21:41.453151 systemd-logind[1461]: Session 19 logged out. Waiting for processes to exit.
Aug 13 00:21:41.454391 systemd-logind[1461]: Removed session 19.