Sep 3 23:25:04.041706 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 3 23:25:04.041723 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Wed Sep 3 22:04:24 -00 2025
Sep 3 23:25:04.041730 kernel: KASLR enabled
Sep 3 23:25:04.041734 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 3 23:25:04.041738 kernel: printk: legacy bootconsole [pl11] enabled
Sep 3 23:25:04.041742 kernel: efi: EFI v2.7 by EDK II
Sep 3 23:25:04.041747 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20f698 RNG=0x3fd5f998 MEMRESERVE=0x3e477598
Sep 3 23:25:04.041751 kernel: random: crng init done
Sep 3 23:25:04.041755 kernel: secureboot: Secure boot disabled
Sep 3 23:25:04.041759 kernel: ACPI: Early table checksum verification disabled
Sep 3 23:25:04.041762 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 3 23:25:04.041766 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041770 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041775 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 3 23:25:04.041780 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041784 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041789 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041794 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041798 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041802 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041806 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 3 23:25:04.041811 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:04.041815 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 3 23:25:04.041819 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 3 23:25:04.041884 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 3 23:25:04.041889 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 3 23:25:04.041893 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 3 23:25:04.041897 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 3 23:25:04.041901 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 3 23:25:04.041907 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 3 23:25:04.041911 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 3 23:25:04.041915 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 3 23:25:04.041919 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 3 23:25:04.041923 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 3 23:25:04.041927 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 3 23:25:04.041932 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 3 23:25:04.041936 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 3 23:25:04.041940 kernel: NODE_DATA(0) allocated [mem 0x1bf7fda00-0x1bf804fff]
Sep 3 23:25:04.041944 kernel: Zone ranges:
Sep 3 23:25:04.041948 kernel:   DMA      [mem 0x0000000000000000-0x00000000ffffffff]
Sep 3 23:25:04.041955 kernel:   DMA32    empty
Sep 3 23:25:04.041959 kernel:   Normal   [mem 0x0000000100000000-0x00000001bfffffff]
Sep 3 23:25:04.041963 kernel:   Device   empty
Sep 3 23:25:04.041968 kernel: Movable zone start for each node
Sep 3 23:25:04.041972 kernel: Early memory node ranges
Sep 3 23:25:04.041977 kernel:   node   0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 3 23:25:04.041981 kernel:   node   0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 3 23:25:04.041986 kernel:   node   0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 3 23:25:04.041990 kernel:   node   0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 3 23:25:04.041994 kernel:   node   0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 3 23:25:04.041998 kernel:   node   0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 3 23:25:04.042003 kernel:   node   0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 3 23:25:04.042007 kernel:   node   0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 3 23:25:04.042011 kernel:   node   0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 3 23:25:04.042015 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 3 23:25:04.042020 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 3 23:25:04.042024 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 3 23:25:04.042029 kernel: psci: probing for conduit method from ACPI.
Sep 3 23:25:04.042033 kernel: psci: PSCIv1.1 detected in firmware.
Sep 3 23:25:04.042038 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 3 23:25:04.042042 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 3 23:25:04.042046 kernel: psci: SMC Calling Convention v1.4
Sep 3 23:25:04.042050 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 3 23:25:04.042055 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 3 23:25:04.042059 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 3 23:25:04.042063 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 3 23:25:04.042068 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 3 23:25:04.042072 kernel: Detected PIPT I-cache on CPU0
Sep 3 23:25:04.042077 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 3 23:25:04.042082 kernel: CPU features: detected: GIC system register CPU interface
Sep 3 23:25:04.042086 kernel: CPU features: detected: Spectre-v4
Sep 3 23:25:04.042090 kernel: CPU features: detected: Spectre-BHB
Sep 3 23:25:04.042095 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 3 23:25:04.042099 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 3 23:25:04.042103 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 3 23:25:04.042108 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 3 23:25:04.042112 kernel: alternatives: applying boot alternatives
Sep 3 23:25:04.042117 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=cb633bb0c889435b58a5c40c9c9bc9d5899ece5018569c9fa08f911265d3f18e
Sep 3 23:25:04.042122 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 3 23:25:04.042127 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 3 23:25:04.042131 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 3 23:25:04.042136 kernel: Fallback order for Node 0: 0
Sep 3 23:25:04.042140 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 3 23:25:04.042144 kernel: Policy zone: Normal
Sep 3 23:25:04.042148 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 3 23:25:04.042153 kernel: software IO TLB: area num 2.
Sep 3 23:25:04.042157 kernel: software IO TLB: mapped [mem 0x0000000036280000-0x000000003a280000] (64MB)
Sep 3 23:25:04.042161 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 3 23:25:04.042166 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 3 23:25:04.042171 kernel: rcu: RCU event tracing is enabled.
Sep 3 23:25:04.042176 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 3 23:25:04.042180 kernel: Trampoline variant of Tasks RCU enabled.
Sep 3 23:25:04.042185 kernel: Tracing variant of Tasks RCU enabled.
Sep 3 23:25:04.042189 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 3 23:25:04.042193 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 3 23:25:04.042198 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 3 23:25:04.042202 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 3 23:25:04.042207 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 3 23:25:04.042211 kernel: GICv3: 960 SPIs implemented
Sep 3 23:25:04.042215 kernel: GICv3: 0 Extended SPIs implemented
Sep 3 23:25:04.042219 kernel: Root IRQ handler: gic_handle_irq
Sep 3 23:25:04.042224 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 3 23:25:04.042229 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 3 23:25:04.042233 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 3 23:25:04.042237 kernel: ITS: No ITS available, not enabling LPIs
Sep 3 23:25:04.042242 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 3 23:25:04.042246 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 3 23:25:04.042251 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 3 23:25:04.042255 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 3 23:25:04.042259 kernel: Console: colour dummy device 80x25
Sep 3 23:25:04.042264 kernel: printk: legacy console [tty1] enabled
Sep 3 23:25:04.042269 kernel: ACPI: Core revision 20240827
Sep 3 23:25:04.042273 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 3 23:25:04.042279 kernel: pid_max: default: 32768 minimum: 301
Sep 3 23:25:04.042283 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 3 23:25:04.042288 kernel: landlock: Up and running.
Sep 3 23:25:04.042292 kernel: SELinux: Initializing.
Sep 3 23:25:04.042297 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 3 23:25:04.042304 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 3 23:25:04.042310 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 3 23:25:04.042315 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 3 23:25:04.042319 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 3 23:25:04.042324 kernel: rcu: Hierarchical SRCU implementation.
Sep 3 23:25:04.042329 kernel: rcu: Max phase no-delay instances is 400.
Sep 3 23:25:04.042334 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 3 23:25:04.042341 kernel: Remapping and enabling EFI services.
Sep 3 23:25:04.042345 kernel: smp: Bringing up secondary CPUs ...
Sep 3 23:25:04.042350 kernel: Detected PIPT I-cache on CPU1
Sep 3 23:25:04.042355 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 3 23:25:04.042360 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 3 23:25:04.042365 kernel: smp: Brought up 1 node, 2 CPUs
Sep 3 23:25:04.042369 kernel: SMP: Total of 2 processors activated.
Sep 3 23:25:04.042374 kernel: CPU: All CPU(s) started at EL1
Sep 3 23:25:04.042379 kernel: CPU features: detected: 32-bit EL0 Support
Sep 3 23:25:04.042383 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 3 23:25:04.042388 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 3 23:25:04.042393 kernel: CPU features: detected: Common not Private translations
Sep 3 23:25:04.042397 kernel: CPU features: detected: CRC32 instructions
Sep 3 23:25:04.042403 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 3 23:25:04.042408 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 3 23:25:04.042413 kernel: CPU features: detected: LSE atomic instructions
Sep 3 23:25:04.042417 kernel: CPU features: detected: Privileged Access Never
Sep 3 23:25:04.042422 kernel: CPU features: detected: Speculation barrier (SB)
Sep 3 23:25:04.042426 kernel: CPU features: detected: TLB range maintenance instructions
Sep 3 23:25:04.042431 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 3 23:25:04.042436 kernel: CPU features: detected: Scalable Vector Extension
Sep 3 23:25:04.042441 kernel: alternatives: applying system-wide alternatives
Sep 3 23:25:04.042446 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 3 23:25:04.042451 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 3 23:25:04.042455 kernel: SVE: default vector length 16 bytes per vector
Sep 3 23:25:04.042460 kernel: Memory: 3959604K/4194160K available (11136K kernel code, 2436K rwdata, 9076K rodata, 38976K init, 1038K bss, 213368K reserved, 16384K cma-reserved)
Sep 3 23:25:04.042465 kernel: devtmpfs: initialized
Sep 3 23:25:04.042470 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 3 23:25:04.042474 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 3 23:25:04.042479 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 3 23:25:04.042484 kernel: 0 pages in range for non-PLT usage
Sep 3 23:25:04.042489 kernel: 508560 pages in range for PLT usage
Sep 3 23:25:04.042494 kernel: pinctrl core: initialized pinctrl subsystem
Sep 3 23:25:04.042499 kernel: SMBIOS 3.1.0 present.
Sep 3 23:25:04.042503 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 3 23:25:04.042508 kernel: DMI: Memory slots populated: 2/2
Sep 3 23:25:04.042513 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 3 23:25:04.042517 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 3 23:25:04.042522 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 3 23:25:04.042527 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 3 23:25:04.042532 kernel: audit: initializing netlink subsys (disabled)
Sep 3 23:25:04.042537 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Sep 3 23:25:04.042542 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 3 23:25:04.042546 kernel: cpuidle: using governor menu
Sep 3 23:25:04.042551 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 3 23:25:04.042556 kernel: ASID allocator initialised with 32768 entries
Sep 3 23:25:04.042560 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 3 23:25:04.042565 kernel: Serial: AMBA PL011 UART driver
Sep 3 23:25:04.042570 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 3 23:25:04.042575 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 3 23:25:04.042580 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 3 23:25:04.042584 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 3 23:25:04.042589 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 3 23:25:04.042594 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 3 23:25:04.042598 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 3 23:25:04.042603 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 3 23:25:04.042611 kernel: ACPI: Added _OSI(Module Device)
Sep 3 23:25:04.042616 kernel: ACPI: Added _OSI(Processor Device)
Sep 3 23:25:04.042621 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 3 23:25:04.042626 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 3 23:25:04.042631 kernel: ACPI: Interpreter enabled
Sep 3 23:25:04.042635 kernel: ACPI: Using GIC for interrupt routing
Sep 3 23:25:04.042640 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 3 23:25:04.042644 kernel: printk: legacy console [ttyAMA0] enabled
Sep 3 23:25:04.042649 kernel: printk: legacy bootconsole [pl11] disabled
Sep 3 23:25:04.042654 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 3 23:25:04.042659 kernel: ACPI: CPU0 has been hot-added
Sep 3 23:25:04.042664 kernel: ACPI: CPU1 has been hot-added
Sep 3 23:25:04.042669 kernel: iommu: Default domain type: Translated
Sep 3 23:25:04.042673 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 3 23:25:04.042678 kernel: efivars: Registered efivars operations
Sep 3 23:25:04.042683 kernel: vgaarb: loaded
Sep 3 23:25:04.042687 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 3 23:25:04.042692 kernel: VFS: Disk quotas dquot_6.6.0
Sep 3 23:25:04.042697 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 3 23:25:04.042702 kernel: pnp: PnP ACPI init
Sep 3 23:25:04.042707 kernel: pnp: PnP ACPI: found 0 devices
Sep 3 23:25:04.042712 kernel: NET: Registered PF_INET protocol family
Sep 3 23:25:04.042716 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 3 23:25:04.042721 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 3 23:25:04.042726 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 3 23:25:04.042731 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 3 23:25:04.042736 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 3 23:25:04.042740 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 3 23:25:04.042745 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 3 23:25:04.042751 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 3 23:25:04.042755 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 3 23:25:04.042760 kernel: PCI: CLS 0 bytes, default 64
Sep 3 23:25:04.042765 kernel: kvm [1]: HYP mode not available
Sep 3 23:25:04.042769 kernel: Initialise system trusted keyrings
Sep 3 23:25:04.042774 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 3 23:25:04.042779 kernel: Key type asymmetric registered
Sep 3 23:25:04.042783 kernel: Asymmetric key parser 'x509' registered
Sep 3 23:25:04.042788 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 3 23:25:04.042793 kernel: io scheduler mq-deadline registered
Sep 3 23:25:04.042798 kernel: io scheduler kyber registered
Sep 3 23:25:04.042803 kernel: io scheduler bfq registered
Sep 3 23:25:04.042807 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 3 23:25:04.042812 kernel: thunder_xcv, ver 1.0
Sep 3 23:25:04.042816 kernel: thunder_bgx, ver 1.0
Sep 3 23:25:04.042821 kernel: nicpf, ver 1.0
Sep 3 23:25:04.044861 kernel: nicvf, ver 1.0
Sep 3 23:25:04.044974 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 3 23:25:04.045029 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-03T23:25:03 UTC (1756941903)
Sep 3 23:25:04.045036 kernel: efifb: probing for efifb
Sep 3 23:25:04.045042 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 3 23:25:04.045047 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 3 23:25:04.045051 kernel: efifb: scrolling: redraw
Sep 3 23:25:04.045056 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 3 23:25:04.045061 kernel: Console: switching to colour frame buffer device 128x48
Sep 3 23:25:04.045066 kernel: fb0: EFI VGA frame buffer device
Sep 3 23:25:04.045072 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 3 23:25:04.045077 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 3 23:25:04.045082 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 3 23:25:04.045086 kernel: watchdog: NMI not fully supported
Sep 3 23:25:04.045091 kernel: watchdog: Hard watchdog permanently disabled
Sep 3 23:25:04.045096 kernel: NET: Registered PF_INET6 protocol family
Sep 3 23:25:04.045101 kernel: Segment Routing with IPv6
Sep 3 23:25:04.045105 kernel: In-situ OAM (IOAM) with IPv6
Sep 3 23:25:04.045110 kernel: NET: Registered PF_PACKET protocol family
Sep 3 23:25:04.045116 kernel: Key type dns_resolver registered
Sep 3 23:25:04.045120 kernel: registered taskstats version 1
Sep 3 23:25:04.045125 kernel: Loading compiled-in X.509 certificates
Sep 3 23:25:04.045130 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 08fc774dab168e64ce30c382a4517d40e72c4744'
Sep 3 23:25:04.045135 kernel: Demotion targets for Node 0: null
Sep 3 23:25:04.045139 kernel: Key type .fscrypt registered
Sep 3 23:25:04.045144 kernel: Key type fscrypt-provisioning registered
Sep 3 23:25:04.045149 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 3 23:25:04.045154 kernel: ima: Allocated hash algorithm: sha1
Sep 3 23:25:04.045159 kernel: ima: No architecture policies found
Sep 3 23:25:04.045164 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 3 23:25:04.045168 kernel: clk: Disabling unused clocks
Sep 3 23:25:04.045173 kernel: PM: genpd: Disabling unused power domains
Sep 3 23:25:04.045178 kernel: Warning: unable to open an initial console.
Sep 3 23:25:04.045183 kernel: Freeing unused kernel memory: 38976K
Sep 3 23:25:04.045187 kernel: Run /init as init process
Sep 3 23:25:04.045192 kernel:   with arguments:
Sep 3 23:25:04.045197 kernel:     /init
Sep 3 23:25:04.045202 kernel:   with environment:
Sep 3 23:25:04.045207 kernel:     HOME=/
Sep 3 23:25:04.045212 kernel:     TERM=linux
Sep 3 23:25:04.045216 kernel:     BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 3 23:25:04.045222 systemd[1]: Successfully made /usr/ read-only.
Sep 3 23:25:04.045229 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 3 23:25:04.045234 systemd[1]: Detected virtualization microsoft.
Sep 3 23:25:04.045240 systemd[1]: Detected architecture arm64.
Sep 3 23:25:04.045245 systemd[1]: Running in initrd.
Sep 3 23:25:04.045250 systemd[1]: No hostname configured, using default hostname.
Sep 3 23:25:04.045255 systemd[1]: Hostname set to .
Sep 3 23:25:04.045260 systemd[1]: Initializing machine ID from random generator.
Sep 3 23:25:04.045265 systemd[1]: Queued start job for default target initrd.target.
Sep 3 23:25:04.045271 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:04.045276 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 3 23:25:04.045282 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 3 23:25:04.045288 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 3 23:25:04.045293 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 3 23:25:04.045298 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 3 23:25:04.045304 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 3 23:25:04.045310 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 3 23:25:04.045315 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:04.045321 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:04.045326 systemd[1]: Reached target paths.target - Path Units.
Sep 3 23:25:04.045331 systemd[1]: Reached target slices.target - Slice Units.
Sep 3 23:25:04.045336 systemd[1]: Reached target swap.target - Swaps.
Sep 3 23:25:04.045341 systemd[1]: Reached target timers.target - Timer Units.
Sep 3 23:25:04.045346 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 3 23:25:04.045351 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 3 23:25:04.045356 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 3 23:25:04.045361 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 3 23:25:04.045367 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 3 23:25:04.045373 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 3 23:25:04.045378 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 3 23:25:04.045383 systemd[1]: Reached target sockets.target - Socket Units.
Sep 3 23:25:04.045388 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 3 23:25:04.045393 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 3 23:25:04.045398 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 3 23:25:04.045404 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 3 23:25:04.045410 systemd[1]: Starting systemd-fsck-usr.service...
Sep 3 23:25:04.045415 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 3 23:25:04.045420 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 3 23:25:04.045437 systemd-journald[224]: Collecting audit messages is disabled.
Sep 3 23:25:04.045451 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:04.045457 systemd-journald[224]: Journal started
Sep 3 23:25:04.045471 systemd-journald[224]: Runtime Journal (/run/log/journal/3c6d83acdfd447fab2c7bb694ee2ac39) is 8M, max 78.5M, 70.5M free.
Sep 3 23:25:04.053572 systemd-modules-load[226]: Inserted module 'overlay'
Sep 3 23:25:04.069897 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 3 23:25:04.069924 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 3 23:25:04.077491 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 3 23:25:04.087881 kernel: Bridge firewalling registered
Sep 3 23:25:04.088057 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:04.089617 systemd-modules-load[226]: Inserted module 'br_netfilter'
Sep 3 23:25:04.096064 systemd[1]: Finished systemd-fsck-usr.service.
Sep 3 23:25:04.105017 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 3 23:25:04.118237 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:04.132944 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 3 23:25:04.154283 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 3 23:25:04.158750 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 3 23:25:04.178264 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 3 23:25:04.190713 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 3 23:25:04.199676 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 3 23:25:04.212135 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 3 23:25:04.218633 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 3 23:25:04.225721 systemd-tmpfiles[249]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 3 23:25:04.240195 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 3 23:25:04.247392 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 3 23:25:04.267974 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 3 23:25:04.278955 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:04.290985 dracut-cmdline[259]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=cb633bb0c889435b58a5c40c9c9bc9d5899ece5018569c9fa08f911265d3f18e
Sep 3 23:25:04.334308 systemd-resolved[274]: Positive Trust Anchors:
Sep 3 23:25:04.334325 systemd-resolved[274]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 3 23:25:04.334344 systemd-resolved[274]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 3 23:25:04.336314 systemd-resolved[274]: Defaulting to hostname 'linux'.
Sep 3 23:25:04.337660 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 3 23:25:04.343582 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:04.417839 kernel: SCSI subsystem initialized
Sep 3 23:25:04.422837 kernel: Loading iSCSI transport class v2.0-870.
Sep 3 23:25:04.430844 kernel: iscsi: registered transport (tcp)
Sep 3 23:25:04.443889 kernel: iscsi: registered transport (qla4xxx)
Sep 3 23:25:04.443921 kernel: QLogic iSCSI HBA Driver
Sep 3 23:25:04.456399 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 3 23:25:04.476946 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:04.489745 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 3 23:25:04.531028 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 3 23:25:04.538942 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 3 23:25:04.597837 kernel: raid6: neonx8 gen() 18558 MB/s
Sep 3 23:25:04.615830 kernel: raid6: neonx4 gen() 18553 MB/s
Sep 3 23:25:04.634829 kernel: raid6: neonx2 gen() 17100 MB/s
Sep 3 23:25:04.653830 kernel: raid6: neonx1 gen() 15029 MB/s
Sep 3 23:25:04.673838 kernel: raid6: int64x8 gen() 10521 MB/s
Sep 3 23:25:04.692907 kernel: raid6: int64x4 gen() 10612 MB/s
Sep 3 23:25:04.711830 kernel: raid6: int64x2 gen() 8978 MB/s
Sep 3 23:25:04.733879 kernel: raid6: int64x1 gen() 7001 MB/s
Sep 3 23:25:04.733887 kernel: raid6: using algorithm neonx8 gen() 18558 MB/s
Sep 3 23:25:04.756090 kernel: raid6: .... xor() 14903 MB/s, rmw enabled
Sep 3 23:25:04.756134 kernel: raid6: using neon recovery algorithm
Sep 3 23:25:04.764065 kernel: xor: measuring software checksum speed
Sep 3 23:25:04.764107 kernel: 8regs : 28645 MB/sec
Sep 3 23:25:04.766698 kernel: 32regs : 28806 MB/sec
Sep 3 23:25:04.769258 kernel: arm64_neon : 37732 MB/sec
Sep 3 23:25:04.772229 kernel: xor: using function: arm64_neon (37732 MB/sec)
Sep 3 23:25:04.810846 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 3 23:25:04.815309 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 3 23:25:04.825173 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 3 23:25:04.854746 systemd-udevd[472]: Using default interface naming scheme 'v255'.
Sep 3 23:25:04.859158 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 3 23:25:04.871718 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 3 23:25:04.894575 dracut-pre-trigger[480]: rd.md=0: removing MD RAID activation
Sep 3 23:25:04.913062 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 3 23:25:04.923214 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 3 23:25:04.966092 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:04.976979 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 3 23:25:05.034345 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:05.038726 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:05.060007 kernel: hv_vmbus: Vmbus version:5.3
Sep 3 23:25:05.060025 kernel: hv_vmbus: registering driver hyperv_keyboard
Sep 3 23:25:05.060032 kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 3 23:25:05.060040 kernel: hv_vmbus: registering driver hid_hyperv
Sep 3 23:25:05.051723 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:05.097254 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Sep 3 23:25:05.097271 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Sep 3 23:25:05.097279 kernel: hv_vmbus: registering driver hv_netvsc
Sep 3 23:25:05.097285 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Sep 3 23:25:05.071964 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:05.120148 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Sep 3 23:25:05.120163 kernel: hv_vmbus: registering driver hv_storvsc
Sep 3 23:25:05.111210 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:05.130338 kernel: PTP clock support registered
Sep 3 23:25:05.128433 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:05.128780 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:05.156736 kernel: hv_utils: Registering HyperV Utility Driver
Sep 3 23:25:05.156750 kernel: hv_vmbus: registering driver hv_utils
Sep 3 23:25:05.156758 kernel: scsi host1: storvsc_host_t
Sep 3 23:25:05.156781 kernel: hv_utils: Heartbeat IC version 3.0
Sep 3 23:25:05.147712 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:05.492178 kernel: hv_utils: Shutdown IC version 3.2
Sep 3 23:25:05.492192 kernel: scsi host0: storvsc_host_t
Sep 3 23:25:05.492303 kernel: hv_utils: TimeSync IC version 4.0
Sep 3 23:25:05.492310 kernel: hv_netvsc 000d3afc-4736-000d-3afc-4736000d3afc eth0: VF slot 1 added
Sep 3 23:25:05.492757 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Sep 3 23:25:05.478380 systemd-resolved[274]: Clock change detected. Flushing caches.
Sep 3 23:25:05.504568 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Sep 3 23:25:05.512545 kernel: hv_vmbus: registering driver hv_pci
Sep 3 23:25:05.518922 kernel: hv_pci b4503121-9872-485f-9cc9-ebd40ab8e1c8: PCI VMBus probing: Using version 0x10004
Sep 3 23:25:05.519059 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Sep 3 23:25:05.525128 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Sep 3 23:25:05.526538 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 3 23:25:05.526635 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Sep 3 23:25:05.536989 kernel: hv_pci b4503121-9872-485f-9cc9-ebd40ab8e1c8: PCI host bridge to bus 9872:00
Sep 3 23:25:05.546548 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Sep 3 23:25:05.546635 kernel: pci_bus 9872:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Sep 3 23:25:05.546706 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#102 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:05.546764 kernel: pci_bus 9872:00: No busn resource found for root bus, will use [bus 00-ff]
Sep 3 23:25:05.555966 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#109 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:05.556072 kernel: pci 9872:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Sep 3 23:25:05.562922 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:05.588638 kernel: pci 9872:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Sep 3 23:25:05.588665 kernel: pci 9872:00:02.0: enabling Extended Tags
Sep 3 23:25:05.588675 kernel: pci 9872:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 9872:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Sep 3 23:25:05.592406 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 3 23:25:05.592422 kernel: pci_bus 9872:00: busn_res: [bus 00-ff] end is updated to 00
Sep 3 23:25:05.604239 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 3 23:25:05.604372 kernel: pci 9872:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Sep 3 23:25:05.613804 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Sep 3 23:25:05.613949 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 3 23:25:05.615546 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Sep 3 23:25:05.633575 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#85 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:05.655577 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#115 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:05.670727 kernel: mlx5_core 9872:00:02.0: enabling device (0000 -> 0002)
Sep 3 23:25:05.678104 kernel: mlx5_core 9872:00:02.0: PTM is not supported by PCIe
Sep 3 23:25:05.678212 kernel: mlx5_core 9872:00:02.0: firmware version: 16.30.5006
Sep 3 23:25:05.847737 kernel: hv_netvsc 000d3afc-4736-000d-3afc-4736000d3afc eth0: VF registering: eth1
Sep 3 23:25:05.847943 kernel: mlx5_core 9872:00:02.0 eth1: joined to eth0
Sep 3 23:25:05.853273 kernel: mlx5_core 9872:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Sep 3 23:25:05.862548 kernel: mlx5_core 9872:00:02.0 enP39026s1: renamed from eth1
Sep 3 23:25:06.319566 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Sep 3 23:25:06.356215 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 3 23:25:06.395159 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Sep 3 23:25:06.405457 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Sep 3 23:25:06.411625 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Sep 3 23:25:06.424204 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 3 23:25:06.434802 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 3 23:25:06.443880 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:06.453228 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 3 23:25:06.462211 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 3 23:25:06.481167 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 3 23:25:06.499551 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#142 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:06.510989 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 3 23:25:06.523543 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 3 23:25:07.535425 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#181 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:07.546515 disk-uuid[655]: The operation has completed successfully.
Sep 3 23:25:07.553056 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 3 23:25:07.615879 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 3 23:25:07.615958 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 3 23:25:07.641309 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 3 23:25:07.657851 sh[820]: Success
Sep 3 23:25:07.690714 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 3 23:25:07.690765 kernel: device-mapper: uevent: version 1.0.3
Sep 3 23:25:07.695624 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 3 23:25:07.705550 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 3 23:25:08.134439 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 3 23:25:08.144198 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 3 23:25:08.151738 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 3 23:25:08.176549 kernel: BTRFS: device fsid e8b97e78-d30f-4a41-b431-d82f3afef949 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (838)
Sep 3 23:25:08.186154 kernel: BTRFS info (device dm-0): first mount of filesystem e8b97e78-d30f-4a41-b431-d82f3afef949
Sep 3 23:25:08.186176 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:08.624795 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 3 23:25:08.624869 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 3 23:25:08.659164 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 3 23:25:08.663518 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 3 23:25:08.671233 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 3 23:25:08.671886 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 3 23:25:08.692196 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 3 23:25:08.722589 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (871)
Sep 3 23:25:08.732738 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:08.732764 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:08.781891 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 3 23:25:08.801580 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:08.801601 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:08.801608 kernel: BTRFS info (device sda6): last unmount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:08.795310 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 3 23:25:08.815785 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 3 23:25:08.822629 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 3 23:25:08.846165 systemd-networkd[1003]: lo: Link UP
Sep 3 23:25:08.846175 systemd-networkd[1003]: lo: Gained carrier
Sep 3 23:25:08.846863 systemd-networkd[1003]: Enumeration completed
Sep 3 23:25:08.847265 systemd-networkd[1003]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:08.847267 systemd-networkd[1003]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 3 23:25:08.848911 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 3 23:25:08.853471 systemd[1]: Reached target network.target - Network.
Sep 3 23:25:08.924545 kernel: mlx5_core 9872:00:02.0 enP39026s1: Link up
Sep 3 23:25:08.957064 systemd-networkd[1003]: enP39026s1: Link UP
Sep 3 23:25:08.959972 kernel: hv_netvsc 000d3afc-4736-000d-3afc-4736000d3afc eth0: Data path switched to VF: enP39026s1
Sep 3 23:25:08.957123 systemd-networkd[1003]: eth0: Link UP
Sep 3 23:25:08.957211 systemd-networkd[1003]: eth0: Gained carrier
Sep 3 23:25:08.957221 systemd-networkd[1003]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:08.974133 systemd-networkd[1003]: enP39026s1: Gained carrier
Sep 3 23:25:08.985556 systemd-networkd[1003]: eth0: DHCPv4 address 10.200.20.13/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 3 23:25:10.121587 ignition[1008]: Ignition 2.21.0
Sep 3 23:25:10.121599 ignition[1008]: Stage: fetch-offline
Sep 3 23:25:10.125628 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 3 23:25:10.121676 ignition[1008]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:10.133395 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 3 23:25:10.121682 ignition[1008]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:10.121764 ignition[1008]: parsed url from cmdline: ""
Sep 3 23:25:10.121766 ignition[1008]: no config URL provided
Sep 3 23:25:10.121769 ignition[1008]: reading system config file "/usr/lib/ignition/user.ign"
Sep 3 23:25:10.121774 ignition[1008]: no config at "/usr/lib/ignition/user.ign"
Sep 3 23:25:10.121778 ignition[1008]: failed to fetch config: resource requires networking
Sep 3 23:25:10.122182 ignition[1008]: Ignition finished successfully
Sep 3 23:25:10.165476 ignition[1017]: Ignition 2.21.0
Sep 3 23:25:10.165481 ignition[1017]: Stage: fetch
Sep 3 23:25:10.165872 ignition[1017]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:10.165881 ignition[1017]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:10.165958 ignition[1017]: parsed url from cmdline: ""
Sep 3 23:25:10.165960 ignition[1017]: no config URL provided
Sep 3 23:25:10.165964 ignition[1017]: reading system config file "/usr/lib/ignition/user.ign"
Sep 3 23:25:10.165973 ignition[1017]: no config at "/usr/lib/ignition/user.ign"
Sep 3 23:25:10.165994 ignition[1017]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Sep 3 23:25:10.264012 ignition[1017]: GET result: OK
Sep 3 23:25:10.264095 ignition[1017]: config has been read from IMDS userdata
Sep 3 23:25:10.266900 unknown[1017]: fetched base config from "system"
Sep 3 23:25:10.264117 ignition[1017]: parsing config with SHA512: c35030d9f9d7dd0bf7022d2cde309355479e9895693802def73ad6f8c55939047d46ad23de41d4a1f32f8852c83de75ac7e86aa59aa5437defe883af35b97103
Sep 3 23:25:10.266905 unknown[1017]: fetched base config from "system"
Sep 3 23:25:10.267159 ignition[1017]: fetch: fetch complete
Sep 3 23:25:10.266908 unknown[1017]: fetched user config from "azure"
Sep 3 23:25:10.267162 ignition[1017]: fetch: fetch passed
Sep 3 23:25:10.272383 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 3 23:25:10.267197 ignition[1017]: Ignition finished successfully
Sep 3 23:25:10.284389 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 3 23:25:10.314800 ignition[1023]: Ignition 2.21.0
Sep 3 23:25:10.314813 ignition[1023]: Stage: kargs
Sep 3 23:25:10.320478 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 3 23:25:10.314949 ignition[1023]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:10.327635 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 3 23:25:10.314956 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:10.315675 ignition[1023]: kargs: kargs passed
Sep 3 23:25:10.315713 ignition[1023]: Ignition finished successfully
Sep 3 23:25:10.360726 ignition[1030]: Ignition 2.21.0
Sep 3 23:25:10.363067 ignition[1030]: Stage: disks
Sep 3 23:25:10.363223 ignition[1030]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:10.368783 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 3 23:25:10.363230 ignition[1030]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:10.376144 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 3 23:25:10.363973 ignition[1030]: disks: disks passed
Sep 3 23:25:10.384057 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 3 23:25:10.364022 ignition[1030]: Ignition finished successfully
Sep 3 23:25:10.393504 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 3 23:25:10.401655 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 3 23:25:10.408201 systemd[1]: Reached target basic.target - Basic System.
Sep 3 23:25:10.417126 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 3 23:25:10.496326 systemd-fsck[1039]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Sep 3 23:25:10.503946 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 3 23:25:10.510017 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 3 23:25:10.882642 systemd-networkd[1003]: eth0: Gained IPv6LL
Sep 3 23:25:12.899548 kernel: EXT4-fs (sda9): mounted filesystem d953e3b7-a0cb-45f7-b3a7-216a9a578dda r/w with ordered data mode. Quota mode: none.
Sep 3 23:25:12.899752 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 3 23:25:12.903643 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 3 23:25:12.937885 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 3 23:25:12.955104 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 3 23:25:12.960518 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 3 23:25:12.971936 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 3 23:25:12.971966 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 3 23:25:13.013242 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1053)
Sep 3 23:25:13.013260 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:13.013267 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:12.990971 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 3 23:25:12.995651 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 3 23:25:13.028613 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:13.028646 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:13.030318 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 3 23:25:13.784853 coreos-metadata[1055]: Sep 03 23:25:13.784 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 3 23:25:13.791993 coreos-metadata[1055]: Sep 03 23:25:13.791 INFO Fetch successful
Sep 3 23:25:13.791993 coreos-metadata[1055]: Sep 03 23:25:13.791 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Sep 3 23:25:13.806085 coreos-metadata[1055]: Sep 03 23:25:13.800 INFO Fetch successful
Sep 3 23:25:13.806085 coreos-metadata[1055]: Sep 03 23:25:13.800 INFO wrote hostname ci-4372.1.0-n-989a023a05 to /sysroot/etc/hostname
Sep 3 23:25:13.806411 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 3 23:25:14.078638 initrd-setup-root[1083]: cut: /sysroot/etc/passwd: No such file or directory
Sep 3 23:25:14.181104 initrd-setup-root[1090]: cut: /sysroot/etc/group: No such file or directory
Sep 3 23:25:14.187943 initrd-setup-root[1097]: cut: /sysroot/etc/shadow: No such file or directory
Sep 3 23:25:14.210567 initrd-setup-root[1104]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 3 23:25:15.373820 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 3 23:25:15.383468 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 3 23:25:15.387443 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 3 23:25:15.405061 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 3 23:25:15.414584 kernel: BTRFS info (device sda6): last unmount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:15.431572 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 3 23:25:15.441044 ignition[1172]: INFO : Ignition 2.21.0
Sep 3 23:25:15.441044 ignition[1172]: INFO : Stage: mount
Sep 3 23:25:15.441044 ignition[1172]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:15.441044 ignition[1172]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:15.441044 ignition[1172]: INFO : mount: mount passed
Sep 3 23:25:15.441044 ignition[1172]: INFO : Ignition finished successfully
Sep 3 23:25:15.441055 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 3 23:25:15.446495 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 3 23:25:15.471394 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 3 23:25:15.503289 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1185)
Sep 3 23:25:15.503319 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:15.507916 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:15.516470 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:15.516481 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:15.517852 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 3 23:25:15.545316 ignition[1202]: INFO : Ignition 2.21.0
Sep 3 23:25:15.545316 ignition[1202]: INFO : Stage: files
Sep 3 23:25:15.552016 ignition[1202]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:15.552016 ignition[1202]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:15.552016 ignition[1202]: DEBUG : files: compiled without relabeling support, skipping
Sep 3 23:25:15.592355 ignition[1202]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 3 23:25:15.592355 ignition[1202]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 3 23:25:15.651734 ignition[1202]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 3 23:25:15.656942 ignition[1202]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 3 23:25:15.656942 ignition[1202]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 3 23:25:15.652066 unknown[1202]: wrote ssh authorized keys file for user: core
Sep 3 23:25:15.715204 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 3 23:25:15.722771 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Sep 3 23:25:15.746173 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 3 23:25:15.839559 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 3 23:25:15.848449 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 3 23:25:15.906731 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Sep 3 23:25:16.366622 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 3 23:25:16.591104 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 3 23:25:16.591104 ignition[1202]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 3 23:25:16.641303 ignition[1202]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 3 23:25:16.656529 ignition[1202]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 3 23:25:16.665616 ignition[1202]: INFO : files: files passed
Sep 3 23:25:16.665616 ignition[1202]: INFO : Ignition finished successfully
Sep 3 23:25:16.666563 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 3 23:25:16.675309 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 3 23:25:16.704095 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 3 23:25:16.714374 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 3 23:25:16.719820 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 3 23:25:16.745228 initrd-setup-root-after-ignition[1231]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:16.745228 initrd-setup-root-after-ignition[1231]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:16.758203 initrd-setup-root-after-ignition[1235]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:16.752246 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 3 23:25:16.763245 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 3 23:25:16.773978 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 3 23:25:16.824288 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 3 23:25:16.824393 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 3 23:25:16.833577 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 3 23:25:16.842843 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 3 23:25:16.850887 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 3 23:25:16.851520 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 3 23:25:16.880553 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 3 23:25:16.886756 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 3 23:25:16.911283 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:16.916110 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:16.925203 systemd[1]: Stopped target timers.target - Timer Units.
Sep 3 23:25:16.933109 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 3 23:25:16.933200 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 3 23:25:16.945113 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 3 23:25:16.950013 systemd[1]: Stopped target basic.target - Basic System.
Sep 3 23:25:16.958218 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 3 23:25:16.966507 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 3 23:25:16.974631 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 3 23:25:16.983292 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 3 23:25:16.992708 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 3 23:25:17.001736 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 3 23:25:17.011150 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 3 23:25:17.019031 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 3 23:25:17.028058 systemd[1]: Stopped target swap.target - Swaps.
Sep 3 23:25:17.035428 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 3 23:25:17.035541 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 3 23:25:17.047219 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:17.052558 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:17.062119 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 3 23:25:17.062186 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:17.072106 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 3 23:25:17.072200 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 3 23:25:17.087000 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 3 23:25:17.087094 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 3 23:25:17.092899 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 3 23:25:17.092983 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 3 23:25:17.101654 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 3 23:25:17.171466 ignition[1255]: INFO : Ignition 2.21.0
Sep 3 23:25:17.171466 ignition[1255]: INFO : Stage: umount
Sep 3 23:25:17.171466 ignition[1255]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:17.171466 ignition[1255]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:17.171466 ignition[1255]: INFO : umount: umount passed
Sep 3 23:25:17.171466 ignition[1255]: INFO : Ignition finished successfully
Sep 3 23:25:17.101729 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 3 23:25:17.112098 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 3 23:25:17.126414 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 3 23:25:17.126521 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:17.142357 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 3 23:25:17.152845 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 3 23:25:17.152956 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:17.166733 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 3 23:25:17.166845 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 3 23:25:17.181759 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 3 23:25:17.182484 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 3 23:25:17.184550 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 3 23:25:17.190429 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 3 23:25:17.190495 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 3 23:25:17.200428 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 3 23:25:17.200501 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 3 23:25:17.205925 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 3 23:25:17.205966 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 3 23:25:17.211578 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 3 23:25:17.211615 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 3 23:25:17.223073 systemd[1]: Stopped target network.target - Network. Sep 3 23:25:17.231149 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 3 23:25:17.231197 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 3 23:25:17.238987 systemd[1]: Stopped target paths.target - Path Units. Sep 3 23:25:17.247347 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 3 23:25:17.251014 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 3 23:25:17.256285 systemd[1]: Stopped target slices.target - Slice Units. Sep 3 23:25:17.263524 systemd[1]: Stopped target sockets.target - Socket Units. Sep 3 23:25:17.272079 systemd[1]: iscsid.socket: Deactivated successfully. Sep 3 23:25:17.272118 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 3 23:25:17.280359 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 3 23:25:17.280390 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Sep 3 23:25:17.289642 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 3 23:25:17.289689 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 3 23:25:17.298666 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 3 23:25:17.298691 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 3 23:25:17.306978 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 3 23:25:17.508232 kernel: hv_netvsc 000d3afc-4736-000d-3afc-4736000d3afc eth0: Data path switched from VF: enP39026s1 Sep 3 23:25:17.314982 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 3 23:25:17.328248 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 3 23:25:17.328340 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 3 23:25:17.341732 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 3 23:25:17.341887 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 3 23:25:17.343552 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 3 23:25:17.354907 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 3 23:25:17.355781 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 3 23:25:17.362775 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 3 23:25:17.362805 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 3 23:25:17.372255 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 3 23:25:17.389732 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 3 23:25:17.389785 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 3 23:25:17.394910 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 3 23:25:17.394944 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
Sep 3 23:25:17.406567 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 3 23:25:17.406603 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 3 23:25:17.410887 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 3 23:25:17.410918 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 3 23:25:17.424963 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 3 23:25:17.432627 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 3 23:25:17.432677 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 3 23:25:17.471000 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 3 23:25:17.471153 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 3 23:25:17.480739 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 3 23:25:17.482549 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 3 23:25:17.495196 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 3 23:25:17.495240 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 3 23:25:17.503818 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 3 23:25:17.503843 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 3 23:25:17.512526 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 3 23:25:17.512578 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 3 23:25:17.526633 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 3 23:25:17.526685 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 3 23:25:17.541065 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Sep 3 23:25:17.541111 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 3 23:25:17.553813 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 3 23:25:17.553858 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 3 23:25:17.563573 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 3 23:25:17.582852 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 3 23:25:17.582912 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:17.592157 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 3 23:25:17.592201 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:17.601378 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:17.601419 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:17.785537 systemd-journald[224]: Received SIGTERM from PID 1 (systemd).
Sep 3 23:25:17.611488 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 3 23:25:17.611561 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 3 23:25:17.611589 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:17.611815 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 3 23:25:17.611890 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 3 23:25:17.618651 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 3 23:25:17.618725 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 3 23:25:17.628012 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 3 23:25:17.636929 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 3 23:25:17.674060 systemd[1]: Switching root.
Sep 3 23:25:17.832906 systemd-journald[224]: Journal stopped
Sep 3 23:25:25.300009 kernel: SELinux: policy capability network_peer_controls=1
Sep 3 23:25:25.300028 kernel: SELinux: policy capability open_perms=1
Sep 3 23:25:25.300035 kernel: SELinux: policy capability extended_socket_class=1
Sep 3 23:25:25.300040 kernel: SELinux: policy capability always_check_network=0
Sep 3 23:25:25.300046 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 3 23:25:25.300051 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 3 23:25:25.300057 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 3 23:25:25.300063 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 3 23:25:25.300068 kernel: SELinux: policy capability userspace_initial_context=0
Sep 3 23:25:25.300073 kernel: audit: type=1403 audit(1756941918.788:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 3 23:25:25.300080 systemd[1]: Successfully loaded SELinux policy in 132.480ms.
Sep 3 23:25:25.300087 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.852ms.
Sep 3 23:25:25.300094 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 3 23:25:25.300100 systemd[1]: Detected virtualization microsoft.
Sep 3 23:25:25.300106 systemd[1]: Detected architecture arm64.
Sep 3 23:25:25.300113 systemd[1]: Detected first boot.
Sep 3 23:25:25.300119 systemd[1]: Hostname set to .
Sep 3 23:25:25.300125 systemd[1]: Initializing machine ID from random generator.
Sep 3 23:25:25.300133 zram_generator::config[1298]: No configuration found.
Sep 3 23:25:25.300139 kernel: NET: Registered PF_VSOCK protocol family
Sep 3 23:25:25.300144 systemd[1]: Populated /etc with preset unit settings.
Sep 3 23:25:25.300151 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 3 23:25:25.300158 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 3 23:25:25.300164 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 3 23:25:25.300170 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 3 23:25:25.300176 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 3 23:25:25.300182 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 3 23:25:25.300188 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 3 23:25:25.300194 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 3 23:25:25.300201 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 3 23:25:25.300207 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 3 23:25:25.300213 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 3 23:25:25.300219 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 3 23:25:25.300225 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:25.300231 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 3 23:25:25.300237 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 3 23:25:25.300243 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 3 23:25:25.300249 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 3 23:25:25.300256 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 3 23:25:25.300263 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 3 23:25:25.300270 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:25.300276 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:25.300283 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 3 23:25:25.300289 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 3 23:25:25.300295 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 3 23:25:25.300302 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 3 23:25:25.300308 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:25.300314 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 3 23:25:25.300320 systemd[1]: Reached target slices.target - Slice Units.
Sep 3 23:25:25.300326 systemd[1]: Reached target swap.target - Swaps.
Sep 3 23:25:25.300332 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 3 23:25:25.300338 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 3 23:25:25.300346 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 3 23:25:25.300352 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 3 23:25:25.300358 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 3 23:25:25.300364 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 3 23:25:25.300370 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 3 23:25:25.300376 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 3 23:25:25.300383 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 3 23:25:25.300389 systemd[1]: Mounting media.mount - External Media Directory...
Sep 3 23:25:25.300395 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 3 23:25:25.300402 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 3 23:25:25.300408 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 3 23:25:25.300414 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 3 23:25:25.300420 systemd[1]: Reached target machines.target - Containers.
Sep 3 23:25:25.300426 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 3 23:25:25.300433 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 3 23:25:25.300440 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 3 23:25:25.300446 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 3 23:25:25.300452 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 3 23:25:25.300458 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 3 23:25:25.300464 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 3 23:25:25.300470 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 3 23:25:25.300476 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 3 23:25:25.300482 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 3 23:25:25.300489 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 3 23:25:25.300495 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 3 23:25:25.300501 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 3 23:25:25.300507 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 3 23:25:25.300514 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 3 23:25:25.300520 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 3 23:25:25.300536 kernel: fuse: init (API version 7.41)
Sep 3 23:25:25.300542 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 3 23:25:25.300549 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 3 23:25:25.300555 kernel: loop: module loaded
Sep 3 23:25:25.300560 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 3 23:25:25.300567 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 3 23:25:25.300573 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 3 23:25:25.300579 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 3 23:25:25.300585 systemd[1]: Stopped verity-setup.service.
Sep 3 23:25:25.300602 systemd-journald[1378]: Collecting audit messages is disabled.
Sep 3 23:25:25.300618 kernel: ACPI: bus type drm_connector registered
Sep 3 23:25:25.300624 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 3 23:25:25.300631 systemd-journald[1378]: Journal started
Sep 3 23:25:25.300645 systemd-journald[1378]: Runtime Journal (/run/log/journal/de4713bb722e4b058aeba336c37ca6bb) is 8M, max 78.5M, 70.5M free.
Sep 3 23:25:24.461107 systemd[1]: Queued start job for default target multi-user.target.
Sep 3 23:25:24.473931 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 3 23:25:24.474284 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 3 23:25:24.474554 systemd[1]: systemd-journald.service: Consumed 2.408s CPU time.
Sep 3 23:25:25.318679 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 3 23:25:25.320214 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 3 23:25:25.325105 systemd[1]: Mounted media.mount - External Media Directory.
Sep 3 23:25:25.329337 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 3 23:25:25.334356 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 3 23:25:25.339421 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 3 23:25:25.344202 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 3 23:25:25.349161 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:25.355126 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 3 23:25:25.355260 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 3 23:25:25.360648 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 3 23:25:25.364841 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 3 23:25:25.370006 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 3 23:25:25.370133 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 3 23:25:25.374510 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 3 23:25:25.374812 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 3 23:25:25.380395 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 3 23:25:25.380547 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 3 23:25:25.385844 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 3 23:25:25.385965 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 3 23:25:25.394050 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 3 23:25:25.399319 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:25.405036 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 3 23:25:25.411073 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 3 23:25:25.416943 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:25.431915 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 3 23:25:25.438210 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 3 23:25:25.452177 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 3 23:25:25.457455 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 3 23:25:25.457485 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 3 23:25:25.462518 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 3 23:25:25.468625 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 3 23:25:25.473156 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 3 23:25:25.487787 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 3 23:25:25.501106 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 3 23:25:25.506467 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 3 23:25:25.507188 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 3 23:25:25.512205 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 3 23:25:25.512952 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 3 23:25:25.521055 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 3 23:25:25.528658 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 3 23:25:25.535687 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 3 23:25:25.542891 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 3 23:25:25.551770 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 3 23:25:25.559089 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 3 23:25:25.565095 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 3 23:25:25.574826 systemd-journald[1378]: Time spent on flushing to /var/log/journal/de4713bb722e4b058aeba336c37ca6bb is 8.572ms for 941 entries.
Sep 3 23:25:25.574826 systemd-journald[1378]: System Journal (/var/log/journal/de4713bb722e4b058aeba336c37ca6bb) is 8M, max 2.6G, 2.6G free.
Sep 3 23:25:25.593912 kernel: loop0: detected capacity change from 0 to 211168
Sep 3 23:25:25.593930 systemd-journald[1378]: Received client request to flush runtime journal.
Sep 3 23:25:25.595449 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 3 23:25:25.630154 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 3 23:25:25.632590 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 3 23:25:25.649143 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 3 23:25:25.661648 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 3 23:25:25.711551 kernel: loop1: detected capacity change from 0 to 28936
Sep 3 23:25:26.170862 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 3 23:25:26.176816 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 3 23:25:26.300555 kernel: loop2: detected capacity change from 0 to 138376
Sep 3 23:25:26.343303 systemd-tmpfiles[1454]: ACLs are not supported, ignoring.
Sep 3 23:25:26.343313 systemd-tmpfiles[1454]: ACLs are not supported, ignoring.
Sep 3 23:25:26.360217 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:26.871559 kernel: loop3: detected capacity change from 0 to 107312
Sep 3 23:25:27.353560 kernel: loop4: detected capacity change from 0 to 211168
Sep 3 23:25:27.370546 kernel: loop5: detected capacity change from 0 to 28936
Sep 3 23:25:27.381544 kernel: loop6: detected capacity change from 0 to 138376
Sep 3 23:25:27.398547 kernel: loop7: detected capacity change from 0 to 107312
Sep 3 23:25:27.404903 (sd-merge)[1460]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 3 23:25:27.405238 (sd-merge)[1460]: Merged extensions into '/usr'.
Sep 3 23:25:27.408576 systemd[1]: Reload requested from client PID 1437 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 3 23:25:27.408770 systemd[1]: Reloading...
Sep 3 23:25:27.460558 zram_generator::config[1485]: No configuration found.
Sep 3 23:25:27.527405 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:25:27.605275 systemd[1]: Reloading finished in 196 ms.
Sep 3 23:25:27.634579 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 3 23:25:27.646352 systemd[1]: Starting ensure-sysext.service...
Sep 3 23:25:27.649920 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 3 23:25:27.691156 systemd[1]: Reload requested from client PID 1541 ('systemctl') (unit ensure-sysext.service)...
Sep 3 23:25:27.691169 systemd[1]: Reloading...
Sep 3 23:25:27.718668 systemd-tmpfiles[1542]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 3 23:25:27.718693 systemd-tmpfiles[1542]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 3 23:25:27.718900 systemd-tmpfiles[1542]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 3 23:25:27.719040 systemd-tmpfiles[1542]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 3 23:25:27.719458 systemd-tmpfiles[1542]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 3 23:25:27.720859 systemd-tmpfiles[1542]: ACLs are not supported, ignoring.
Sep 3 23:25:27.720972 systemd-tmpfiles[1542]: ACLs are not supported, ignoring.
Sep 3 23:25:27.743554 zram_generator::config[1570]: No configuration found.
Sep 3 23:25:27.777935 systemd-tmpfiles[1542]: Detected autofs mount point /boot during canonicalization of boot.
Sep 3 23:25:27.777944 systemd-tmpfiles[1542]: Skipping /boot
Sep 3 23:25:27.786018 systemd-tmpfiles[1542]: Detected autofs mount point /boot during canonicalization of boot.
Sep 3 23:25:27.786029 systemd-tmpfiles[1542]: Skipping /boot
Sep 3 23:25:27.810080 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:25:27.872173 systemd[1]: Reloading finished in 180 ms.
Sep 3 23:25:27.895561 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 3 23:25:27.909437 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 3 23:25:27.954873 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 3 23:25:27.964157 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 3 23:25:27.971211 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 3 23:25:27.975713 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 3 23:25:27.988997 systemd[1]: Finished ensure-sysext.service.
Sep 3 23:25:27.993012 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 3 23:25:27.998027 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 3 23:25:27.998850 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 3 23:25:28.011654 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 3 23:25:28.018651 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 3 23:25:28.025189 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 3 23:25:28.029558 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 3 23:25:28.029672 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 3 23:25:28.029777 systemd[1]: Reached target time-set.target - System Time Set.
Sep 3 23:25:28.036512 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 3 23:25:28.036649 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 3 23:25:28.041375 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 3 23:25:28.042566 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 3 23:25:28.046986 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 3 23:25:28.047104 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 3 23:25:28.052829 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 3 23:25:28.052936 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 3 23:25:28.065383 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 3 23:25:28.071309 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 3 23:25:28.071358 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 3 23:25:28.072454 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 3 23:25:28.135665 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 3 23:25:28.229746 systemd-resolved[1631]: Positive Trust Anchors:
Sep 3 23:25:28.229756 systemd-resolved[1631]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 3 23:25:28.229776 systemd-resolved[1631]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 3 23:25:28.291304 systemd-resolved[1631]: Using system hostname 'ci-4372.1.0-n-989a023a05'.
Sep 3 23:25:28.292411 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 3 23:25:28.297885 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:28.353688 augenrules[1668]: No rules
Sep 3 23:25:28.354938 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 3 23:25:28.356575 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 3 23:25:28.377364 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 3 23:25:29.538572 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 3 23:25:29.545846 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 3 23:25:29.572360 systemd-udevd[1676]: Using default interface naming scheme 'v255'.
Sep 3 23:25:30.554385 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 3 23:25:30.566277 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 3 23:25:30.675861 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 3 23:25:30.726590 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 3 23:25:30.745841 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 3 23:25:30.770627 kernel: mousedev: PS/2 mouse device common for all mice
Sep 3 23:25:30.770683 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#187 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:30.769517 systemd-networkd[1697]: lo: Link UP
Sep 3 23:25:30.769526 systemd-networkd[1697]: lo: Gained carrier
Sep 3 23:25:30.770504 systemd-networkd[1697]: Enumeration completed
Sep 3 23:25:30.770759 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 3 23:25:30.771332 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:30.771404 systemd-networkd[1697]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 3 23:25:30.784212 systemd[1]: Reached target network.target - Network.
Sep 3 23:25:30.791651 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 3 23:25:30.800644 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 3 23:25:30.808974 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
Sep 3 23:25:30.819567 kernel: hv_vmbus: registering driver hv_balloon
Sep 3 23:25:30.819621 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Sep 3 23:25:30.819634 kernel: hv_balloon: Memory hot add disabled on ARM64
Sep 3 23:25:30.826227 kernel: hv_vmbus: registering driver hyperv_fb
Sep 3 23:25:30.840322 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:30.842545 kernel: mlx5_core 9872:00:02.0 enP39026s1: Link up
Sep 3 23:25:30.854730 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:30.854874 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:30.879381 kernel: hv_netvsc 000d3afc-4736-000d-3afc-4736000d3afc eth0: Data path switched to VF: enP39026s1
Sep 3 23:25:30.879636 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Sep 3 23:25:30.879655 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Sep 3 23:25:30.880186 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:30.888142 kernel: Console: switching to colour dummy device 80x25
Sep 3 23:25:30.895904 kernel: Console: switching to colour frame buffer device 128x48
Sep 3 23:25:30.894975 systemd-networkd[1697]: enP39026s1: Link UP
Sep 3 23:25:30.895112 systemd-networkd[1697]: eth0: Link UP
Sep 3 23:25:30.895115 systemd-networkd[1697]: eth0: Gained carrier
Sep 3 23:25:30.895132 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:30.904375 systemd-networkd[1697]: enP39026s1: Gained carrier
Sep 3 23:25:30.906168 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 3 23:25:30.913612 systemd-networkd[1697]: eth0: DHCPv4 address 10.200.20.13/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 3 23:25:30.915624 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:30.915771 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:30.924079 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:30.925667 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:30.998254 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 3 23:25:31.005630 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 3 23:25:31.038551 kernel: MACsec IEEE 802.1AE
Sep 3 23:25:31.106276 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 3 23:25:32.499257 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:32.770760 systemd-networkd[1697]: eth0: Gained IPv6LL
Sep 3 23:25:32.773602 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 3 23:25:32.779515 systemd[1]: Reached target network-online.target - Network is Online.
Sep 3 23:25:35.338214 ldconfig[1432]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 3 23:25:35.346890 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 3 23:25:35.353289 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 3 23:25:35.382836 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 3 23:25:35.387846 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 3 23:25:35.392274 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 3 23:25:35.397224 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 3 23:25:35.403382 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 3 23:25:35.407806 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 3 23:25:35.413388 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 3 23:25:35.418507 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 3 23:25:35.418533 systemd[1]: Reached target paths.target - Path Units.
Sep 3 23:25:35.422164 systemd[1]: Reached target timers.target - Timer Units.
Sep 3 23:25:35.454925 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 3 23:25:35.460456 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 3 23:25:35.466811 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 3 23:25:35.472245 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 3 23:25:35.477661 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 3 23:25:35.483575 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 3 23:25:35.487856 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 3 23:25:35.493008 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 3 23:25:35.497411 systemd[1]: Reached target sockets.target - Socket Units.
Sep 3 23:25:35.501430 systemd[1]: Reached target basic.target - Basic System.
Sep 3 23:25:35.505160 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 3 23:25:35.505178 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 3 23:25:35.539287 systemd[1]: Starting chronyd.service - NTP client/server...
Sep 3 23:25:35.554619 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 3 23:25:35.560044 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 3 23:25:35.565791 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 3 23:25:35.578544 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 3 23:25:35.585453 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 3 23:25:35.591505 (chronyd)[1828]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Sep 3 23:25:35.592667 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 3 23:25:35.597325 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 3 23:25:35.601823 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Sep 3 23:25:35.606992 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Sep 3 23:25:35.607726 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:25:35.608243 jq[1836]: false
Sep 3 23:25:35.613401 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 3 23:25:35.618384 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 3 23:25:35.631690 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 3 23:25:35.637096 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 3 23:25:35.642704 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 3 23:25:35.649427 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 3 23:25:35.655092 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 3 23:25:35.655501 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 3 23:25:35.657733 systemd[1]: Starting update-engine.service - Update Engine...
Sep 3 23:25:35.665316 KVP[1838]: KVP starting; pid is:1838
Sep 3 23:25:35.666271 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 3 23:25:35.672154 chronyd[1857]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Sep 3 23:25:35.677913 KVP[1838]: KVP LIC Version: 3.1
Sep 3 23:25:35.678648 kernel: hv_utils: KVP IC version 4.0
Sep 3 23:25:35.679261 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 3 23:25:35.687130 extend-filesystems[1837]: Found /dev/sda6
Sep 3 23:25:35.689904 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 3 23:25:35.707600 jq[1853]: true
Sep 3 23:25:35.690087 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 3 23:25:35.698520 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 3 23:25:35.698718 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 3 23:25:35.708732 systemd[1]: motdgen.service: Deactivated successfully.
Sep 3 23:25:35.709934 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 3 23:25:35.722589 chronyd[1857]: Timezone right/UTC failed leap second check, ignoring
Sep 3 23:25:35.722725 chronyd[1857]: Loaded seccomp filter (level 2)
Sep 3 23:25:35.723809 extend-filesystems[1837]: Found /dev/sda9
Sep 3 23:25:35.729494 extend-filesystems[1837]: Checking size of /dev/sda9
Sep 3 23:25:35.728958 systemd[1]: Started chronyd.service - NTP client/server.
Sep 3 23:25:35.739830 jq[1868]: true
Sep 3 23:25:35.737779 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 3 23:25:35.745823 (ntainerd)[1869]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 3 23:25:35.754456 systemd-logind[1851]: New seat seat0.
Sep 3 23:25:35.758792 systemd-logind[1851]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 3 23:25:35.759609 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 3 23:25:35.767007 update_engine[1852]: I20250903 23:25:35.766647 1852 main.cc:92] Flatcar Update Engine starting
Sep 3 23:25:35.774616 extend-filesystems[1837]: Old size kept for /dev/sda9
Sep 3 23:25:35.779727 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 3 23:25:35.779910 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 3 23:25:35.805779 tar[1867]: linux-arm64/LICENSE
Sep 3 23:25:35.805779 tar[1867]: linux-arm64/helm
Sep 3 23:25:35.857160 bash[1913]: Updated "/home/core/.ssh/authorized_keys"
Sep 3 23:25:35.857591 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 3 23:25:35.866884 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 3 23:25:35.958414 sshd_keygen[1870]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 3 23:25:36.001573 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 3 23:25:36.009524 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 3 23:25:36.016644 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Sep 3 23:25:36.036792 systemd[1]: issuegen.service: Deactivated successfully.
Sep 3 23:25:36.036960 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 3 23:25:36.045336 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 3 23:25:36.056127 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Sep 3 23:25:36.102975 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 3 23:25:36.115691 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 3 23:25:36.123665 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 3 23:25:36.132401 systemd[1]: Reached target getty.target - Login Prompts.
Sep 3 23:25:36.223864 dbus-daemon[1831]: [system] SELinux support is enabled
Sep 3 23:25:36.224256 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 3 23:25:36.235697 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 3 23:25:36.235725 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 3 23:25:36.242284 dbus-daemon[1831]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 3 23:25:36.245296 update_engine[1852]: I20250903 23:25:36.244467 1852 update_check_scheduler.cc:74] Next update check in 10m13s
Sep 3 23:25:36.245857 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 3 23:25:36.245879 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 3 23:25:36.254410 systemd[1]: Started update-engine.service - Update Engine.
Sep 3 23:25:36.264731 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 3 23:25:36.328493 coreos-metadata[1830]: Sep 03 23:25:36.328 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 3 23:25:36.331660 coreos-metadata[1830]: Sep 03 23:25:36.331 INFO Fetch successful
Sep 3 23:25:36.331737 coreos-metadata[1830]: Sep 03 23:25:36.331 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Sep 3 23:25:36.336316 coreos-metadata[1830]: Sep 03 23:25:36.336 INFO Fetch successful
Sep 3 23:25:36.336316 coreos-metadata[1830]: Sep 03 23:25:36.336 INFO Fetching http://168.63.129.16/machine/125bbce3-aa50-481c-913b-9a3b16d9a6d2/e2e38e57%2D22cb%2D42ae%2Dab6b%2D1638e8ab9652.%5Fci%2D4372.1.0%2Dn%2D989a023a05?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Sep 3 23:25:36.339644 coreos-metadata[1830]: Sep 03 23:25:36.339 INFO Fetch successful
Sep 3 23:25:36.339846 coreos-metadata[1830]: Sep 03 23:25:36.339 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Sep 3 23:25:36.347669 tar[1867]: linux-arm64/README.md
Sep 3 23:25:36.349460 coreos-metadata[1830]: Sep 03 23:25:36.349 INFO Fetch successful
Sep 3 23:25:36.363904 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 3 23:25:36.490140 containerd[1869]: time="2025-09-03T23:25:36Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 3 23:25:36.491547 containerd[1869]: time="2025-09-03T23:25:36.491328596Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Sep 3 23:25:36.498312 containerd[1869]: time="2025-09-03T23:25:36.498280268Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.56µs"
Sep 3 23:25:36.498312 containerd[1869]: time="2025-09-03T23:25:36.498306404Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 3 23:25:36.498389 containerd[1869]: time="2025-09-03T23:25:36.498320988Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 3 23:25:36.498458 containerd[1869]: time="2025-09-03T23:25:36.498443036Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 3 23:25:36.498480 containerd[1869]: time="2025-09-03T23:25:36.498458868Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 3 23:25:36.498480 containerd[1869]: time="2025-09-03T23:25:36.498477980Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498579 containerd[1869]: time="2025-09-03T23:25:36.498516260Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498579 containerd[1869]: time="2025-09-03T23:25:36.498539892Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498717 containerd[1869]: time="2025-09-03T23:25:36.498696148Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498717 containerd[1869]: time="2025-09-03T23:25:36.498711276Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498750 containerd[1869]: time="2025-09-03T23:25:36.498718948Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498750 containerd[1869]: time="2025-09-03T23:25:36.498724444Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498792 containerd[1869]: time="2025-09-03T23:25:36.498781012Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498943 containerd[1869]: time="2025-09-03T23:25:36.498928172Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 3 23:25:36.498959 containerd[1869]: time="2025-09-03T23:25:36.498952420Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 3 23:25:36.499051 containerd[1869]: time="2025-09-03T23:25:36.498958716Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 3 23:25:36.499051 containerd[1869]: time="2025-09-03T23:25:36.498981732Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 3 23:25:36.499126 containerd[1869]: time="2025-09-03T23:25:36.499105204Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 3 23:25:36.499175 containerd[1869]: time="2025-09-03T23:25:36.499159524Z" level=info msg="metadata content store policy set" policy=shared
Sep 3 23:25:36.511200 containerd[1869]: time="2025-09-03T23:25:36.511170708Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 3 23:25:36.511200 containerd[1869]: time="2025-09-03T23:25:36.511212740Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 3 23:25:36.511200 containerd[1869]: time="2025-09-03T23:25:36.511223028Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511241340Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511249988Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511258452Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511266412Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511273964Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511281988Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511288596Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511294244Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 3 23:25:36.511328 containerd[1869]: time="2025-09-03T23:25:36.511302668Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 3 23:25:36.511439 containerd[1869]: time="2025-09-03T23:25:36.511396644Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 3 23:25:36.511439 containerd[1869]: time="2025-09-03T23:25:36.511412916Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 3 23:25:36.511439 containerd[1869]: time="2025-09-03T23:25:36.511423580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 3 23:25:36.511471 containerd[1869]: time="2025-09-03T23:25:36.511431732Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 3 23:25:36.511471 containerd[1869]: time="2025-09-03T23:25:36.511450492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 3 23:25:36.511471 containerd[1869]: time="2025-09-03T23:25:36.511457172Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 3 23:25:36.511471 containerd[1869]: time="2025-09-03T23:25:36.511464468Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 3 23:25:36.511471 containerd[1869]: time="2025-09-03T23:25:36.511471044Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 3 23:25:36.511591 containerd[1869]: time="2025-09-03T23:25:36.511479580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 3 23:25:36.511591 containerd[1869]: time="2025-09-03T23:25:36.511486508Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 3 23:25:36.511591 containerd[1869]: time="2025-09-03T23:25:36.511493188Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 3 23:25:36.511591 containerd[1869]: time="2025-09-03T23:25:36.511576844Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 3 23:25:36.511591 containerd[1869]: time="2025-09-03T23:25:36.511588868Z" level=info msg="Start snapshots syncer"
Sep 3 23:25:36.511734 containerd[1869]: time="2025-09-03T23:25:36.511611004Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 3 23:25:36.511790 containerd[1869]: time="2025-09-03T23:25:36.511759708Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 3 23:25:36.511880 containerd[1869]: time="2025-09-03T23:25:36.511797140Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 3 23:25:36.511880 containerd[1869]: time="2025-09-03T23:25:36.511856860Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 3 23:25:36.511964 containerd[1869]: time="2025-09-03T23:25:36.511948052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 3 23:25:36.511981 containerd[1869]: time="2025-09-03T23:25:36.511969988Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 3 23:25:36.511981 containerd[1869]: time="2025-09-03T23:25:36.511977276Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 3 23:25:36.512004 containerd[1869]: time="2025-09-03T23:25:36.511985172Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 3 23:25:36.512004 containerd[1869]: time="2025-09-03T23:25:36.511994004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 3 23:25:36.512004 containerd[1869]: time="2025-09-03T23:25:36.512001828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 3 23:25:36.512043 containerd[1869]: time="2025-09-03T23:25:36.512011676Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 3 23:25:36.512043 containerd[1869]: time="2025-09-03T23:25:36.512030924Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 3 23:25:36.512043 containerd[1869]: time="2025-09-03T23:25:36.512038420Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 3 23:25:36.512091 containerd[1869]: time="2025-09-03T23:25:36.512045564Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 3 23:25:36.512091 containerd[1869]: time="2025-09-03T23:25:36.512071716Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 3 23:25:36.512091 containerd[1869]: time="2025-09-03T23:25:36.512080644Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 3 23:25:36.512091 containerd[1869]: time="2025-09-03T23:25:36.512085780Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512091644Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512096604Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512102428Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512109252Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512121396Z" level=info msg="runtime interface created"
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512124884Z" level=info msg="created NRI interface"
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512132332Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512141276Z" level=info msg="Connect containerd service"
Sep 3 23:25:36.512213 containerd[1869]: time="2025-09-03T23:25:36.512164644Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 3 23:25:36.512722 containerd[1869]: time="2025-09-03T23:25:36.512698812Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 3 23:25:36.520157 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:25:36.680828 (kubelet)[2011]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:25:36.692013 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 3 23:25:36.697394 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 3 23:25:36.796763 locksmithd[1995]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 3 23:25:37.060597 kubelet[2011]: E0903 23:25:37.060443 2011 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:25:37.062625 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:25:37.062731 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:25:37.063092 systemd[1]: kubelet.service: Consumed 543ms CPU time, 257.6M memory peak.
Sep 3 23:25:37.355109 containerd[1869]: time="2025-09-03T23:25:37.355010116Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 3 23:25:37.355328 containerd[1869]: time="2025-09-03T23:25:37.355125676Z" level=info msg="Start subscribing containerd event"
Sep 3 23:25:37.355398 containerd[1869]: time="2025-09-03T23:25:37.355388884Z" level=info msg="Start recovering state"
Sep 3 23:25:37.355507 containerd[1869]: time="2025-09-03T23:25:37.355496324Z" level=info msg="Start event monitor"
Sep 3 23:25:37.355579 containerd[1869]: time="2025-09-03T23:25:37.355567028Z" level=info msg="Start cni network conf syncer for default"
Sep 3 23:25:37.355638 containerd[1869]: time="2025-09-03T23:25:37.355626444Z" level=info msg="Start streaming server"
Sep 3 23:25:37.355687 containerd[1869]: time="2025-09-03T23:25:37.355676268Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 3 23:25:37.355771 containerd[1869]: time="2025-09-03T23:25:37.355757636Z" level=info msg="runtime interface starting up..."
Sep 3 23:25:37.355815 containerd[1869]: time="2025-09-03T23:25:37.355805284Z" level=info msg="starting plugins..."
Sep 3 23:25:37.355866 containerd[1869]: time="2025-09-03T23:25:37.355857444Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 3 23:25:37.356083 containerd[1869]: time="2025-09-03T23:25:37.356064004Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 3 23:25:37.356345 systemd[1]: Started containerd.service - containerd container runtime. Sep 3 23:25:37.356457 containerd[1869]: time="2025-09-03T23:25:37.356438084Z" level=info msg="containerd successfully booted in 0.866704s" Sep 3 23:25:37.362583 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 3 23:25:37.370601 systemd[1]: Startup finished in 1.590s (kernel) + 14.727s (initrd) + 18.713s (userspace) = 35.031s. Sep 3 23:25:37.965909 login[1990]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:25:37.966053 login[1991]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:25:37.975899 systemd-logind[1851]: New session 1 of user core. Sep 3 23:25:37.977009 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 3 23:25:37.977945 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 3 23:25:37.980085 systemd-logind[1851]: New session 2 of user core. Sep 3 23:25:38.025559 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 3 23:25:38.027152 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 3 23:25:38.052999 (systemd)[2044]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 3 23:25:38.055057 systemd-logind[1851]: New session c1 of user core. 
Sep 3 23:25:38.258730 waagent[1988]: 2025-09-03T23:25:38.258665Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Sep 3 23:25:38.266393 waagent[1988]: 2025-09-03T23:25:38.263145Z INFO Daemon Daemon OS: flatcar 4372.1.0 Sep 3 23:25:38.266667 waagent[1988]: 2025-09-03T23:25:38.266628Z INFO Daemon Daemon Python: 3.11.12 Sep 3 23:25:38.270565 waagent[1988]: 2025-09-03T23:25:38.270019Z INFO Daemon Daemon Run daemon Sep 3 23:25:38.275560 waagent[1988]: 2025-09-03T23:25:38.274951Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4372.1.0' Sep 3 23:25:38.282390 waagent[1988]: 2025-09-03T23:25:38.282343Z INFO Daemon Daemon Using waagent for provisioning Sep 3 23:25:38.286151 waagent[1988]: 2025-09-03T23:25:38.286116Z INFO Daemon Daemon Activate resource disk Sep 3 23:25:38.289775 waagent[1988]: 2025-09-03T23:25:38.289738Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Sep 3 23:25:38.298547 waagent[1988]: 2025-09-03T23:25:38.297974Z INFO Daemon Daemon Found device: None Sep 3 23:25:38.301258 waagent[1988]: 2025-09-03T23:25:38.301224Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Sep 3 23:25:38.307618 waagent[1988]: 2025-09-03T23:25:38.307574Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Sep 3 23:25:38.316045 waagent[1988]: 2025-09-03T23:25:38.316004Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 3 23:25:38.320240 waagent[1988]: 2025-09-03T23:25:38.320208Z INFO Daemon Daemon Running default provisioning handler Sep 3 23:25:38.329536 waagent[1988]: 2025-09-03T23:25:38.329208Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Sep 3 23:25:38.341744 waagent[1988]: 2025-09-03T23:25:38.341703Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Sep 3 23:25:38.348746 waagent[1988]: 2025-09-03T23:25:38.348711Z INFO Daemon Daemon cloud-init is enabled: False Sep 3 23:25:38.352885 waagent[1988]: 2025-09-03T23:25:38.352850Z INFO Daemon Daemon Copying ovf-env.xml Sep 3 23:25:38.373601 systemd[2044]: Queued start job for default target default.target. Sep 3 23:25:38.394261 systemd[2044]: Created slice app.slice - User Application Slice. Sep 3 23:25:38.394587 systemd[2044]: Reached target paths.target - Paths. Sep 3 23:25:38.394628 systemd[2044]: Reached target timers.target - Timers. Sep 3 23:25:38.395625 systemd[2044]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 3 23:25:38.402615 systemd[2044]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 3 23:25:38.402729 systemd[2044]: Reached target sockets.target - Sockets. Sep 3 23:25:38.402770 systemd[2044]: Reached target basic.target - Basic System. Sep 3 23:25:38.402792 systemd[2044]: Reached target default.target - Main User Target. Sep 3 23:25:38.402812 systemd[2044]: Startup finished in 343ms. Sep 3 23:25:38.402877 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 3 23:25:38.412675 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 3 23:25:38.413220 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 3 23:25:38.459094 waagent[1988]: 2025-09-03T23:25:38.456575Z INFO Daemon Daemon Successfully mounted dvd Sep 3 23:25:38.482899 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. 
Sep 3 23:25:38.484721 waagent[1988]: 2025-09-03T23:25:38.484673Z INFO Daemon Daemon Detect protocol endpoint Sep 3 23:25:38.488135 waagent[1988]: 2025-09-03T23:25:38.488105Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 3 23:25:38.492810 waagent[1988]: 2025-09-03T23:25:38.492763Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Sep 3 23:25:38.498030 waagent[1988]: 2025-09-03T23:25:38.497987Z INFO Daemon Daemon Test for route to 168.63.129.16 Sep 3 23:25:38.501961 waagent[1988]: 2025-09-03T23:25:38.501840Z INFO Daemon Daemon Route to 168.63.129.16 exists Sep 3 23:25:38.505479 waagent[1988]: 2025-09-03T23:25:38.505451Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Sep 3 23:25:38.587386 waagent[1988]: 2025-09-03T23:25:38.587305Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Sep 3 23:25:38.596041 waagent[1988]: 2025-09-03T23:25:38.592831Z INFO Daemon Daemon Wire protocol version:2012-11-30 Sep 3 23:25:38.596822 waagent[1988]: 2025-09-03T23:25:38.596795Z INFO Daemon Daemon Server preferred version:2015-04-05 Sep 3 23:25:38.696652 waagent[1988]: 2025-09-03T23:25:38.696577Z INFO Daemon Daemon Initializing goal state during protocol detection Sep 3 23:25:38.701610 waagent[1988]: 2025-09-03T23:25:38.701575Z INFO Daemon Daemon Forcing an update of the goal state. 
Sep 3 23:25:38.709167 waagent[1988]: 2025-09-03T23:25:38.709124Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 3 23:25:38.765627 waagent[1988]: 2025-09-03T23:25:38.765600Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Sep 3 23:25:38.770074 waagent[1988]: 2025-09-03T23:25:38.770040Z INFO Daemon Sep 3 23:25:38.772189 waagent[1988]: 2025-09-03T23:25:38.772162Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 64c1c933-b242-4267-8f5c-134354cc37aa eTag: 6551842607142672528 source: Fabric] Sep 3 23:25:38.780597 waagent[1988]: 2025-09-03T23:25:38.780567Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Sep 3 23:25:38.785350 waagent[1988]: 2025-09-03T23:25:38.785321Z INFO Daemon Sep 3 23:25:38.787417 waagent[1988]: 2025-09-03T23:25:38.787393Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Sep 3 23:25:38.796295 waagent[1988]: 2025-09-03T23:25:38.796269Z INFO Daemon Daemon Downloading artifacts profile blob Sep 3 23:25:38.857607 waagent[1988]: 2025-09-03T23:25:38.856591Z INFO Daemon Downloaded certificate {'thumbprint': '14D5058C7703BE9772610DFE3DD8C32E48DE5450', 'hasPrivateKey': True} Sep 3 23:25:38.863805 waagent[1988]: 2025-09-03T23:25:38.863771Z INFO Daemon Fetch goal state completed Sep 3 23:25:38.872510 waagent[1988]: 2025-09-03T23:25:38.872479Z INFO Daemon Daemon Starting provisioning Sep 3 23:25:38.876776 waagent[1988]: 2025-09-03T23:25:38.876748Z INFO Daemon Daemon Handle ovf-env.xml. 
Sep 3 23:25:38.880372 waagent[1988]: 2025-09-03T23:25:38.880350Z INFO Daemon Daemon Set hostname [ci-4372.1.0-n-989a023a05] Sep 3 23:25:38.914678 waagent[1988]: 2025-09-03T23:25:38.914645Z INFO Daemon Daemon Publish hostname [ci-4372.1.0-n-989a023a05] Sep 3 23:25:38.919242 waagent[1988]: 2025-09-03T23:25:38.919209Z INFO Daemon Daemon Examine /proc/net/route for primary interface Sep 3 23:25:38.923698 waagent[1988]: 2025-09-03T23:25:38.923670Z INFO Daemon Daemon Primary interface is [eth0] Sep 3 23:25:38.932741 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 3 23:25:38.932747 systemd-networkd[1697]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 3 23:25:38.932771 systemd-networkd[1697]: eth0: DHCP lease lost Sep 3 23:25:38.934077 waagent[1988]: 2025-09-03T23:25:38.934037Z INFO Daemon Daemon Create user account if not exists Sep 3 23:25:38.938130 waagent[1988]: 2025-09-03T23:25:38.938101Z INFO Daemon Daemon User core already exists, skip useradd Sep 3 23:25:38.942192 waagent[1988]: 2025-09-03T23:25:38.942169Z INFO Daemon Daemon Configure sudoer Sep 3 23:25:38.948012 waagent[1988]: 2025-09-03T23:25:38.947977Z INFO Daemon Daemon Configure sshd Sep 3 23:25:38.953320 waagent[1988]: 2025-09-03T23:25:38.953285Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Sep 3 23:25:38.962713 waagent[1988]: 2025-09-03T23:25:38.962640Z INFO Daemon Daemon Deploy ssh public key. 
Sep 3 23:25:38.973584 systemd-networkd[1697]: eth0: DHCPv4 address 10.200.20.13/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 3 23:25:40.110761 waagent[1988]: 2025-09-03T23:25:40.110707Z INFO Daemon Daemon Provisioning complete Sep 3 23:25:40.123921 waagent[1988]: 2025-09-03T23:25:40.123882Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Sep 3 23:25:40.128446 waagent[1988]: 2025-09-03T23:25:40.128416Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Sep 3 23:25:40.135572 waagent[1988]: 2025-09-03T23:25:40.135548Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Sep 3 23:25:40.231383 waagent[2094]: 2025-09-03T23:25:40.231326Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Sep 3 23:25:40.232052 waagent[2094]: 2025-09-03T23:25:40.231744Z INFO ExtHandler ExtHandler OS: flatcar 4372.1.0 Sep 3 23:25:40.232052 waagent[2094]: 2025-09-03T23:25:40.231800Z INFO ExtHandler ExtHandler Python: 3.11.12 Sep 3 23:25:40.232052 waagent[2094]: 2025-09-03T23:25:40.231835Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Sep 3 23:25:40.295611 waagent[2094]: 2025-09-03T23:25:40.295562Z INFO ExtHandler ExtHandler Distro: flatcar-4372.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Sep 3 23:25:40.295857 waagent[2094]: 2025-09-03T23:25:40.295826Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 3 23:25:40.295999 waagent[2094]: 2025-09-03T23:25:40.295972Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 3 23:25:40.304372 waagent[2094]: 2025-09-03T23:25:40.304323Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 3 23:25:40.321367 waagent[2094]: 2025-09-03T23:25:40.321330Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Sep 3 23:25:40.321888 
waagent[2094]: 2025-09-03T23:25:40.321851Z INFO ExtHandler Sep 3 23:25:40.322018 waagent[2094]: 2025-09-03T23:25:40.321992Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 4a503a24-cc9d-4b5d-a07e-cd4237501a18 eTag: 6551842607142672528 source: Fabric] Sep 3 23:25:40.322324 waagent[2094]: 2025-09-03T23:25:40.322291Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Sep 3 23:25:40.322876 waagent[2094]: 2025-09-03T23:25:40.322838Z INFO ExtHandler Sep 3 23:25:40.322998 waagent[2094]: 2025-09-03T23:25:40.322972Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Sep 3 23:25:40.328281 waagent[2094]: 2025-09-03T23:25:40.328247Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Sep 3 23:25:40.381595 waagent[2094]: 2025-09-03T23:25:40.380974Z INFO ExtHandler Downloaded certificate {'thumbprint': '14D5058C7703BE9772610DFE3DD8C32E48DE5450', 'hasPrivateKey': True} Sep 3 23:25:40.381595 waagent[2094]: 2025-09-03T23:25:40.381334Z INFO ExtHandler Fetch goal state completed Sep 3 23:25:40.392611 waagent[2094]: 2025-09-03T23:25:40.392568Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) Sep 3 23:25:40.395791 waagent[2094]: 2025-09-03T23:25:40.395747Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2094 Sep 3 23:25:40.395889 waagent[2094]: 2025-09-03T23:25:40.395864Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Sep 3 23:25:40.396123 waagent[2094]: 2025-09-03T23:25:40.396097Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Sep 3 23:25:40.397165 waagent[2094]: 2025-09-03T23:25:40.397130Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4372.1.0', '', 'Flatcar Container Linux by Kinvolk'] Sep 3 23:25:40.397478 waagent[2094]: 2025-09-03T23:25:40.397449Z INFO 
ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4372.1.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Sep 3 23:25:40.397621 waagent[2094]: 2025-09-03T23:25:40.397596Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Sep 3 23:25:40.398052 waagent[2094]: 2025-09-03T23:25:40.398022Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Sep 3 23:25:40.477409 waagent[2094]: 2025-09-03T23:25:40.477083Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Sep 3 23:25:40.477409 waagent[2094]: 2025-09-03T23:25:40.477233Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Sep 3 23:25:40.481296 waagent[2094]: 2025-09-03T23:25:40.481277Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Sep 3 23:25:40.485839 systemd[1]: Reload requested from client PID 2109 ('systemctl') (unit waagent.service)... Sep 3 23:25:40.485853 systemd[1]: Reloading... Sep 3 23:25:40.555552 zram_generator::config[2147]: No configuration found. Sep 3 23:25:40.621524 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 3 23:25:40.703259 systemd[1]: Reloading finished in 217 ms. Sep 3 23:25:40.718548 waagent[2094]: 2025-09-03T23:25:40.717927Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Sep 3 23:25:40.718548 waagent[2094]: 2025-09-03T23:25:40.718066Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Sep 3 23:25:42.177742 waagent[2094]: 2025-09-03T23:25:42.177662Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Sep 3 23:25:42.178097 waagent[2094]: 2025-09-03T23:25:42.177977Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Sep 3 23:25:42.178649 waagent[2094]: 2025-09-03T23:25:42.178607Z INFO ExtHandler ExtHandler Starting env monitor service. Sep 3 23:25:42.178945 waagent[2094]: 2025-09-03T23:25:42.178909Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Sep 3 23:25:42.179154 waagent[2094]: 2025-09-03T23:25:42.179109Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Sep 3 23:25:42.179310 waagent[2094]: 2025-09-03T23:25:42.179257Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Sep 3 23:25:42.179729 waagent[2094]: 2025-09-03T23:25:42.179689Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Sep 3 23:25:42.179899 waagent[2094]: 2025-09-03T23:25:42.179852Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Sep 3 23:25:42.180137 waagent[2094]: 2025-09-03T23:25:42.180057Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 3 23:25:42.180650 waagent[2094]: 2025-09-03T23:25:42.180608Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 3 23:25:42.180762 waagent[2094]: 2025-09-03T23:25:42.180731Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 3 23:25:42.180853 waagent[2094]: 2025-09-03T23:25:42.180830Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Sep 3 23:25:42.182137 waagent[2094]: 2025-09-03T23:25:42.181998Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 3 23:25:42.182318 waagent[2094]: 2025-09-03T23:25:42.182284Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Sep 3 23:25:42.182692 waagent[2094]: 2025-09-03T23:25:42.182605Z INFO EnvHandler ExtHandler Configure routes
Sep 3 23:25:42.182782 waagent[2094]: 2025-09-03T23:25:42.182755Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Sep 3 23:25:42.182782 waagent[2094]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Sep 3 23:25:42.182782 waagent[2094]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Sep 3 23:25:42.182782 waagent[2094]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Sep 3 23:25:42.182782 waagent[2094]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:42.182782 waagent[2094]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:42.182782 waagent[2094]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:42.183268 waagent[2094]: 2025-09-03T23:25:42.183186Z INFO EnvHandler ExtHandler Gateway:None
Sep 3 23:25:42.183268 waagent[2094]: 2025-09-03T23:25:42.183239Z INFO EnvHandler ExtHandler Routes:None
Sep 3 23:25:42.187024 waagent[2094]: 2025-09-03T23:25:42.186959Z INFO ExtHandler ExtHandler
Sep 3 23:25:42.187085 waagent[2094]: 2025-09-03T23:25:42.187065Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: aa17a395-f9b7-4a2d-9e42-9f53faa5bd63 correlation 3be3cea3-0372-4436-b416-68dafa2b7a23 created: 2025-09-03T23:24:15.312157Z]
Sep 3 23:25:42.187351 waagent[2094]: 2025-09-03T23:25:42.187319Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Sep 3 23:25:42.187760 waagent[2094]: 2025-09-03T23:25:42.187735Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms]
Sep 3 23:25:42.214684 waagent[2094]: 2025-09-03T23:25:42.214639Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
Sep 3 23:25:42.214684 waagent[2094]: Try `iptables -h' or 'iptables --help' for more information.)
Sep 3 23:25:42.214972 waagent[2094]: 2025-09-03T23:25:42.214936Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 3684B028-20F6-4195-9725-633BE4714DE3;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Sep 3 23:25:42.262115 waagent[2094]: 2025-09-03T23:25:42.262070Z INFO MonitorHandler ExtHandler Network interfaces:
Sep 3 23:25:42.262115 waagent[2094]: Executing ['ip', '-a', '-o', 'link']:
Sep 3 23:25:42.262115 waagent[2094]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Sep 3 23:25:42.262115 waagent[2094]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:47:36 brd ff:ff:ff:ff:ff:ff
Sep 3 23:25:42.262115 waagent[2094]: 3: enP39026s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:47:36 brd ff:ff:ff:ff:ff:ff\ altname enP39026p0s2
Sep 3 23:25:42.262115 waagent[2094]: Executing ['ip', '-4', '-a', '-o', 'address']:
Sep 3 23:25:42.262115 waagent[2094]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Sep 3 23:25:42.262115 waagent[2094]: 2: eth0 inet 10.200.20.13/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Sep 3 23:25:42.262115 waagent[2094]: Executing ['ip', '-6', '-a', '-o', 'address']:
Sep 3 23:25:42.262115 waagent[2094]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Sep 3 23:25:42.262115 waagent[2094]: 2: eth0 inet6 fe80::20d:3aff:fefc:4736/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Sep 3 23:25:42.319333 waagent[2094]: 2025-09-03T23:25:42.319285Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Sep 3 23:25:42.319333 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:42.319333 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.319333 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:42.319333 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.319333 waagent[2094]: Chain OUTPUT (policy ACCEPT 3 packets, 164 bytes)
Sep 3 23:25:42.319333 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.319333 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 3 23:25:42.319333 waagent[2094]: 3 535 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 3 23:25:42.319333 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 3 23:25:42.321643 waagent[2094]: 2025-09-03T23:25:42.321600Z INFO EnvHandler ExtHandler Current Firewall rules:
Sep 3 23:25:42.321643 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:42.321643 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.321643 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:42.321643 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.321643 waagent[2094]: Chain OUTPUT (policy ACCEPT 3 packets, 164 bytes)
Sep 3 23:25:42.321643 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:42.321643 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 3 23:25:42.321643 waagent[2094]: 4 587 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 3 23:25:42.321643 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 3 23:25:42.321821 waagent[2094]: 2025-09-03T23:25:42.321796Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Sep 3 23:25:47.254737 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 3 23:25:47.256089 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:25:47.349666 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:25:47.353769 (kubelet)[2242]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:25:47.481178 kubelet[2242]: E0903 23:25:47.481128 2242 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:25:47.484044 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:25:47.484156 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:25:47.484638 systemd[1]: kubelet.service: Consumed 106ms CPU time, 106.3M memory peak.
Sep 3 23:25:57.504890 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
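The /proc/net/route dump logged by MonitorHandler above prints the Destination, Gateway, and Mask columns as native-endian hex words. A short sketch (assuming Python on a little-endian host, which matches this aarch64 VM) to turn those fields back into dotted quads:

```python
import socket
import struct

def decode_route_addr(hex_field: str) -> str:
    """Decode a /proc/net/route address field (hex of a native
    little-endian u32) into a dotted-quad IPv4 string."""
    return socket.inet_ntoa(struct.pack("<L", int(hex_field, 16)))

# Fields taken from the routing table dump above:
print(decode_route_addr("00000000"))  # default route destination -> 0.0.0.0
print(decode_route_addr("0114C80A"))  # gateway -> 10.200.20.1
print(decode_route_addr("0014C80A"))  # local subnet -> 10.200.20.0
print(decode_route_addr("10813FA8"))  # Azure wireserver -> 168.63.129.16
print(decode_route_addr("FEA9FEA9"))  # IMDS -> 169.254.169.254
```

This also explains the host routes the agent adds: 10813FA8 and FEA9FEA9 are the wireserver and instance-metadata endpoints seen elsewhere in this log.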
Sep 3 23:25:57.506194 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 3 23:25:57.598675 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 3 23:25:57.603722 (kubelet)[2257]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 3 23:25:57.730883 kubelet[2257]: E0903 23:25:57.730832 2257 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 3 23:25:57.733051 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 3 23:25:57.733280 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 3 23:25:57.735601 systemd[1]: kubelet.service: Consumed 100ms CPU time, 105M memory peak. Sep 3 23:25:59.510920 chronyd[1857]: Selected source PHC0 Sep 3 23:26:02.372705 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 3 23:26:02.373947 systemd[1]: Started sshd@0-10.200.20.13:22-10.200.16.10:51950.service - OpenSSH per-connection server daemon (10.200.16.10:51950). Sep 3 23:26:03.075114 sshd[2264]: Accepted publickey for core from 10.200.16.10 port 51950 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:03.076258 sshd-session[2264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:03.080354 systemd-logind[1851]: New session 3 of user core. Sep 3 23:26:03.095655 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 3 23:26:03.522163 systemd[1]: Started sshd@1-10.200.20.13:22-10.200.16.10:51958.service - OpenSSH per-connection server daemon (10.200.16.10:51958). 
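The recurring "kubelet.service: Scheduled restart job, restart counter is at N" messages, spaced roughly ten seconds after each failure, are systemd's Restart= handling re-launching the crashing unit. A drop-in consistent with that behavior would look like the following sketch (path and values are illustrative, not read from this image):

```ini
# /etc/systemd/system/kubelet.service.d/10-restart.conf  (hypothetical drop-in)
[Service]
Restart=always
RestartSec=10
```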
Sep 3 23:26:04.017082 sshd[2269]: Accepted publickey for core from 10.200.16.10 port 51958 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:04.018186 sshd-session[2269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:04.021910 systemd-logind[1851]: New session 4 of user core. Sep 3 23:26:04.030844 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 3 23:26:04.368181 sshd[2271]: Connection closed by 10.200.16.10 port 51958 Sep 3 23:26:04.368724 sshd-session[2269]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:04.371717 systemd[1]: sshd@1-10.200.20.13:22-10.200.16.10:51958.service: Deactivated successfully. Sep 3 23:26:04.372982 systemd[1]: session-4.scope: Deactivated successfully. Sep 3 23:26:04.373573 systemd-logind[1851]: Session 4 logged out. Waiting for processes to exit. Sep 3 23:26:04.374661 systemd-logind[1851]: Removed session 4. Sep 3 23:26:04.455892 systemd[1]: Started sshd@2-10.200.20.13:22-10.200.16.10:51960.service - OpenSSH per-connection server daemon (10.200.16.10:51960). Sep 3 23:26:04.950155 sshd[2277]: Accepted publickey for core from 10.200.16.10 port 51960 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:04.951242 sshd-session[2277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:04.955697 systemd-logind[1851]: New session 5 of user core. Sep 3 23:26:04.961671 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 3 23:26:05.311206 sshd[2279]: Connection closed by 10.200.16.10 port 51960 Sep 3 23:26:05.310457 sshd-session[2277]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:05.313303 systemd[1]: sshd@2-10.200.20.13:22-10.200.16.10:51960.service: Deactivated successfully. Sep 3 23:26:05.314795 systemd[1]: session-5.scope: Deactivated successfully. Sep 3 23:26:05.315479 systemd-logind[1851]: Session 5 logged out. Waiting for processes to exit. 
Sep 3 23:26:05.316520 systemd-logind[1851]: Removed session 5. Sep 3 23:26:05.393750 systemd[1]: Started sshd@3-10.200.20.13:22-10.200.16.10:51966.service - OpenSSH per-connection server daemon (10.200.16.10:51966). Sep 3 23:26:05.832999 sshd[2285]: Accepted publickey for core from 10.200.16.10 port 51966 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:05.834097 sshd-session[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:05.837688 systemd-logind[1851]: New session 6 of user core. Sep 3 23:26:05.840825 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 3 23:26:06.146065 sshd[2287]: Connection closed by 10.200.16.10 port 51966 Sep 3 23:26:06.145902 sshd-session[2285]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:06.148935 systemd[1]: sshd@3-10.200.20.13:22-10.200.16.10:51966.service: Deactivated successfully. Sep 3 23:26:06.151636 systemd[1]: session-6.scope: Deactivated successfully. Sep 3 23:26:06.152175 systemd-logind[1851]: Session 6 logged out. Waiting for processes to exit. Sep 3 23:26:06.153263 systemd-logind[1851]: Removed session 6. Sep 3 23:26:06.245007 systemd[1]: Started sshd@4-10.200.20.13:22-10.200.16.10:51980.service - OpenSSH per-connection server daemon (10.200.16.10:51980). Sep 3 23:26:06.738828 sshd[2293]: Accepted publickey for core from 10.200.16.10 port 51980 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:06.739841 sshd-session[2293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:06.743491 systemd-logind[1851]: New session 7 of user core. Sep 3 23:26:06.750653 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 3 23:26:07.214104 sudo[2296]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 3 23:26:07.214329 sudo[2296]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 3 23:26:07.239478 sudo[2296]: pam_unix(sudo:session): session closed for user root Sep 3 23:26:07.329292 sshd[2295]: Connection closed by 10.200.16.10 port 51980 Sep 3 23:26:07.329881 sshd-session[2293]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:07.333171 systemd[1]: sshd@4-10.200.20.13:22-10.200.16.10:51980.service: Deactivated successfully. Sep 3 23:26:07.334361 systemd[1]: session-7.scope: Deactivated successfully. Sep 3 23:26:07.335315 systemd-logind[1851]: Session 7 logged out. Waiting for processes to exit. Sep 3 23:26:07.336363 systemd-logind[1851]: Removed session 7. Sep 3 23:26:07.417710 systemd[1]: Started sshd@5-10.200.20.13:22-10.200.16.10:51984.service - OpenSSH per-connection server daemon (10.200.16.10:51984). Sep 3 23:26:07.754737 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 3 23:26:07.756759 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 3 23:26:07.855239 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 3 23:26:07.857525 (kubelet)[2312]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 3 23:26:07.911359 sshd[2302]: Accepted publickey for core from 10.200.16.10 port 51984 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:07.912056 sshd-session[2302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:07.916994 systemd-logind[1851]: New session 8 of user core. Sep 3 23:26:07.922653 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 3 23:26:07.981462 kubelet[2312]: E0903 23:26:07.981421 2312 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:26:07.983372 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:26:07.983474 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:26:07.983871 systemd[1]: kubelet.service: Consumed 100ms CPU time, 105.5M memory peak.
Sep 3 23:26:08.186149 sudo[2321]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 3 23:26:08.186956 sudo[2321]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:08.273151 sudo[2321]: pam_unix(sudo:session): session closed for user root
Sep 3 23:26:08.276775 sudo[2320]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 3 23:26:08.276971 sudo[2320]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:08.283882 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 3 23:26:08.310549 augenrules[2343]: No rules
Sep 3 23:26:08.311667 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 3 23:26:08.313562 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 3 23:26:08.315082 sudo[2320]: pam_unix(sudo:session): session closed for user root
Sep 3 23:26:08.404769 sshd[2317]: Connection closed by 10.200.16.10 port 51984
Sep 3 23:26:08.404696 sshd-session[2302]: pam_unix(sshd:session): session closed for user core
Sep 3 23:26:08.407680 systemd-logind[1851]: Session 8 logged out. Waiting for processes to exit.
Sep 3 23:26:08.407971 systemd[1]: sshd@5-10.200.20.13:22-10.200.16.10:51984.service: Deactivated successfully.
Sep 3 23:26:08.409255 systemd[1]: session-8.scope: Deactivated successfully.
Sep 3 23:26:08.410627 systemd-logind[1851]: Removed session 8.
Sep 3 23:26:08.495679 systemd[1]: Started sshd@6-10.200.20.13:22-10.200.16.10:52000.service - OpenSSH per-connection server daemon (10.200.16.10:52000).
Sep 3 23:26:08.988852 sshd[2352]: Accepted publickey for core from 10.200.16.10 port 52000 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:26:08.989914 sshd-session[2352]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:26:08.993347 systemd-logind[1851]: New session 9 of user core.
Sep 3 23:26:09.004639 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 3 23:26:09.264079 sudo[2355]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 3 23:26:09.264284 sudo[2355]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:10.738357 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 3 23:26:10.747780 (dockerd)[2372]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 3 23:26:11.695221 dockerd[2372]: time="2025-09-03T23:26:11.695166885Z" level=info msg="Starting up"
Sep 3 23:26:11.695882 dockerd[2372]: time="2025-09-03T23:26:11.695859213Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 3 23:26:11.774949 dockerd[2372]: time="2025-09-03T23:26:11.774918925Z" level=info msg="Loading containers: start."
Sep 3 23:26:11.849549 kernel: Initializing XFRM netlink socket
Sep 3 23:26:12.289777 systemd-networkd[1697]: docker0: Link UP
Sep 3 23:26:12.304462 dockerd[2372]: time="2025-09-03T23:26:12.304402024Z" level=info msg="Loading containers: done."
Sep 3 23:26:12.314497 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3183481113-merged.mount: Deactivated successfully.
Sep 3 23:26:12.323692 dockerd[2372]: time="2025-09-03T23:26:12.323661184Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 3 23:26:12.323775 dockerd[2372]: time="2025-09-03T23:26:12.323720850Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Sep 3 23:26:12.323873 dockerd[2372]: time="2025-09-03T23:26:12.323857103Z" level=info msg="Initializing buildkit"
Sep 3 23:26:12.375791 dockerd[2372]: time="2025-09-03T23:26:12.375761591Z" level=info msg="Completed buildkit initialization"
Sep 3 23:26:12.379089 dockerd[2372]: time="2025-09-03T23:26:12.379059043Z" level=info msg="Daemon has completed initialization"
Sep 3 23:26:12.379600 dockerd[2372]: time="2025-09-03T23:26:12.379145886Z" level=info msg="API listen on /run/docker.sock"
Sep 3 23:26:12.380082 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 3 23:26:13.348492 containerd[1869]: time="2025-09-03T23:26:13.348450898Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\""
Sep 3 23:26:14.242754 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2993168504.mount: Deactivated successfully.
Sep 3 23:26:15.412437 containerd[1869]: time="2025-09-03T23:26:15.412382237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.414497 containerd[1869]: time="2025-09-03T23:26:15.414465230Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352613"
Sep 3 23:26:15.418121 containerd[1869]: time="2025-09-03T23:26:15.418083166Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.421574 containerd[1869]: time="2025-09-03T23:26:15.421539256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.422256 containerd[1869]: time="2025-09-03T23:26:15.422093067Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 2.073568191s"
Sep 3 23:26:15.422256 containerd[1869]: time="2025-09-03T23:26:15.422119804Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\""
Sep 3 23:26:15.423327 containerd[1869]: time="2025-09-03T23:26:15.423305230Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\""
Sep 3 23:26:17.013021 containerd[1869]: time="2025-09-03T23:26:17.012969398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.015885 containerd[1869]: time="2025-09-03T23:26:17.015858141Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536977"
Sep 3 23:26:17.019396 containerd[1869]: time="2025-09-03T23:26:17.019361201Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.029309 containerd[1869]: time="2025-09-03T23:26:17.029278807Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.030345 containerd[1869]: time="2025-09-03T23:26:17.030319249Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 1.606990466s"
Sep 3 23:26:17.030382 containerd[1869]: time="2025-09-03T23:26:17.030349002Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\""
Sep 3 23:26:17.030942 containerd[1869]: time="2025-09-03T23:26:17.030922621Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\""
Sep 3 23:26:18.004691 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 3 23:26:18.005950 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:18.101877 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:18.105314 (kubelet)[2642]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:26:18.219450 kubelet[2642]: E0903 23:26:18.219408 2642 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:26:18.222115 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:26:18.222354 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:26:18.222769 systemd[1]: kubelet.service: Consumed 105ms CPU time, 105.2M memory peak.
Sep 3 23:26:18.631275 containerd[1869]: time="2025-09-03T23:26:18.630687079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:18.955785 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 3 23:26:19.349070 containerd[1869]: time="2025-09-03T23:26:19.349025904Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292014"
Sep 3 23:26:19.507725 containerd[1869]: time="2025-09-03T23:26:19.507657731Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:19.514041 containerd[1869]: time="2025-09-03T23:26:19.514005100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:19.514598 containerd[1869]: time="2025-09-03T23:26:19.514578143Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 2.48363061s"
Sep 3 23:26:19.514714 containerd[1869]: time="2025-09-03T23:26:19.514648449Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\""
Sep 3 23:26:19.515155 containerd[1869]: time="2025-09-03T23:26:19.515121161Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\""
Sep 3 23:26:20.620341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4115597809.mount: Deactivated successfully.
Sep 3 23:26:20.901930 containerd[1869]: time="2025-09-03T23:26:20.901706152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:20.905822 containerd[1869]: time="2025-09-03T23:26:20.905777542Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199959"
Sep 3 23:26:20.908496 containerd[1869]: time="2025-09-03T23:26:20.908472494Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:20.912016 containerd[1869]: time="2025-09-03T23:26:20.911987938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:20.912367 containerd[1869]: time="2025-09-03T23:26:20.912344678Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.397094432s"
Sep 3 23:26:20.912387 containerd[1869]: time="2025-09-03T23:26:20.912370479Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\""
Sep 3 23:26:20.913138 containerd[1869]: time="2025-09-03T23:26:20.913115319Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 3 23:26:21.348620 update_engine[1852]: I20250903 23:26:21.348569 1852 update_attempter.cc:509] Updating boot flags...
Sep 3 23:26:21.590676 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2901523841.mount: Deactivated successfully.
Sep 3 23:26:22.453569 containerd[1869]: time="2025-09-03T23:26:22.452941734Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:22.455198 containerd[1869]: time="2025-09-03T23:26:22.455030299Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117"
Sep 3 23:26:22.458391 containerd[1869]: time="2025-09-03T23:26:22.458365696Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:22.462537 containerd[1869]: time="2025-09-03T23:26:22.462500064Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:22.463633 containerd[1869]: time="2025-09-03T23:26:22.463202039Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.550059535s"
Sep 3 23:26:22.463633 containerd[1869]: time="2025-09-03T23:26:22.463226088Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Sep 3 23:26:22.463828 containerd[1869]: time="2025-09-03T23:26:22.463725401Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 3 23:26:23.010756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1591368015.mount: Deactivated successfully.
Sep 3 23:26:23.034245 containerd[1869]: time="2025-09-03T23:26:23.033786758Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:23.036541 containerd[1869]: time="2025-09-03T23:26:23.036517840Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 3 23:26:23.039945 containerd[1869]: time="2025-09-03T23:26:23.039921376Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:23.043323 containerd[1869]: time="2025-09-03T23:26:23.043303447Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:23.043612 containerd[1869]: time="2025-09-03T23:26:23.043588616Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 579.837247ms"
Sep 3 23:26:23.043657 containerd[1869]: time="2025-09-03T23:26:23.043615065Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 3 23:26:23.044216 containerd[1869]: time="2025-09-03T23:26:23.044194636Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 3 23:26:23.640920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3299328653.mount: Deactivated successfully.
Sep 3 23:26:25.570439 containerd[1869]: time="2025-09-03T23:26:25.570382725Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:25.573432 containerd[1869]: time="2025-09-03T23:26:25.573399015Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465295"
Sep 3 23:26:25.703136 containerd[1869]: time="2025-09-03T23:26:25.703069809Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:25.707417 containerd[1869]: time="2025-09-03T23:26:25.707380632Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:25.707945 containerd[1869]: time="2025-09-03T23:26:25.707799791Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.66358181s"
Sep 3 23:26:25.707945 containerd[1869]: time="2025-09-03T23:26:25.707823584Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Sep 3 23:26:28.254712 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 3 23:26:28.255910 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:28.345659 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:28.346489 (kubelet)[2913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:26:28.371947 kubelet[2913]: E0903 23:26:28.371912 2913 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:26:28.373863 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:26:28.373958 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:26:28.374191 systemd[1]: kubelet.service: Consumed 98ms CPU time, 106.7M memory peak.
Sep 3 23:26:29.643991 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:29.644411 systemd[1]: kubelet.service: Consumed 98ms CPU time, 106.7M memory peak.
Sep 3 23:26:29.646463 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:29.663848 systemd[1]: Reload requested from client PID 2927 ('systemctl') (unit session-9.scope)...
Sep 3 23:26:29.663859 systemd[1]: Reloading...
Sep 3 23:26:29.753563 zram_generator::config[2973]: No configuration found.
Sep 3 23:26:29.822540 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:26:29.904814 systemd[1]: Reloading finished in 240 ms.
Sep 3 23:26:29.946351 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 3 23:26:29.946410 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 3 23:26:29.946733 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:29.946776 systemd[1]: kubelet.service: Consumed 71ms CPU time, 95M memory peak.
Sep 3 23:26:29.948821 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:30.213500 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:30.220755 (kubelet)[3041]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 3 23:26:30.353029 kubelet[3041]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:30.353029 kubelet[3041]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 3 23:26:30.353029 kubelet[3041]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:30.353347 kubelet[3041]: I0903 23:26:30.353069 3041 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 3 23:26:30.811625 kubelet[3041]: I0903 23:26:30.811585 3041 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 3 23:26:30.811625 kubelet[3041]: I0903 23:26:30.811616 3041 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 3 23:26:30.811819 kubelet[3041]: I0903 23:26:30.811788 3041 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 3 23:26:30.830487 kubelet[3041]: I0903 23:26:30.830092 3041 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 3 23:26:30.830592 kubelet[3041]: E0903 23:26:30.830490 3041 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.13:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 3 23:26:30.835525 kubelet[3041]: I0903 23:26:30.835509 3041 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 3 23:26:30.837824 kubelet[3041]: I0903 23:26:30.837808 3041 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 3 23:26:30.839122 kubelet[3041]: I0903 23:26:30.839096 3041 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 3 23:26:30.839309 kubelet[3041]: I0903 23:26:30.839194 3041 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-989a023a05","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 3 23:26:30.839625 kubelet[3041]: I0903 23:26:30.839452 3041 topology_manager.go:138] "Creating topology manager with none policy"
Sep 3 23:26:30.839625 kubelet[3041]: I0903 23:26:30.839467 3041 container_manager_linux.go:303] "Creating device plugin manager"
Sep 3 23:26:30.840209 kubelet[3041]: I0903 23:26:30.840158 3041 state_mem.go:36] "Initialized new in-memory state store"
Sep 3 23:26:30.843729 kubelet[3041]: I0903 23:26:30.843645 3041 kubelet.go:480] "Attempting to sync node with API server"
Sep 3 23:26:30.843729 kubelet[3041]: I0903 23:26:30.843666 3041 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 3 23:26:30.843729 kubelet[3041]: I0903 23:26:30.843687 3041 kubelet.go:386] "Adding apiserver pod source"
Sep 3 23:26:30.844702 kubelet[3041]: I0903 23:26:30.844640 3041 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 3 23:26:30.845538 kubelet[3041]: I0903 23:26:30.845261 3041 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 3 23:26:30.845632 kubelet[3041]: I0903 23:26:30.845616 3041 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 3 23:26:30.845672 kubelet[3041]: W0903 23:26:30.845661 3041 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 3 23:26:30.847256 kubelet[3041]: I0903 23:26:30.847216 3041 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 3 23:26:30.847256 kubelet[3041]: I0903 23:26:30.847247 3041 server.go:1289] "Started kubelet"
Sep 3 23:26:30.847406 kubelet[3041]: E0903 23:26:30.847382 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.13:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-989a023a05&limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 3 23:26:30.851477 kubelet[3041]: E0903 23:26:30.851176 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.13:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 3 23:26:30.853724 kubelet[3041]: I0903 23:26:30.853679 3041 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 3 23:26:30.854052 kubelet[3041]: I0903 23:26:30.854024 3041 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 3 23:26:30.855089 kubelet[3041]: I0903 23:26:30.855072 3041 server.go:317] "Adding debug handlers to kubelet server"
Sep 3 23:26:30.856130 kubelet[3041]: I0903 23:26:30.856086 3041 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 3 23:26:30.856433 kubelet[3041]: I0903 23:26:30.856416 3041 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 3 23:26:30.857570 kubelet[3041]: I0903 23:26:30.857550 3041 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 3 23:26:30.859972 kubelet[3041]: I0903 23:26:30.859950 3041 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 3 23:26:30.860152 kubelet[3041]: E0903 23:26:30.860133 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:30.860623 kubelet[3041]: I0903 23:26:30.860601 3041 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 3 23:26:30.860680 kubelet[3041]: I0903 23:26:30.860648 3041 reconciler.go:26] "Reconciler: start to sync state"
Sep 3 23:26:30.861517 kubelet[3041]: E0903 23:26:30.861072 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.13:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 3 23:26:30.861517 kubelet[3041]: E0903 23:26:30.861123 3041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-989a023a05?timeout=10s\": dial tcp 10.200.20.13:6443: connect: connection refused" interval="200ms"
Sep 3 23:26:30.862746 kubelet[3041]: E0903 23:26:30.862511 3041 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 3 23:26:30.862746 kubelet[3041]: I0903 23:26:30.862667 3041 factory.go:223] Registration of the containerd container factory successfully
Sep 3 23:26:30.862746 kubelet[3041]: I0903 23:26:30.862678 3041 factory.go:223] Registration of the systemd container factory successfully
Sep 3 23:26:30.862919 kubelet[3041]: I0903 23:26:30.862892 3041 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 3 23:26:30.864220 kubelet[3041]: E0903 23:26:30.863311 3041 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.13:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.13:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-n-989a023a05.1861e96e976aa99f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-n-989a023a05,UID:ci-4372.1.0-n-989a023a05,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-n-989a023a05,},FirstTimestamp:2025-09-03 23:26:30.847228319 +0000 UTC m=+0.623600679,LastTimestamp:2025-09-03 23:26:30.847228319 +0000 UTC m=+0.623600679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-n-989a023a05,}"
Sep 3 23:26:30.881912 kubelet[3041]: I0903 23:26:30.881898 3041 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 3 23:26:30.882074 kubelet[3041]: I0903 23:26:30.882053 3041 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 3 23:26:30.882203 kubelet[3041]: I0903 23:26:30.882138 3041 state_mem.go:36] "Initialized new in-memory state store"
Sep 3 23:26:30.888616 kubelet[3041]: I0903 23:26:30.888552 3041 policy_none.go:49] "None policy: Start"
Sep 3 23:26:30.888712 kubelet[3041]: I0903 23:26:30.888699 3041 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 3 23:26:30.888766 kubelet[3041]: I0903 23:26:30.888758 3041 state_mem.go:35] "Initializing new in-memory state store"
Sep 3 23:26:30.898305 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 3 23:26:30.909221 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 3 23:26:30.911968 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 3 23:26:30.920329 kubelet[3041]: E0903 23:26:30.920217 3041 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 3 23:26:30.921198 kubelet[3041]: I0903 23:26:30.920595 3041 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 3 23:26:30.921198 kubelet[3041]: I0903 23:26:30.920616 3041 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 3 23:26:30.921198 kubelet[3041]: I0903 23:26:30.920976 3041 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 3 23:26:30.922667 kubelet[3041]: E0903 23:26:30.922644 3041 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 3 23:26:30.922715 kubelet[3041]: E0903 23:26:30.922685 3041 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:31.014364 kubelet[3041]: I0903 23:26:31.014336 3041 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 3 23:26:31.015678 kubelet[3041]: I0903 23:26:31.015662 3041 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 3 23:26:31.015759 kubelet[3041]: I0903 23:26:31.015751 3041 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 3 23:26:31.015839 kubelet[3041]: I0903 23:26:31.015832 3041 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 3 23:26:31.015908 kubelet[3041]: I0903 23:26:31.015901 3041 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 3 23:26:31.016012 kubelet[3041]: E0903 23:26:31.016002 3041 kubelet.go:2460] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Sep 3 23:26:31.017338 kubelet[3041]: E0903 23:26:31.017319 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.13:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Sep 3 23:26:31.022013 kubelet[3041]: I0903 23:26:31.021987 3041 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:31.022282 kubelet[3041]: E0903 23:26:31.022252 3041 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.13:6443/api/v1/nodes\": dial tcp 10.200.20.13:6443: connect: connection refused" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:31.061755 kubelet[3041]: E0903 23:26:31.061672 3041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-989a023a05?timeout=10s\": dial tcp 10.200.20.13:6443: connect: connection refused" interval="400ms"
Sep 3 23:26:31.161645 kubelet[3041]: I0903 23:26:31.161603 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName:
\"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:31.161645 kubelet[3041]: I0903 23:26:31.161642 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:31.161645 kubelet[3041]: I0903 23:26:31.161654 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:31.224061 kubelet[3041]: I0903 23:26:31.224008 3041 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05" Sep 3 23:26:31.224511 kubelet[3041]: E0903 23:26:31.224488 3041 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.13:6443/api/v1/nodes\": dial tcp 10.200.20.13:6443: connect: connection refused" node="ci-4372.1.0-n-989a023a05" Sep 3 23:26:31.462324 kubelet[3041]: E0903 23:26:31.462206 3041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-989a023a05?timeout=10s\": dial tcp 10.200.20.13:6443: connect: connection refused" interval="800ms" Sep 3 23:26:31.626552 kubelet[3041]: I0903 23:26:31.626286 3041 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05" 
Sep 3 23:26:31.626665 kubelet[3041]: E0903 23:26:31.626610 3041 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.13:6443/api/v1/nodes\": dial tcp 10.200.20.13:6443: connect: connection refused" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:31.909993 kubelet[3041]: E0903 23:26:31.909945 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.13:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-989a023a05&limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 3 23:26:32.204960 kubelet[3041]: E0903 23:26:32.075254 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.13:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Sep 3 23:26:32.211841 systemd[1]: Created slice kubepods-burstable-podfb4df84dfd93cd529adf3d9682f4ed8e.slice - libcontainer container kubepods-burstable-podfb4df84dfd93cd529adf3d9682f4ed8e.slice.
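The "Failed to ensure lease exists, will retry" entries above back off with doubling intervals: 200ms, 400ms, 800ms, and later 1.6s. A minimal sketch of that capped doubling schedule (illustrative only; the actual retry policy lives in client-go and is not reproduced here, and the `cap` default is an assumption):

```python
def backoff_intervals(base=0.2, factor=2.0, cap=7.0, retries=5):
    """Yield capped, doubling retry delays (in seconds).

    Mirrors the lease-controller retry intervals visible in the log
    (200ms, 400ms, 800ms, 1.6s). Illustrative sketch only.
    """
    delay = base
    for _ in range(retries):
        yield min(delay, cap)  # never wait longer than the cap
        delay *= factor        # double the delay for the next attempt
```

Each failed attempt roughly doubles the wait, which keeps a node that cannot reach the apiserver from hammering it while still reconnecting quickly once it comes up.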
Sep 3 23:26:32.218179 kubelet[3041]: E0903 23:26:32.218125 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.219060 containerd[1869]: time="2025-09-03T23:26:32.218795268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-989a023a05,Uid:fb4df84dfd93cd529adf3d9682f4ed8e,Namespace:kube-system,Attempt:0,}"
Sep 3 23:26:32.220362 kubelet[3041]: E0903 23:26:32.220195 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.13:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 3 23:26:32.224307 systemd[1]: Created slice kubepods-burstable-pod5944b36be7c0d54ec8162df5a785d5b1.slice - libcontainer container kubepods-burstable-pod5944b36be7c0d54ec8162df5a785d5b1.slice.
Sep 3 23:26:32.226978 kubelet[3041]: E0903 23:26:32.226957 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.229772 systemd[1]: Created slice kubepods-burstable-pod201996bf328543a979ecb40794119953.slice - libcontainer container kubepods-burstable-pod201996bf328543a979ecb40794119953.slice.
Sep 3 23:26:32.231159 kubelet[3041]: E0903 23:26:32.231135 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.263600 kubelet[3041]: E0903 23:26:32.263567 3041 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.13:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-989a023a05?timeout=10s\": dial tcp 10.200.20.13:6443: connect: connection refused" interval="1.6s"
Sep 3 23:26:32.268863 kubelet[3041]: I0903 23:26:32.268666 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.268863 kubelet[3041]: I0903 23:26:32.268695 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.268863 kubelet[3041]: I0903 23:26:32.268707 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.268863 kubelet[3041]: I0903 23:26:32.268715 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.268863 kubelet[3041]: I0903 23:26:32.268725 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.268984 kubelet[3041]: I0903 23:26:32.268741 3041 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/201996bf328543a979ecb40794119953-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-989a023a05\" (UID: \"201996bf328543a979ecb40794119953\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.272936 containerd[1869]: time="2025-09-03T23:26:32.272888661Z" level=info msg="connecting to shim e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97" address="unix:///run/containerd/s/52c4063801f05ab55e91e466b3d4526ba9fa762ba0c6df09bb00e4ec16cdbde5" namespace=k8s.io protocol=ttrpc version=3
Sep 3 23:26:32.292655 systemd[1]: Started cri-containerd-e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97.scope - libcontainer container e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97.
Sep 3 23:26:32.326982 containerd[1869]: time="2025-09-03T23:26:32.326945278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-989a023a05,Uid:fb4df84dfd93cd529adf3d9682f4ed8e,Namespace:kube-system,Attempt:0,} returns sandbox id \"e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97\""
Sep 3 23:26:32.334421 containerd[1869]: time="2025-09-03T23:26:32.334392077Z" level=info msg="CreateContainer within sandbox \"e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 3 23:26:32.359869 containerd[1869]: time="2025-09-03T23:26:32.358828439Z" level=info msg="Container 0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:26:32.372596 containerd[1869]: time="2025-09-03T23:26:32.372562559Z" level=info msg="CreateContainer within sandbox \"e20efd399702c47cf9b6d1c212195293bc497ba0f198e0998640343f44cdfc97\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856\""
Sep 3 23:26:32.373614 containerd[1869]: time="2025-09-03T23:26:32.373593625Z" level=info msg="StartContainer for \"0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856\""
Sep 3 23:26:32.374422 containerd[1869]: time="2025-09-03T23:26:32.374380467Z" level=info msg="connecting to shim 0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856" address="unix:///run/containerd/s/52c4063801f05ab55e91e466b3d4526ba9fa762ba0c6df09bb00e4ec16cdbde5" protocol=ttrpc version=3
Sep 3 23:26:32.392647 systemd[1]: Started cri-containerd-0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856.scope - libcontainer container 0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856.
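The sandbox and container steps above all talk to the runtime over unix sockets: containerd's shim sockets under /run/containerd/s/ are present and connectable, while the earlier crio factory registration failed with "dial unix /var/run/crio/crio.sock: connect: no such file or directory". A small sketch that reproduces those two outcomes when probing a runtime socket path (an assumption-level illustration, not the actual cAdvisor/containerd probing code):

```python
import os
import socket
import stat

def runtime_socket_status(path):
    """Classify a container-runtime socket path.

    Mirrors the two cases in the log: a present, listening socket is
    "connectable"; a missing path yields "no such file or directory".
    Illustrative sketch only.
    """
    if not os.path.exists(path):
        return "no such file or directory"
    if not stat.S_ISSOCK(os.stat(path).st_mode):
        return "not a socket"
    client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        client.settimeout(1.0)
        client.connect(path)  # succeeds only if something is listening
        return "connectable"
    except OSError as exc:
        return exc.strerror or str(exc)
    finally:
        client.close()
```

The crio failure is harmless here: the kubelet probes every runtime factory it knows about, and only the runtimes actually installed (containerd and systemd cgroups on this host) register successfully.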
Sep 3 23:26:32.395703 kubelet[3041]: E0903 23:26:32.395665 3041 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.13:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.13:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 3 23:26:32.423558 containerd[1869]: time="2025-09-03T23:26:32.423311161Z" level=info msg="StartContainer for \"0ef0f3972c8cb070389d3db9fbfe9fdf85096c3d1e538f58ff755ad92f468856\" returns successfully"
Sep 3 23:26:32.428631 kubelet[3041]: I0903 23:26:32.428602 3041 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.428894 kubelet[3041]: E0903 23:26:32.428848 3041 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.13:6443/api/v1/nodes\": dial tcp 10.200.20.13:6443: connect: connection refused" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:32.527923 containerd[1869]: time="2025-09-03T23:26:32.527883085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-989a023a05,Uid:5944b36be7c0d54ec8162df5a785d5b1,Namespace:kube-system,Attempt:0,}"
Sep 3 23:26:32.533998 containerd[1869]: time="2025-09-03T23:26:32.533876348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-989a023a05,Uid:201996bf328543a979ecb40794119953,Namespace:kube-system,Attempt:0,}"
Sep 3 23:26:32.617418 containerd[1869]: time="2025-09-03T23:26:32.617385125Z" level=info msg="connecting to shim 0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb" address="unix:///run/containerd/s/0cca0fcd563fe49908deebd3c5d5a633f38de7ba9f03884d9907cdc45092eb9c" namespace=k8s.io protocol=ttrpc version=3
Sep 3 23:26:32.633675 systemd[1]: Started cri-containerd-0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb.scope - libcontainer container 0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb.
Sep 3 23:26:32.639058 containerd[1869]: time="2025-09-03T23:26:32.639025194Z" level=info msg="connecting to shim 34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e" address="unix:///run/containerd/s/35c07d2e6d4a42a07ba2679b2cf51b9bd192f9d7f71b6db5800b9198df6ebf0a" namespace=k8s.io protocol=ttrpc version=3
Sep 3 23:26:32.657825 systemd[1]: Started cri-containerd-34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e.scope - libcontainer container 34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e.
Sep 3 23:26:32.706377 containerd[1869]: time="2025-09-03T23:26:32.706323394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-989a023a05,Uid:201996bf328543a979ecb40794119953,Namespace:kube-system,Attempt:0,} returns sandbox id \"34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e\""
Sep 3 23:26:32.709486 containerd[1869]: time="2025-09-03T23:26:32.709461818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-989a023a05,Uid:5944b36be7c0d54ec8162df5a785d5b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb\""
Sep 3 23:26:32.715523 containerd[1869]: time="2025-09-03T23:26:32.715499834Z" level=info msg="CreateContainer within sandbox \"34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 3 23:26:32.719214 containerd[1869]: time="2025-09-03T23:26:32.719193429Z" level=info msg="CreateContainer within sandbox \"0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 3 23:26:32.746269 containerd[1869]: time="2025-09-03T23:26:32.742860885Z" level=info msg="Container fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:26:32.749587 containerd[1869]: time="2025-09-03T23:26:32.749561676Z" level=info msg="Container d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:26:32.762696 containerd[1869]: time="2025-09-03T23:26:32.762665846Z" level=info msg="CreateContainer within sandbox \"34e3024502a6182e5399b680028de687dff4c1dd0f7fd46523e960767905bf3e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076\""
Sep 3 23:26:32.763028 containerd[1869]: time="2025-09-03T23:26:32.763007921Z" level=info msg="StartContainer for \"fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076\""
Sep 3 23:26:32.763704 containerd[1869]: time="2025-09-03T23:26:32.763681040Z" level=info msg="connecting to shim fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076" address="unix:///run/containerd/s/35c07d2e6d4a42a07ba2679b2cf51b9bd192f9d7f71b6db5800b9198df6ebf0a" protocol=ttrpc version=3
Sep 3 23:26:32.774003 containerd[1869]: time="2025-09-03T23:26:32.773973973Z" level=info msg="CreateContainer within sandbox \"0ae91a65b9150f97a638b5836d5fa4eaa32aa65d2b285722def22d9cc63baafb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e\""
Sep 3 23:26:32.774628 containerd[1869]: time="2025-09-03T23:26:32.774606858Z" level=info msg="StartContainer for \"d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e\""
Sep 3 23:26:32.779524 containerd[1869]: time="2025-09-03T23:26:32.779460499Z" level=info msg="connecting to shim d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e" address="unix:///run/containerd/s/0cca0fcd563fe49908deebd3c5d5a633f38de7ba9f03884d9907cdc45092eb9c" protocol=ttrpc version=3
Sep 3 23:26:32.783694 systemd[1]: Started cri-containerd-fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076.scope - libcontainer container fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076.
Sep 3 23:26:32.800625 systemd[1]: Started cri-containerd-d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e.scope - libcontainer container d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e.
Sep 3 23:26:32.850884 containerd[1869]: time="2025-09-03T23:26:32.850851530Z" level=info msg="StartContainer for \"d6a64c450059755b53b7d7e8ae9d01f678eb3dbd8cbb811e52f47ef956528a3e\" returns successfully"
Sep 3 23:26:32.860232 containerd[1869]: time="2025-09-03T23:26:32.860195408Z" level=info msg="StartContainer for \"fc16c27e5a64694f55d84f1a6de0a8b01a8de1cd25a3a7f00721697d23abe076\" returns successfully"
Sep 3 23:26:33.028414 kubelet[3041]: E0903 23:26:33.028383 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:33.031817 kubelet[3041]: E0903 23:26:33.031733 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:33.033630 kubelet[3041]: E0903 23:26:33.033608 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:33.990852 kubelet[3041]: E0903 23:26:33.990730 3041 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:33.990852 kubelet[3041]: E0903 23:26:33.990735 3041 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4372.1.0-n-989a023a05" not found
Sep 3 23:26:34.032502 kubelet[3041]: I0903 23:26:34.032229 3041 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.035818 kubelet[3041]: E0903 23:26:34.035724 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.036194 kubelet[3041]: E0903 23:26:34.036086 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.048172 kubelet[3041]: I0903 23:26:34.048152 3041 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.048662 kubelet[3041]: E0903 23:26:34.048600 3041 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4372.1.0-n-989a023a05\": node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.097801 kubelet[3041]: E0903 23:26:34.097771 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.198069 kubelet[3041]: E0903 23:26:34.198029 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.298580 kubelet[3041]: E0903 23:26:34.298543 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.398158 kubelet[3041]: E0903 23:26:34.398133 3041 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-989a023a05\" not found" node="ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.399176 kubelet[3041]: E0903 23:26:34.399159 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.499658 kubelet[3041]: E0903 23:26:34.499625 3041 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-989a023a05\" not found"
Sep 3 23:26:34.561144 kubelet[3041]: I0903 23:26:34.560855 3041 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.564691 kubelet[3041]: E0903 23:26:34.564545 3041 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-n-989a023a05\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.564691 kubelet[3041]: I0903 23:26:34.564566 3041 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.567178 kubelet[3041]: E0903 23:26:34.567155 3041 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.567178 kubelet[3041]: I0903 23:26:34.567174 3041 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:34.589824 kubelet[3041]: I0903 23:26:34.589804 3041 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 3 23:26:34.852569 kubelet[3041]: I0903 23:26:34.851612 3041 apiserver.go:52] "Watching apiserver"
Sep 3 23:26:34.861754 kubelet[3041]: I0903 23:26:34.861709 3041 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 3 23:26:35.035775 kubelet[3041]: I0903 23:26:35.035577 3041 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:35.042719 kubelet[3041]: I0903 23:26:35.042526 3041 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 3 23:26:35.042719 kubelet[3041]: E0903 23:26:35.042584 3041 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-n-989a023a05\" already exists" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05"
Sep 3 23:26:35.801920 systemd[1]: Reload requested from client PID 3316 ('systemctl') (unit session-9.scope)...
Sep 3 23:26:35.801932 systemd[1]: Reloading...
Sep 3 23:26:35.900630 zram_generator::config[3368]: No configuration found.
Sep 3 23:26:35.990487 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:26:36.084917 systemd[1]: Reloading finished in 282 ms.
Sep 3 23:26:36.112721 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:36.127111 systemd[1]: kubelet.service: Deactivated successfully.
Sep 3 23:26:36.127268 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:36.127301 systemd[1]: kubelet.service: Consumed 746ms CPU time, 124.7M memory peak.
Sep 3 23:26:36.130757 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:36.234373 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:36.237977 (kubelet)[3426]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 3 23:26:36.267139 kubelet[3426]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:36.267139 kubelet[3426]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 3 23:26:36.267139 kubelet[3426]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:36.267426 kubelet[3426]: I0903 23:26:36.267260 3426 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 3 23:26:36.272049 kubelet[3426]: I0903 23:26:36.272022 3426 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 3 23:26:36.272049 kubelet[3426]: I0903 23:26:36.272045 3426 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 3 23:26:36.272194 kubelet[3426]: I0903 23:26:36.272177 3426 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 3 23:26:36.273128 kubelet[3426]: I0903 23:26:36.273109 3426 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Sep 3 23:26:36.361760 kubelet[3426]: I0903 23:26:36.361315 3426 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 3 23:26:36.367385 kubelet[3426]: I0903 23:26:36.367363 3426 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 3 23:26:36.370197 kubelet[3426]: I0903 23:26:36.370173 3426 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 3 23:26:36.371110 kubelet[3426]: I0903 23:26:36.371075 3426 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 3 23:26:36.371279 kubelet[3426]: I0903 23:26:36.371108 3426 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-989a023a05","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 3 23:26:36.371360 kubelet[3426]: I0903 23:26:36.371287 3426 topology_manager.go:138] "Creating topology manager with none policy"
Sep 3 23:26:36.371360 kubelet[3426]: I0903 23:26:36.371294 3426 container_manager_linux.go:303] "Creating device plugin manager"
Sep 3 23:26:36.371360 kubelet[3426]: I0903 23:26:36.371332 3426 state_mem.go:36] "Initialized new in-memory state store"
Sep 3 23:26:36.371455 kubelet[3426]: I0903 23:26:36.371439 3426 kubelet.go:480] "Attempting to sync node with API server"
Sep 3 23:26:36.371455 kubelet[3426]: I0903 23:26:36.371454 3426 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 3 23:26:36.371509 kubelet[3426]: I0903 23:26:36.371470 3426 kubelet.go:386] "Adding apiserver pod source"
Sep 3 23:26:36.371509 kubelet[3426]: I0903 23:26:36.371478 3426 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 3 23:26:36.379651 kubelet[3426]: I0903 23:26:36.379631 3426 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 3 23:26:36.380025 kubelet[3426]: I0903 23:26:36.380010 3426 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 3 23:26:36.387149 kubelet[3426]: I0903 23:26:36.387063 3426 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 3 23:26:36.387149 kubelet[3426]: I0903 23:26:36.387098 3426 server.go:1289] "Started kubelet"
Sep 3 23:26:36.390095 kubelet[3426]: I0903 23:26:36.389336 3426 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 3 23:26:36.390494 kubelet[3426]: I0903 23:26:36.390275 3426 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 3 23:26:36.392693 kubelet[3426]: I0903 23:26:36.392679 3426 server.go:317] "Adding debug handlers to kubelet server"
Sep 3 23:26:36.395737 kubelet[3426]: I0903 23:26:36.395682 3426 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 3 23:26:36.395873 kubelet[3426]: I0903 23:26:36.395855 3426 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 3 23:26:36.400018 kubelet[3426]: I0903 23:26:36.399951 3426 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 3 23:26:36.401574 kubelet[3426]: I0903 23:26:36.400427 3426 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 3 23:26:36.402056 kubelet[3426]: I0903 23:26:36.402037 3426 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 3 23:26:36.402436 kubelet[3426]: I0903 23:26:36.402420 3426 reconciler.go:26] "Reconciler: start to sync state"
Sep 3 23:26:36.405280 kubelet[3426]: I0903 23:26:36.405256 3426 factory.go:223] Registration of the systemd container factory successfully
Sep 3 23:26:36.405362 kubelet[3426]: I0903 23:26:36.405340 3426 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 3 23:26:36.409005 kubelet[3426]: E0903 23:26:36.408784 3426 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 3 23:26:36.409926 kubelet[3426]: I0903 23:26:36.409610 3426 factory.go:223] Registration of the containerd container factory successfully
Sep 3 23:26:36.410308 kubelet[3426]: I0903 23:26:36.409507 3426 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 3 23:26:36.413755 kubelet[3426]: I0903 23:26:36.413565 3426 kubelet_network_linux.go:49] "Initialized iptables rules."
protocol="IPv6" Sep 3 23:26:36.413755 kubelet[3426]: I0903 23:26:36.413586 3426 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 3 23:26:36.413755 kubelet[3426]: I0903 23:26:36.413600 3426 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 3 23:26:36.413755 kubelet[3426]: I0903 23:26:36.413605 3426 kubelet.go:2436] "Starting kubelet main sync loop" Sep 3 23:26:36.413755 kubelet[3426]: E0903 23:26:36.413636 3426 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 3 23:26:36.442909 kubelet[3426]: I0903 23:26:36.442891 3426 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 3 23:26:36.443055 kubelet[3426]: I0903 23:26:36.443041 3426 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 3 23:26:36.443122 kubelet[3426]: I0903 23:26:36.443113 3426 state_mem.go:36] "Initialized new in-memory state store" Sep 3 23:26:36.443294 kubelet[3426]: I0903 23:26:36.443280 3426 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 3 23:26:36.443467 kubelet[3426]: I0903 23:26:36.443355 3426 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 3 23:26:36.443467 kubelet[3426]: I0903 23:26:36.443380 3426 policy_none.go:49] "None policy: Start" Sep 3 23:26:36.443467 kubelet[3426]: I0903 23:26:36.443388 3426 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 3 23:26:36.443467 kubelet[3426]: I0903 23:26:36.443399 3426 state_mem.go:35] "Initializing new in-memory state store" Sep 3 23:26:36.443647 kubelet[3426]: I0903 23:26:36.443632 3426 state_mem.go:75] "Updated machine memory state" Sep 3 23:26:36.446945 kubelet[3426]: E0903 23:26:36.446930 3426 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 3 23:26:36.447420 kubelet[3426]: I0903 23:26:36.447328 
3426 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 3 23:26:36.447420 kubelet[3426]: I0903 23:26:36.447349 3426 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 3 23:26:36.447727 kubelet[3426]: I0903 23:26:36.447713 3426 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 3 23:26:36.448931 kubelet[3426]: E0903 23:26:36.448893 3426 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 3 23:26:36.514854 kubelet[3426]: I0903 23:26:36.514802 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.515049 kubelet[3426]: I0903 23:26:36.515034 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.515510 kubelet[3426]: I0903 23:26:36.515220 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.524058 kubelet[3426]: I0903 23:26:36.523810 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 23:26:36.529667 kubelet[3426]: I0903 23:26:36.529649 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 23:26:36.530107 kubelet[3426]: I0903 23:26:36.530093 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 23:26:36.530332 kubelet[3426]: E0903 23:26:36.530304 3426 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-scheduler-ci-4372.1.0-n-989a023a05\" already exists" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.555679 kubelet[3426]: I0903 23:26:36.555660 3426 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.575414 kubelet[3426]: I0903 23:26:36.575344 3426 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.575648 kubelet[3426]: I0903 23:26:36.575635 3426 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606731 kubelet[3426]: I0903 23:26:36.606703 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606731 kubelet[3426]: I0903 23:26:36.606732 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606827 kubelet[3426]: I0903 23:26:36.606745 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606827 kubelet[3426]: I0903 23:26:36.606755 3426 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606827 kubelet[3426]: I0903 23:26:36.606776 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606827 kubelet[3426]: I0903 23:26:36.606786 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/201996bf328543a979ecb40794119953-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-989a023a05\" (UID: \"201996bf328543a979ecb40794119953\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606827 kubelet[3426]: I0903 23:26:36.606796 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:36.606914 kubelet[3426]: I0903 23:26:36.606804 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb4df84dfd93cd529adf3d9682f4ed8e-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-989a023a05\" (UID: \"fb4df84dfd93cd529adf3d9682f4ed8e\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" 
Sep 3 23:26:36.606914 kubelet[3426]: I0903 23:26:36.606815 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5944b36be7c0d54ec8162df5a785d5b1-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" (UID: \"5944b36be7c0d54ec8162df5a785d5b1\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.377356 kubelet[3426]: I0903 23:26:37.377058 3426 apiserver.go:52] "Watching apiserver" Sep 3 23:26:37.402722 kubelet[3426]: I0903 23:26:37.402697 3426 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 3 23:26:37.430672 kubelet[3426]: I0903 23:26:37.430650 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.430921 kubelet[3426]: I0903 23:26:37.430901 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.431118 kubelet[3426]: I0903 23:26:37.431099 3426 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.443232 kubelet[3426]: I0903 23:26:37.443145 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 23:26:37.443546 kubelet[3426]: E0903 23:26:37.443447 3426 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-n-989a023a05\" already exists" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.444328 kubelet[3426]: I0903 23:26:37.443791 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 
23:26:37.444328 kubelet[3426]: E0903 23:26:37.444117 3426 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-n-989a023a05\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.444658 kubelet[3426]: I0903 23:26:37.444639 3426 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 3 23:26:37.444764 kubelet[3426]: E0903 23:26:37.444751 3426 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-n-989a023a05\" already exists" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" Sep 3 23:26:37.449385 kubelet[3426]: I0903 23:26:37.448723 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-n-989a023a05" podStartSLOduration=1.448706049 podStartE2EDuration="1.448706049s" podCreationTimestamp="2025-09-03 23:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:37.448034002 +0000 UTC m=+1.206548641" watchObservedRunningTime="2025-09-03 23:26:37.448706049 +0000 UTC m=+1.207220696" Sep 3 23:26:37.469638 kubelet[3426]: I0903 23:26:37.469468 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-n-989a023a05" podStartSLOduration=3.469456617 podStartE2EDuration="3.469456617s" podCreationTimestamp="2025-09-03 23:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:37.459874203 +0000 UTC m=+1.218388850" watchObservedRunningTime="2025-09-03 23:26:37.469456617 +0000 UTC m=+1.227971264" Sep 3 23:26:40.677594 kubelet[3426]: I0903 23:26:40.677514 3426 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" 
CIDR="192.168.0.0/24" Sep 3 23:26:40.678065 containerd[1869]: time="2025-09-03T23:26:40.677831742Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 3 23:26:40.678511 kubelet[3426]: I0903 23:26:40.678307 3426 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 3 23:26:41.279453 kubelet[3426]: I0903 23:26:41.279398 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-989a023a05" podStartSLOduration=5.279381776 podStartE2EDuration="5.279381776s" podCreationTimestamp="2025-09-03 23:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:37.469447033 +0000 UTC m=+1.227961696" watchObservedRunningTime="2025-09-03 23:26:41.279381776 +0000 UTC m=+5.037896415" Sep 3 23:26:41.296791 systemd[1]: Created slice kubepods-besteffort-pod596db41c_3183_41f3_b6e4_694c87c6d512.slice - libcontainer container kubepods-besteffort-pod596db41c_3183_41f3_b6e4_694c87c6d512.slice. 
Sep 3 23:26:41.332938 kubelet[3426]: I0903 23:26:41.332903 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/596db41c-3183-41f3-b6e4-694c87c6d512-xtables-lock\") pod \"kube-proxy-sm924\" (UID: \"596db41c-3183-41f3-b6e4-694c87c6d512\") " pod="kube-system/kube-proxy-sm924" Sep 3 23:26:41.332938 kubelet[3426]: I0903 23:26:41.332937 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/596db41c-3183-41f3-b6e4-694c87c6d512-kube-proxy\") pod \"kube-proxy-sm924\" (UID: \"596db41c-3183-41f3-b6e4-694c87c6d512\") " pod="kube-system/kube-proxy-sm924" Sep 3 23:26:41.333048 kubelet[3426]: I0903 23:26:41.332947 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/596db41c-3183-41f3-b6e4-694c87c6d512-lib-modules\") pod \"kube-proxy-sm924\" (UID: \"596db41c-3183-41f3-b6e4-694c87c6d512\") " pod="kube-system/kube-proxy-sm924" Sep 3 23:26:41.333048 kubelet[3426]: I0903 23:26:41.332958 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqczs\" (UniqueName: \"kubernetes.io/projected/596db41c-3183-41f3-b6e4-694c87c6d512-kube-api-access-dqczs\") pod \"kube-proxy-sm924\" (UID: \"596db41c-3183-41f3-b6e4-694c87c6d512\") " pod="kube-system/kube-proxy-sm924" Sep 3 23:26:41.610358 containerd[1869]: time="2025-09-03T23:26:41.610248853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sm924,Uid:596db41c-3183-41f3-b6e4-694c87c6d512,Namespace:kube-system,Attempt:0,}" Sep 3 23:26:41.665981 containerd[1869]: time="2025-09-03T23:26:41.665748167Z" level=info msg="connecting to shim 4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef" 
address="unix:///run/containerd/s/996511cd54ac3f07289f08fa6470b9ea5c0418ae5774eebd8d7c5fc5f8383155" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:41.682661 systemd[1]: Started cri-containerd-4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef.scope - libcontainer container 4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef. Sep 3 23:26:41.708982 containerd[1869]: time="2025-09-03T23:26:41.708870921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sm924,Uid:596db41c-3183-41f3-b6e4-694c87c6d512,Namespace:kube-system,Attempt:0,} returns sandbox id \"4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef\"" Sep 3 23:26:41.717123 containerd[1869]: time="2025-09-03T23:26:41.717097990Z" level=info msg="CreateContainer within sandbox \"4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 3 23:26:41.739549 containerd[1869]: time="2025-09-03T23:26:41.739453629Z" level=info msg="Container 692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:41.759487 containerd[1869]: time="2025-09-03T23:26:41.759448050Z" level=info msg="CreateContainer within sandbox \"4a42129646850d6dca860ac3b603d87bcba69ef46d76ce539083cbc170bc5fef\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f\"" Sep 3 23:26:41.760360 containerd[1869]: time="2025-09-03T23:26:41.760308794Z" level=info msg="StartContainer for \"692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f\"" Sep 3 23:26:41.762353 containerd[1869]: time="2025-09-03T23:26:41.762327042Z" level=info msg="connecting to shim 692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f" address="unix:///run/containerd/s/996511cd54ac3f07289f08fa6470b9ea5c0418ae5774eebd8d7c5fc5f8383155" protocol=ttrpc version=3 Sep 3 23:26:41.779658 
systemd[1]: Started cri-containerd-692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f.scope - libcontainer container 692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f. Sep 3 23:26:41.809261 containerd[1869]: time="2025-09-03T23:26:41.809223941Z" level=info msg="StartContainer for \"692fbfa7c9ecb1dc9bb3a58d5ee2d26db35ca8741169a2313ab1ec8079f0616f\" returns successfully" Sep 3 23:26:41.891634 systemd[1]: Created slice kubepods-besteffort-pod9c6c98ac_bca1_4248_9ad6_d3556e32e6cb.slice - libcontainer container kubepods-besteffort-pod9c6c98ac_bca1_4248_9ad6_d3556e32e6cb.slice. Sep 3 23:26:41.935392 kubelet[3426]: I0903 23:26:41.935359 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c6c98ac-bca1-4248-9ad6-d3556e32e6cb-var-lib-calico\") pod \"tigera-operator-755d956888-jsgrg\" (UID: \"9c6c98ac-bca1-4248-9ad6-d3556e32e6cb\") " pod="tigera-operator/tigera-operator-755d956888-jsgrg" Sep 3 23:26:41.935392 kubelet[3426]: I0903 23:26:41.935389 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbfd\" (UniqueName: \"kubernetes.io/projected/9c6c98ac-bca1-4248-9ad6-d3556e32e6cb-kube-api-access-swbfd\") pod \"tigera-operator-755d956888-jsgrg\" (UID: \"9c6c98ac-bca1-4248-9ad6-d3556e32e6cb\") " pod="tigera-operator/tigera-operator-755d956888-jsgrg" Sep 3 23:26:42.198476 containerd[1869]: time="2025-09-03T23:26:42.198131306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-jsgrg,Uid:9c6c98ac-bca1-4248-9ad6-d3556e32e6cb,Namespace:tigera-operator,Attempt:0,}" Sep 3 23:26:42.252337 containerd[1869]: time="2025-09-03T23:26:42.252306768Z" level=info msg="connecting to shim 92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5" address="unix:///run/containerd/s/aa3ee555a28979566b30bf1cdf3ea391e7f78ee5365789d204386a626e7ed519" 
namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:42.271652 systemd[1]: Started cri-containerd-92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5.scope - libcontainer container 92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5. Sep 3 23:26:42.304043 containerd[1869]: time="2025-09-03T23:26:42.304009033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-jsgrg,Uid:9c6c98ac-bca1-4248-9ad6-d3556e32e6cb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5\"" Sep 3 23:26:42.307567 containerd[1869]: time="2025-09-03T23:26:42.306451045Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 3 23:26:42.638132 kubelet[3426]: I0903 23:26:42.637521 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-sm924" podStartSLOduration=1.637507158 podStartE2EDuration="1.637507158s" podCreationTimestamp="2025-09-03 23:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:42.451949635 +0000 UTC m=+6.210464274" watchObservedRunningTime="2025-09-03 23:26:42.637507158 +0000 UTC m=+6.396021797" Sep 3 23:26:43.833568 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1628922859.mount: Deactivated successfully. 
Sep 3 23:26:44.158658 containerd[1869]: time="2025-09-03T23:26:44.157963367Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:44.161092 containerd[1869]: time="2025-09-03T23:26:44.161070182Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 3 23:26:44.165687 containerd[1869]: time="2025-09-03T23:26:44.165654285Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:44.169559 containerd[1869]: time="2025-09-03T23:26:44.169500577Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:44.170141 containerd[1869]: time="2025-09-03T23:26:44.169854858Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.863376901s" Sep 3 23:26:44.170141 containerd[1869]: time="2025-09-03T23:26:44.169880603Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 3 23:26:44.177008 containerd[1869]: time="2025-09-03T23:26:44.176980297Z" level=info msg="CreateContainer within sandbox \"92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 3 23:26:44.202624 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2934344460.mount: Deactivated successfully. 
Sep 3 23:26:44.203284 containerd[1869]: time="2025-09-03T23:26:44.203251053Z" level=info msg="Container fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:44.216269 containerd[1869]: time="2025-09-03T23:26:44.216240775Z" level=info msg="CreateContainer within sandbox \"92839bf8e6f131061832c03837a2afc3563cf36704af63b141c3be616ddec3a5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef\"" Sep 3 23:26:44.216673 containerd[1869]: time="2025-09-03T23:26:44.216645962Z" level=info msg="StartContainer for \"fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef\"" Sep 3 23:26:44.218084 containerd[1869]: time="2025-09-03T23:26:44.217412768Z" level=info msg="connecting to shim fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef" address="unix:///run/containerd/s/aa3ee555a28979566b30bf1cdf3ea391e7f78ee5365789d204386a626e7ed519" protocol=ttrpc version=3 Sep 3 23:26:44.233641 systemd[1]: Started cri-containerd-fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef.scope - libcontainer container fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef. 
Sep 3 23:26:44.256153 containerd[1869]: time="2025-09-03T23:26:44.256130263Z" level=info msg="StartContainer for \"fcdc3b6f9234d7bc3b87d5cc09e247e55e972590fff2a23bee1946ee399c4eef\" returns successfully" Sep 3 23:26:44.483311 kubelet[3426]: I0903 23:26:44.482813 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-jsgrg" podStartSLOduration=1.617479968 podStartE2EDuration="3.482797619s" podCreationTimestamp="2025-09-03 23:26:41 +0000 UTC" firstStartedPulling="2025-09-03 23:26:42.305378535 +0000 UTC m=+6.063893174" lastFinishedPulling="2025-09-03 23:26:44.170696186 +0000 UTC m=+7.929210825" observedRunningTime="2025-09-03 23:26:44.482631758 +0000 UTC m=+8.241146405" watchObservedRunningTime="2025-09-03 23:26:44.482797619 +0000 UTC m=+8.241312258" Sep 3 23:26:49.307023 sudo[2355]: pam_unix(sudo:session): session closed for user root Sep 3 23:26:49.395670 sshd[2354]: Connection closed by 10.200.16.10 port 52000 Sep 3 23:26:49.396193 sshd-session[2352]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:49.401420 systemd[1]: sshd@6-10.200.20.13:22-10.200.16.10:52000.service: Deactivated successfully. Sep 3 23:26:49.402706 systemd-logind[1851]: Session 9 logged out. Waiting for processes to exit. Sep 3 23:26:49.405041 systemd[1]: session-9.scope: Deactivated successfully. Sep 3 23:26:49.405202 systemd[1]: session-9.scope: Consumed 4.598s CPU time, 232.8M memory peak. Sep 3 23:26:49.407833 systemd-logind[1851]: Removed session 9. Sep 3 23:26:54.079730 systemd[1]: Created slice kubepods-besteffort-podf3f07b80_df84_46bd_8e6a_597b65b3d7cf.slice - libcontainer container kubepods-besteffort-podf3f07b80_df84_46bd_8e6a_597b65b3d7cf.slice. 
Sep 3 23:26:54.104642 kubelet[3426]: I0903 23:26:54.104601 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f3f07b80-df84-46bd-8e6a-597b65b3d7cf-typha-certs\") pod \"calico-typha-74b5cc6c54-ks59g\" (UID: \"f3f07b80-df84-46bd-8e6a-597b65b3d7cf\") " pod="calico-system/calico-typha-74b5cc6c54-ks59g" Sep 3 23:26:54.104906 kubelet[3426]: I0903 23:26:54.104668 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3f07b80-df84-46bd-8e6a-597b65b3d7cf-tigera-ca-bundle\") pod \"calico-typha-74b5cc6c54-ks59g\" (UID: \"f3f07b80-df84-46bd-8e6a-597b65b3d7cf\") " pod="calico-system/calico-typha-74b5cc6c54-ks59g" Sep 3 23:26:54.104906 kubelet[3426]: I0903 23:26:54.104685 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h45bp\" (UniqueName: \"kubernetes.io/projected/f3f07b80-df84-46bd-8e6a-597b65b3d7cf-kube-api-access-h45bp\") pod \"calico-typha-74b5cc6c54-ks59g\" (UID: \"f3f07b80-df84-46bd-8e6a-597b65b3d7cf\") " pod="calico-system/calico-typha-74b5cc6c54-ks59g" Sep 3 23:26:54.200760 systemd[1]: Created slice kubepods-besteffort-pode741f3ce_3bef_47fd_90ee_65317942f679.slice - libcontainer container kubepods-besteffort-pode741f3ce_3bef_47fd_90ee_65317942f679.slice. 
Sep 3 23:26:54.207250 kubelet[3426]: I0903 23:26:54.207031 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-var-run-calico\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207250 kubelet[3426]: I0903 23:26:54.207064 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-cni-net-dir\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207250 kubelet[3426]: I0903 23:26:54.207076 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-flexvol-driver-host\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207250 kubelet[3426]: I0903 23:26:54.207098 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-cni-bin-dir\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207250 kubelet[3426]: I0903 23:26:54.207107 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-lib-modules\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207407 kubelet[3426]: I0903 23:26:54.207117 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-var-lib-calico\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207407 kubelet[3426]: I0903 23:26:54.207125 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-xtables-lock\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207407 kubelet[3426]: I0903 23:26:54.207141 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-cni-log-dir\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207407 kubelet[3426]: I0903 23:26:54.207150 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e741f3ce-3bef-47fd-90ee-65317942f679-node-certs\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207407 kubelet[3426]: I0903 23:26:54.207158 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48lg\" (UniqueName: \"kubernetes.io/projected/e741f3ce-3bef-47fd-90ee-65317942f679-kube-api-access-n48lg\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207481 kubelet[3426]: I0903 23:26:54.207174 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e741f3ce-3bef-47fd-90ee-65317942f679-policysync\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.207481 kubelet[3426]: I0903 23:26:54.207185 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e741f3ce-3bef-47fd-90ee-65317942f679-tigera-ca-bundle\") pod \"calico-node-g2jc7\" (UID: \"e741f3ce-3bef-47fd-90ee-65317942f679\") " pod="calico-system/calico-node-g2jc7"
Sep 3 23:26:54.308545 kubelet[3426]: E0903 23:26:54.308415 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 3 23:26:54.308545 kubelet[3426]: W0903 23:26:54.308438 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 3 23:26:54.308545 kubelet[3426]: E0903 23:26:54.308461 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 3 23:26:54.357146 kubelet[3426]: E0903 23:26:54.357034 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226"
Sep 3 23:26:54.382992 containerd[1869]: time="2025-09-03T23:26:54.382883935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-74b5cc6c54-ks59g,Uid:f3f07b80-df84-46bd-8e6a-597b65b3d7cf,Namespace:calico-system,Attempt:0,}"
Sep 3 23:26:54.410075 kubelet[3426]: I0903 23:26:54.410056 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3257a910-0dfb-493a-b923-e6215d245226-kubelet-dir\") pod \"csi-node-driver-6x5gb\" (UID: \"3257a910-0dfb-493a-b923-e6215d245226\") " pod="calico-system/csi-node-driver-6x5gb"
Sep 3 23:26:54.410589 kubelet[3426]: I0903 23:26:54.410576 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3257a910-0dfb-493a-b923-e6215d245226-registration-dir\") pod \"csi-node-driver-6x5gb\" (UID: \"3257a910-0dfb-493a-b923-e6215d245226\") " pod="calico-system/csi-node-driver-6x5gb"
Sep 3 23:26:54.410824 kubelet[3426]: I0903 23:26:54.410798 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3257a910-0dfb-493a-b923-e6215d245226-socket-dir\") pod \"csi-node-driver-6x5gb\" (UID: \"3257a910-0dfb-493a-b923-e6215d245226\") " pod="calico-system/csi-node-driver-6x5gb"
Sep 3 23:26:54.411006 kubelet[3426]: I0903 23:26:54.410975 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3257a910-0dfb-493a-b923-e6215d245226-varrun\") pod \"csi-node-driver-6x5gb\" (UID: \"3257a910-0dfb-493a-b923-e6215d245226\") " pod="calico-system/csi-node-driver-6x5gb"
Sep 3 23:26:54.411215 kubelet[3426]: I0903 23:26:54.411182 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt9d\" (UniqueName: \"kubernetes.io/projected/3257a910-0dfb-493a-b923-e6215d245226-kube-api-access-lwt9d\") pod \"csi-node-driver-6x5gb\" (UID: \"3257a910-0dfb-493a-b923-e6215d245226\") " pod="calico-system/csi-node-driver-6x5gb"
Error: unexpected end of JSON input" Sep 3 23:26:54.411718 kubelet[3426]: E0903 23:26:54.411705 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.411718 kubelet[3426]: W0903 23:26:54.411714 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.411775 kubelet[3426]: E0903 23:26:54.411720 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.411831 kubelet[3426]: E0903 23:26:54.411821 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.411831 kubelet[3426]: W0903 23:26:54.411828 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.411880 kubelet[3426]: E0903 23:26:54.411834 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.411963 kubelet[3426]: E0903 23:26:54.411949 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.411963 kubelet[3426]: W0903 23:26:54.411958 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.412086 kubelet[3426]: E0903 23:26:54.411964 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.412167 kubelet[3426]: E0903 23:26:54.412152 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.412167 kubelet[3426]: W0903 23:26:54.412163 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.412205 kubelet[3426]: E0903 23:26:54.412172 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.412311 kubelet[3426]: E0903 23:26:54.412301 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.412311 kubelet[3426]: W0903 23:26:54.412309 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.412361 kubelet[3426]: E0903 23:26:54.412314 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.412416 kubelet[3426]: E0903 23:26:54.412407 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.412416 kubelet[3426]: W0903 23:26:54.412413 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.412455 kubelet[3426]: E0903 23:26:54.412418 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.505737 containerd[1869]: time="2025-09-03T23:26:54.504706810Z" level=info msg="connecting to shim cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72" address="unix:///run/containerd/s/fc776ee26ce7998e27529b9a73d4a279601402bcb981ee3fd1287c09ba5d9a1c" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:54.512006 kubelet[3426]: E0903 23:26:54.511984 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.512207 kubelet[3426]: W0903 23:26:54.512188 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.512823 kubelet[3426]: E0903 23:26:54.512719 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.513119 kubelet[3426]: E0903 23:26:54.513105 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.513334 kubelet[3426]: W0903 23:26:54.513190 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.513334 kubelet[3426]: E0903 23:26:54.513211 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.514065 kubelet[3426]: E0903 23:26:54.514002 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.514065 kubelet[3426]: W0903 23:26:54.514021 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.514065 kubelet[3426]: E0903 23:26:54.514032 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.514279 kubelet[3426]: E0903 23:26:54.514174 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.514279 kubelet[3426]: W0903 23:26:54.514181 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.514279 kubelet[3426]: E0903 23:26:54.514189 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.514333 containerd[1869]: time="2025-09-03T23:26:54.513682226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-g2jc7,Uid:e741f3ce-3bef-47fd-90ee-65317942f679,Namespace:calico-system,Attempt:0,}" Sep 3 23:26:54.514720 kubelet[3426]: E0903 23:26:54.514695 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.514720 kubelet[3426]: W0903 23:26:54.514708 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.514720 kubelet[3426]: E0903 23:26:54.514718 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.515103 kubelet[3426]: E0903 23:26:54.515081 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.515103 kubelet[3426]: W0903 23:26:54.515095 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.515103 kubelet[3426]: E0903 23:26:54.515106 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.515618 kubelet[3426]: E0903 23:26:54.515577 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.515618 kubelet[3426]: W0903 23:26:54.515607 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.515618 kubelet[3426]: E0903 23:26:54.515618 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.515972 kubelet[3426]: E0903 23:26:54.515950 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.515972 kubelet[3426]: W0903 23:26:54.515964 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.515972 kubelet[3426]: E0903 23:26:54.515975 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.516421 kubelet[3426]: E0903 23:26:54.516375 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.516421 kubelet[3426]: W0903 23:26:54.516390 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.516421 kubelet[3426]: E0903 23:26:54.516401 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.516910 kubelet[3426]: E0903 23:26:54.516873 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.516910 kubelet[3426]: W0903 23:26:54.516885 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.516910 kubelet[3426]: E0903 23:26:54.516895 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.517539 kubelet[3426]: E0903 23:26:54.517512 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.517685 kubelet[3426]: W0903 23:26:54.517662 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.517771 kubelet[3426]: E0903 23:26:54.517687 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.518713 kubelet[3426]: E0903 23:26:54.518692 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.518713 kubelet[3426]: W0903 23:26:54.518708 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.518841 kubelet[3426]: E0903 23:26:54.518719 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.518867 kubelet[3426]: E0903 23:26:54.518860 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.518882 kubelet[3426]: W0903 23:26:54.518867 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.518882 kubelet[3426]: E0903 23:26:54.518875 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.519122 kubelet[3426]: E0903 23:26:54.518995 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.519122 kubelet[3426]: W0903 23:26:54.519021 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.519122 kubelet[3426]: E0903 23:26:54.519032 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.519269 kubelet[3426]: E0903 23:26:54.519145 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.519269 kubelet[3426]: W0903 23:26:54.519151 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.519269 kubelet[3426]: E0903 23:26:54.519158 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.519269 kubelet[3426]: E0903 23:26:54.519266 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.519269 kubelet[3426]: W0903 23:26:54.519271 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.519353 kubelet[3426]: E0903 23:26:54.519279 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.520641 kubelet[3426]: E0903 23:26:54.520599 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.520641 kubelet[3426]: W0903 23:26:54.520637 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.520641 kubelet[3426]: E0903 23:26:54.520650 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.521197 kubelet[3426]: E0903 23:26:54.520951 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.521197 kubelet[3426]: W0903 23:26:54.520962 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.521197 kubelet[3426]: E0903 23:26:54.520973 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.521513 kubelet[3426]: E0903 23:26:54.521459 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.521753 kubelet[3426]: W0903 23:26:54.521561 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.521753 kubelet[3426]: E0903 23:26:54.521576 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.522092 kubelet[3426]: E0903 23:26:54.522052 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.522241 kubelet[3426]: W0903 23:26:54.522173 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.522241 kubelet[3426]: E0903 23:26:54.522189 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.522623 kubelet[3426]: E0903 23:26:54.522596 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.522741 kubelet[3426]: W0903 23:26:54.522631 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.522741 kubelet[3426]: E0903 23:26:54.522643 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.523247 kubelet[3426]: E0903 23:26:54.523223 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.523247 kubelet[3426]: W0903 23:26:54.523235 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.523247 kubelet[3426]: E0903 23:26:54.523246 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.523451 kubelet[3426]: E0903 23:26:54.523436 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.523451 kubelet[3426]: W0903 23:26:54.523450 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.523596 kubelet[3426]: E0903 23:26:54.523460 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.524506 kubelet[3426]: E0903 23:26:54.524478 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.524506 kubelet[3426]: W0903 23:26:54.524504 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.524734 kubelet[3426]: E0903 23:26:54.524515 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.524734 kubelet[3426]: E0903 23:26:54.524713 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.524734 kubelet[3426]: W0903 23:26:54.524721 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.524734 kubelet[3426]: E0903 23:26:54.524730 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.535670 systemd[1]: Started cri-containerd-cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72.scope - libcontainer container cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72. Sep 3 23:26:54.538988 kubelet[3426]: E0903 23:26:54.538952 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:54.538988 kubelet[3426]: W0903 23:26:54.538965 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:54.538988 kubelet[3426]: E0903 23:26:54.538976 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:54.579064 containerd[1869]: time="2025-09-03T23:26:54.578946087Z" level=info msg="connecting to shim 614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc" address="unix:///run/containerd/s/52bb0e325766c42473d7c40ca26f301cfe030fbb1e241ded0892e6ebc149fda5" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:54.586115 containerd[1869]: time="2025-09-03T23:26:54.586018417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-74b5cc6c54-ks59g,Uid:f3f07b80-df84-46bd-8e6a-597b65b3d7cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72\"" Sep 3 23:26:54.588079 containerd[1869]: time="2025-09-03T23:26:54.588051355Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 3 23:26:54.599836 systemd[1]: Started cri-containerd-614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc.scope - libcontainer container 614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc. Sep 3 23:26:54.632381 containerd[1869]: time="2025-09-03T23:26:54.632271045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-g2jc7,Uid:e741f3ce-3bef-47fd-90ee-65317942f679,Namespace:calico-system,Attempt:0,} returns sandbox id \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\"" Sep 3 23:26:55.841992 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3977968193.mount: Deactivated successfully. 
Sep 3 23:26:56.243994 containerd[1869]: time="2025-09-03T23:26:56.243846760Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.248591 containerd[1869]: time="2025-09-03T23:26:56.248474264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 3 23:26:56.251463 containerd[1869]: time="2025-09-03T23:26:56.251437775Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.257112 containerd[1869]: time="2025-09-03T23:26:56.256760883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.257112 containerd[1869]: time="2025-09-03T23:26:56.257020722Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.66867295s" Sep 3 23:26:56.257112 containerd[1869]: time="2025-09-03T23:26:56.257043691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 3 23:26:56.258277 containerd[1869]: time="2025-09-03T23:26:56.258257383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 3 23:26:56.271702 containerd[1869]: time="2025-09-03T23:26:56.271667575Z" level=info msg="CreateContainer within sandbox \"cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 3 23:26:56.290181 containerd[1869]: time="2025-09-03T23:26:56.289673839Z" level=info msg="Container 493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:56.303573 containerd[1869]: time="2025-09-03T23:26:56.303544437Z" level=info msg="CreateContainer within sandbox \"cadf8d2232a9487c7fb1f5e893c77faa2d6a3a1c01f7f12f3f4f76c7d60cbc72\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9\"" Sep 3 23:26:56.304080 containerd[1869]: time="2025-09-03T23:26:56.304055732Z" level=info msg="StartContainer for \"493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9\"" Sep 3 23:26:56.305256 containerd[1869]: time="2025-09-03T23:26:56.305211606Z" level=info msg="connecting to shim 493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9" address="unix:///run/containerd/s/fc776ee26ce7998e27529b9a73d4a279601402bcb981ee3fd1287c09ba5d9a1c" protocol=ttrpc version=3 Sep 3 23:26:56.319648 systemd[1]: Started cri-containerd-493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9.scope - libcontainer container 493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9. 
Sep 3 23:26:56.363139 containerd[1869]: time="2025-09-03T23:26:56.363108822Z" level=info msg="StartContainer for \"493d5403f8473f806a0f8753fbdeaf11435d0aa752fbb7c2032b19b511d2b5d9\" returns successfully"
Sep 3 23:26:56.415007 kubelet[3426]: E0903 23:26:56.414954 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226"
Sep 3 23:26:56.523963 kubelet[3426]: E0903 23:26:56.523857 3426 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 3 23:26:56.523963 kubelet[3426]: W0903 23:26:56.523896 3426 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 3 23:26:56.523963 kubelet[3426]: E0903 23:26:56.523918 3426 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 3 23:26:57.368565 containerd[1869]: time="2025-09-03T23:26:57.368365442Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:57.371292 containerd[1869]: time="2025-09-03T23:26:57.370738256Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 3 23:26:57.373662 containerd[1869]: time="2025-09-03T23:26:57.373612860Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:57.378830 containerd[1869]: time="2025-09-03T23:26:57.378760971Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:57.379687 containerd[1869]: time="2025-09-03T23:26:57.379583059Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.121216281s"
Sep 3 23:26:57.379687 containerd[1869]: time="2025-09-03T23:26:57.379609668Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 3 23:26:57.385616 containerd[1869]: time="2025-09-03T23:26:57.385587547Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 3 23:26:57.403553 containerd[1869]: time="2025-09-03T23:26:57.401539582Z" level=info msg="Container 727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:26:57.418895 containerd[1869]: time="2025-09-03T23:26:57.418862201Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\""
Sep 3 23:26:57.420035 containerd[1869]: time="2025-09-03T23:26:57.419995683Z" level=info msg="StartContainer for \"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\""
Sep 3 23:26:57.421806 containerd[1869]: time="2025-09-03T23:26:57.421704981Z" level=info msg="connecting to shim 727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab" address="unix:///run/containerd/s/52bb0e325766c42473d7c40ca26f301cfe030fbb1e241ded0892e6ebc149fda5" protocol=ttrpc version=3
Sep 3 23:26:57.439667 systemd[1]: Started cri-containerd-727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab.scope - libcontainer container 727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab.
Sep 3 23:26:57.475181 containerd[1869]: time="2025-09-03T23:26:57.475074352Z" level=info msg="StartContainer for \"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\" returns successfully"
Sep 3 23:26:57.475969 kubelet[3426]: I0903 23:26:57.475783 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:26:57.482652 systemd[1]: cri-containerd-727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab.scope: Deactivated successfully.
Sep 3 23:26:57.485186 containerd[1869]: time="2025-09-03T23:26:57.485155911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\" id:\"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\" pid:4106 exited_at:{seconds:1756942017 nanos:483918403}"
Sep 3 23:26:57.485186 containerd[1869]: time="2025-09-03T23:26:57.485170256Z" level=info msg="received exit event container_id:\"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\" id:\"727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab\" pid:4106 exited_at:{seconds:1756942017 nanos:483918403}"
Sep 3 23:26:57.499624 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-727b1c1e13fbd58c8be37f09a2e5d58652a85fb480dc2233d4519a4d51df32ab-rootfs.mount: Deactivated successfully.
Sep 3 23:26:58.415566 kubelet[3426]: E0903 23:26:58.414793 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226"
Sep 3 23:26:58.491629 kubelet[3426]: I0903 23:26:58.491568 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-74b5cc6c54-ks59g" podStartSLOduration=2.821145273 podStartE2EDuration="4.491554925s" podCreationTimestamp="2025-09-03 23:26:54 +0000 UTC" firstStartedPulling="2025-09-03 23:26:54.587214536 +0000 UTC m=+18.345729175" lastFinishedPulling="2025-09-03 23:26:56.25762418 +0000 UTC m=+20.016138827" observedRunningTime="2025-09-03 23:26:56.479245743 +0000 UTC m=+20.237760390" watchObservedRunningTime="2025-09-03 23:26:58.491554925 +0000 UTC m=+22.250069564"
Sep 3 23:26:59.481176 containerd[1869]: time="2025-09-03T23:26:59.481132775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 3 23:27:00.414432 kubelet[3426]: E0903 23:27:00.414037 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226"
Sep 3 23:27:01.670435 containerd[1869]: time="2025-09-03T23:27:01.669969447Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:01.675473 containerd[1869]: time="2025-09-03T23:27:01.675445368Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477"
Sep 3 23:27:01.678251 containerd[1869]: time="2025-09-03T23:27:01.678228377Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:01.681461 containerd[1869]: time="2025-09-03T23:27:01.681426847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:01.682157 containerd[1869]: time="2025-09-03T23:27:01.681923845Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.200759549s"
Sep 3 23:27:01.682157 containerd[1869]: time="2025-09-03T23:27:01.681946982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\""
Sep 3 23:27:01.688690 containerd[1869]: time="2025-09-03T23:27:01.688666179Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 3 23:27:01.724935 containerd[1869]: time="2025-09-03T23:27:01.724022246Z" level=info msg="Container 3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:27:01.740183 containerd[1869]: time="2025-09-03T23:27:01.740094309Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\""
Sep 3 23:27:01.741689 containerd[1869]: time="2025-09-03T23:27:01.740689055Z" level=info msg="StartContainer for \"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\""
Sep 3 23:27:01.741689 containerd[1869]: time="2025-09-03T23:27:01.741633162Z" level=info msg="connecting to shim 3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8" address="unix:///run/containerd/s/52bb0e325766c42473d7c40ca26f301cfe030fbb1e241ded0892e6ebc149fda5" protocol=ttrpc version=3
Sep 3 23:27:01.762644 systemd[1]: Started cri-containerd-3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8.scope - libcontainer container 3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8.
Sep 3 23:27:01.792344 containerd[1869]: time="2025-09-03T23:27:01.792319567Z" level=info msg="StartContainer for \"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\" returns successfully"
Sep 3 23:27:02.414547 kubelet[3426]: E0903 23:27:02.414163 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226"
Sep 3 23:27:02.426509 kubelet[3426]: I0903 23:27:02.426380 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:27:02.981982 containerd[1869]: time="2025-09-03T23:27:02.981940876Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 3 23:27:02.984308 systemd[1]: cri-containerd-3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8.scope: Deactivated successfully.
Sep 3 23:27:02.986364 systemd[1]: cri-containerd-3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8.scope: Consumed 311ms CPU time, 188.7M memory peak, 165.8M written to disk.
Sep 3 23:27:02.988856 containerd[1869]: time="2025-09-03T23:27:02.988775900Z" level=info msg="received exit event container_id:\"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\" id:\"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\" pid:4165 exited_at:{seconds:1756942022 nanos:988473060}"
Sep 3 23:27:02.989312 containerd[1869]: time="2025-09-03T23:27:02.989190353Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\" id:\"3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8\" pid:4165 exited_at:{seconds:1756942022 nanos:988473060}"
Sep 3 23:27:03.004354 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3706d271cabb1a62187af2b76e496b302bb0af3300e354c15951ccd475e0a2a8-rootfs.mount: Deactivated successfully.
Sep 3 23:27:03.032263 kubelet[3426]: I0903 23:27:03.032169 3426 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 3 23:27:03.910375 systemd[1]: Created slice kubepods-besteffort-podd14a611a_4e62_4f6e_bad2_5b7a8bb81f0d.slice - libcontainer container kubepods-besteffort-podd14a611a_4e62_4f6e_bad2_5b7a8bb81f0d.slice.
Sep 3 23:27:03.921132 systemd[1]: Created slice kubepods-burstable-pod31860c45_46a5_4bc4_8d23_e0d3f50c8c6b.slice - libcontainer container kubepods-burstable-pod31860c45_46a5_4bc4_8d23_e0d3f50c8c6b.slice.
Sep 3 23:27:03.927819 systemd[1]: Created slice kubepods-besteffort-pod3257a910_0dfb_493a_b923_e6215d245226.slice - libcontainer container kubepods-besteffort-pod3257a910_0dfb_493a_b923_e6215d245226.slice.
Sep 3 23:27:03.934186 containerd[1869]: time="2025-09-03T23:27:03.934030891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6x5gb,Uid:3257a910-0dfb-493a-b923-e6215d245226,Namespace:calico-system,Attempt:0,}"
Sep 3 23:27:03.937006 systemd[1]: Created slice kubepods-burstable-pod7320caad_e77e_46f7_ae51_16ff07f2bafa.slice - libcontainer container kubepods-burstable-pod7320caad_e77e_46f7_ae51_16ff07f2bafa.slice.
Sep 3 23:27:03.947741 systemd[1]: Created slice kubepods-besteffort-poddd28fb5e_4963_4ee1_9d8f_b3d73564708c.slice - libcontainer container kubepods-besteffort-poddd28fb5e_4963_4ee1_9d8f_b3d73564708c.slice.
Sep 3 23:27:03.963673 systemd[1]: Created slice kubepods-besteffort-pod0ee4a8e2_5e1b_4954_b402_9fae5592724a.slice - libcontainer container kubepods-besteffort-pod0ee4a8e2_5e1b_4954_b402_9fae5592724a.slice.
Sep 3 23:27:03.972051 systemd[1]: Created slice kubepods-besteffort-pod03fb5889_accd_4288_86d3_895ee8b136f5.slice - libcontainer container kubepods-besteffort-pod03fb5889_accd_4288_86d3_895ee8b136f5.slice.
Sep 3 23:27:03.976266 systemd[1]: Created slice kubepods-besteffort-podde505837_0b0f_4898_99c8_73a171069dca.slice - libcontainer container kubepods-besteffort-podde505837_0b0f_4898_99c8_73a171069dca.slice.
Sep 3 23:27:03.982257 systemd[1]: Created slice kubepods-besteffort-podc4ad7e75_cfe6_44c8_9079_3c41c5e29937.slice - libcontainer container kubepods-besteffort-podc4ad7e75_cfe6_44c8_9079_3c41c5e29937.slice.
Sep 3 23:27:04.013401 containerd[1869]: time="2025-09-03T23:27:04.013344916Z" level=error msg="Failed to destroy network for sandbox \"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.014962 systemd[1]: run-netns-cni\x2d80edaa5b\x2d4658\x2dbc9f\x2d61f4\x2df82c17eb4790.mount: Deactivated successfully.
Sep 3 23:27:04.018693 containerd[1869]: time="2025-09-03T23:27:04.018282501Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6x5gb,Uid:3257a910-0dfb-493a-b923-e6215d245226,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.018920 kubelet[3426]: E0903 23:27:04.018879 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.019285 kubelet[3426]: E0903 23:27:04.018975 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6x5gb" Sep 3 23:27:04.019285 kubelet[3426]: E0903 23:27:04.019015 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6x5gb" Sep 3 
23:27:04.019285 kubelet[3426]: E0903 23:27:04.019057 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6x5gb_calico-system(3257a910-0dfb-493a-b923-e6215d245226)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6x5gb_calico-system(3257a910-0dfb-493a-b923-e6215d245226)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"386dc28910faa947bdca398a5b2de1f985a281716d4c9d0e7e3b154dc7f1ebd2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6x5gb" podUID="3257a910-0dfb-493a-b923-e6215d245226" Sep 3 23:27:04.043006 kubelet[3426]: I0903 23:27:04.042982 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxm8d\" (UniqueName: \"kubernetes.io/projected/0ee4a8e2-5e1b-4954-b402-9fae5592724a-kube-api-access-jxm8d\") pod \"calico-apiserver-5c8645d4dc-4m6vn\" (UID: \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\") " pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" Sep 3 23:27:04.043006 kubelet[3426]: I0903 23:27:04.043007 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-backend-key-pair\") pod \"whisker-56b9f764d-gnf8b\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") " pod="calico-system/whisker-56b9f764d-gnf8b" Sep 3 23:27:04.043116 kubelet[3426]: I0903 23:27:04.043025 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31860c45-46a5-4bc4-8d23-e0d3f50c8c6b-config-volume\") pod \"coredns-674b8bbfcf-sh2fm\" (UID: \"31860c45-46a5-4bc4-8d23-e0d3f50c8c6b\") " 
pod="kube-system/coredns-674b8bbfcf-sh2fm" Sep 3 23:27:04.043116 kubelet[3426]: I0903 23:27:04.043038 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03fb5889-accd-4288-86d3-895ee8b136f5-calico-apiserver-certs\") pod \"calico-apiserver-5c8645d4dc-8s4zv\" (UID: \"03fb5889-accd-4288-86d3-895ee8b136f5\") " pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv" Sep 3 23:27:04.043116 kubelet[3426]: I0903 23:27:04.043048 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g98l\" (UniqueName: \"kubernetes.io/projected/7320caad-e77e-46f7-ae51-16ff07f2bafa-kube-api-access-4g98l\") pod \"coredns-674b8bbfcf-4tqph\" (UID: \"7320caad-e77e-46f7-ae51-16ff07f2bafa\") " pod="kube-system/coredns-674b8bbfcf-4tqph" Sep 3 23:27:04.043116 kubelet[3426]: I0903 23:27:04.043060 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzrj\" (UniqueName: \"kubernetes.io/projected/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-kube-api-access-dnzrj\") pod \"whisker-56b9f764d-gnf8b\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") " pod="calico-system/whisker-56b9f764d-gnf8b" Sep 3 23:27:04.043116 kubelet[3426]: I0903 23:27:04.043080 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76l7\" (UniqueName: \"kubernetes.io/projected/dd28fb5e-4963-4ee1-9d8f-b3d73564708c-kube-api-access-q76l7\") pod \"calico-kube-controllers-58676dd4cc-b8ngc\" (UID: \"dd28fb5e-4963-4ee1-9d8f-b3d73564708c\") " pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" Sep 3 23:27:04.043196 kubelet[3426]: I0903 23:27:04.043089 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/0ee4a8e2-5e1b-4954-b402-9fae5592724a-calico-apiserver-certs\") pod \"calico-apiserver-5c8645d4dc-4m6vn\" (UID: \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\") " pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" Sep 3 23:27:04.043196 kubelet[3426]: I0903 23:27:04.043100 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-ca-bundle\") pod \"whisker-56b9f764d-gnf8b\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") " pod="calico-system/whisker-56b9f764d-gnf8b" Sep 3 23:27:04.043196 kubelet[3426]: I0903 23:27:04.043109 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmsd\" (UniqueName: \"kubernetes.io/projected/31860c45-46a5-4bc4-8d23-e0d3f50c8c6b-kube-api-access-lbmsd\") pod \"coredns-674b8bbfcf-sh2fm\" (UID: \"31860c45-46a5-4bc4-8d23-e0d3f50c8c6b\") " pod="kube-system/coredns-674b8bbfcf-sh2fm" Sep 3 23:27:04.043196 kubelet[3426]: I0903 23:27:04.043122 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd28fb5e-4963-4ee1-9d8f-b3d73564708c-tigera-ca-bundle\") pod \"calico-kube-controllers-58676dd4cc-b8ngc\" (UID: \"dd28fb5e-4963-4ee1-9d8f-b3d73564708c\") " pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" Sep 3 23:27:04.043196 kubelet[3426]: I0903 23:27:04.043140 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4xs\" (UniqueName: \"kubernetes.io/projected/03fb5889-accd-4288-86d3-895ee8b136f5-kube-api-access-qg4xs\") pod \"calico-apiserver-5c8645d4dc-8s4zv\" (UID: \"03fb5889-accd-4288-86d3-895ee8b136f5\") " pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv" Sep 3 23:27:04.043272 kubelet[3426]: I0903 23:27:04.043150 3426 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7320caad-e77e-46f7-ae51-16ff07f2bafa-config-volume\") pod \"coredns-674b8bbfcf-4tqph\" (UID: \"7320caad-e77e-46f7-ae51-16ff07f2bafa\") " pod="kube-system/coredns-674b8bbfcf-4tqph" Sep 3 23:27:04.144396 kubelet[3426]: I0903 23:27:04.144355 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c4ad7e75-cfe6-44c8-9079-3c41c5e29937-goldmane-key-pair\") pod \"goldmane-54d579b49d-xfwsv\" (UID: \"c4ad7e75-cfe6-44c8-9079-3c41c5e29937\") " pod="calico-system/goldmane-54d579b49d-xfwsv" Sep 3 23:27:04.144396 kubelet[3426]: I0903 23:27:04.144384 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99vb\" (UniqueName: \"kubernetes.io/projected/c4ad7e75-cfe6-44c8-9079-3c41c5e29937-kube-api-access-d99vb\") pod \"goldmane-54d579b49d-xfwsv\" (UID: \"c4ad7e75-cfe6-44c8-9079-3c41c5e29937\") " pod="calico-system/goldmane-54d579b49d-xfwsv" Sep 3 23:27:04.144513 kubelet[3426]: I0903 23:27:04.144404 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ad7e75-cfe6-44c8-9079-3c41c5e29937-config\") pod \"goldmane-54d579b49d-xfwsv\" (UID: \"c4ad7e75-cfe6-44c8-9079-3c41c5e29937\") " pod="calico-system/goldmane-54d579b49d-xfwsv" Sep 3 23:27:04.144513 kubelet[3426]: I0903 23:27:04.144416 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/de505837-0b0f-4898-99c8-73a171069dca-calico-apiserver-certs\") pod \"calico-apiserver-776b7f98d6-vxp58\" (UID: \"de505837-0b0f-4898-99c8-73a171069dca\") " pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58" Sep 3 23:27:04.144513 
kubelet[3426]: I0903 23:27:04.144467 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lmt\" (UniqueName: \"kubernetes.io/projected/de505837-0b0f-4898-99c8-73a171069dca-kube-api-access-r9lmt\") pod \"calico-apiserver-776b7f98d6-vxp58\" (UID: \"de505837-0b0f-4898-99c8-73a171069dca\") " pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58" Sep 3 23:27:04.144513 kubelet[3426]: I0903 23:27:04.144500 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4ad7e75-cfe6-44c8-9079-3c41c5e29937-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-xfwsv\" (UID: \"c4ad7e75-cfe6-44c8-9079-3c41c5e29937\") " pod="calico-system/goldmane-54d579b49d-xfwsv" Sep 3 23:27:04.216273 containerd[1869]: time="2025-09-03T23:27:04.216184442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56b9f764d-gnf8b,Uid:d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:04.226902 containerd[1869]: time="2025-09-03T23:27:04.226859569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-sh2fm,Uid:31860c45-46a5-4bc4-8d23-e0d3f50c8c6b,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:04.255131 containerd[1869]: time="2025-09-03T23:27:04.254948765Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4tqph,Uid:7320caad-e77e-46f7-ae51-16ff07f2bafa,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:04.261788 containerd[1869]: time="2025-09-03T23:27:04.261767148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58676dd4cc-b8ngc,Uid:dd28fb5e-4963-4ee1-9d8f-b3d73564708c,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:04.269394 containerd[1869]: time="2025-09-03T23:27:04.269371434Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-4m6vn,Uid:0ee4a8e2-5e1b-4954-b402-9fae5592724a,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:04.275031 containerd[1869]: time="2025-09-03T23:27:04.274954029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-8s4zv,Uid:03fb5889-accd-4288-86d3-895ee8b136f5,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:04.281121 containerd[1869]: time="2025-09-03T23:27:04.281093448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vxp58,Uid:de505837-0b0f-4898-99c8-73a171069dca,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:04.284748 containerd[1869]: time="2025-09-03T23:27:04.284657688Z" level=error msg="Failed to destroy network for sandbox \"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.286329 containerd[1869]: time="2025-09-03T23:27:04.286304544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xfwsv,Uid:c4ad7e75-cfe6-44c8-9079-3c41c5e29937,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:04.291918 containerd[1869]: time="2025-09-03T23:27:04.291882882Z" level=error msg="Failed to destroy network for sandbox \"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.388267 containerd[1869]: time="2025-09-03T23:27:04.388135826Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-sh2fm,Uid:31860c45-46a5-4bc4-8d23-e0d3f50c8c6b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.388377 kubelet[3426]: E0903 23:27:04.388327 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.388429 kubelet[3426]: E0903 23:27:04.388393 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-sh2fm" Sep 3 23:27:04.388429 kubelet[3426]: E0903 23:27:04.388410 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-sh2fm" Sep 3 23:27:04.388481 kubelet[3426]: E0903 23:27:04.388460 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-sh2fm_kube-system(31860c45-46a5-4bc4-8d23-e0d3f50c8c6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-674b8bbfcf-sh2fm_kube-system(31860c45-46a5-4bc4-8d23-e0d3f50c8c6b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d89b98e4c8bc2a8c7c8246f1abbadbaed16ee34b50a2b3afeb6b2b9aca5077d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-sh2fm" podUID="31860c45-46a5-4bc4-8d23-e0d3f50c8c6b" Sep 3 23:27:04.396715 containerd[1869]: time="2025-09-03T23:27:04.396682172Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56b9f764d-gnf8b,Uid:d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.396926 kubelet[3426]: E0903 23:27:04.396903 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.397003 kubelet[3426]: E0903 23:27:04.396937 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-56b9f764d-gnf8b" Sep 3 23:27:04.397003 kubelet[3426]: E0903 23:27:04.396950 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-56b9f764d-gnf8b" Sep 3 23:27:04.397003 kubelet[3426]: E0903 23:27:04.396979 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-56b9f764d-gnf8b_calico-system(d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-56b9f764d-gnf8b_calico-system(d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2d5b0a7a19965e598092b72ec10660036d5989636ab00ffcafb5785ccaf8732\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-56b9f764d-gnf8b" podUID="d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d" Sep 3 23:27:04.470147 containerd[1869]: time="2025-09-03T23:27:04.470038768Z" level=error msg="Failed to destroy network for sandbox \"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.475806 containerd[1869]: time="2025-09-03T23:27:04.475637091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58676dd4cc-b8ngc,Uid:dd28fb5e-4963-4ee1-9d8f-b3d73564708c,Namespace:calico-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.475923 kubelet[3426]: E0903 23:27:04.475835 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.475923 kubelet[3426]: E0903 23:27:04.475887 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" Sep 3 23:27:04.475923 kubelet[3426]: E0903 23:27:04.475905 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" Sep 3 23:27:04.476489 kubelet[3426]: E0903 23:27:04.475947 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-kube-controllers-58676dd4cc-b8ngc_calico-system(dd28fb5e-4963-4ee1-9d8f-b3d73564708c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58676dd4cc-b8ngc_calico-system(dd28fb5e-4963-4ee1-9d8f-b3d73564708c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"77024cb100a064526ca1731a622734bf7a553fc1c7943c597fe175685fba14df\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" podUID="dd28fb5e-4963-4ee1-9d8f-b3d73564708c" Sep 3 23:27:04.478331 containerd[1869]: time="2025-09-03T23:27:04.478298785Z" level=error msg="Failed to destroy network for sandbox \"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.487369 containerd[1869]: time="2025-09-03T23:27:04.487337272Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4tqph,Uid:7320caad-e77e-46f7-ae51-16ff07f2bafa,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.487773 kubelet[3426]: E0903 23:27:04.487640 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.487773 kubelet[3426]: E0903 23:27:04.487692 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4tqph" Sep 3 23:27:04.487773 kubelet[3426]: E0903 23:27:04.487709 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4tqph" Sep 3 23:27:04.487880 kubelet[3426]: E0903 23:27:04.487741 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-4tqph_kube-system(7320caad-e77e-46f7-ae51-16ff07f2bafa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-4tqph_kube-system(7320caad-e77e-46f7-ae51-16ff07f2bafa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b079d1ad0caef7e722e88887eca311c5ed72f81ec4fac85209cd159df902165\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-4tqph" podUID="7320caad-e77e-46f7-ae51-16ff07f2bafa" Sep 3 23:27:04.500515 containerd[1869]: time="2025-09-03T23:27:04.500401462Z" level=error msg="Failed to destroy network 
for sandbox \"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.507352 containerd[1869]: time="2025-09-03T23:27:04.506370204Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-4m6vn,Uid:0ee4a8e2-5e1b-4954-b402-9fae5592724a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.508022 kubelet[3426]: E0903 23:27:04.507976 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.508022 kubelet[3426]: E0903 23:27:04.508017 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" Sep 3 23:27:04.508263 kubelet[3426]: E0903 23:27:04.508031 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" Sep 3 23:27:04.508263 kubelet[3426]: E0903 23:27:04.508065 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c8645d4dc-4m6vn_calico-apiserver(0ee4a8e2-5e1b-4954-b402-9fae5592724a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c8645d4dc-4m6vn_calico-apiserver(0ee4a8e2-5e1b-4954-b402-9fae5592724a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6eb6c406d4a07acfd253e9ee756f01ba7de8af8a50b198449b4837143959771\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" podUID="0ee4a8e2-5e1b-4954-b402-9fae5592724a" Sep 3 23:27:04.513316 containerd[1869]: time="2025-09-03T23:27:04.513202723Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 3 23:27:04.527692 containerd[1869]: time="2025-09-03T23:27:04.527618216Z" level=error msg="Failed to destroy network for sandbox \"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:04.530921 containerd[1869]: time="2025-09-03T23:27:04.530896727Z" level=error msg="Failed to destroy network for sandbox \"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.531126 containerd[1869]: time="2025-09-03T23:27:04.531095349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vxp58,Uid:de505837-0b0f-4898-99c8-73a171069dca,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.531328 kubelet[3426]: E0903 23:27:04.531237 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.531328 kubelet[3426]: E0903 23:27:04.531274 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58"
Sep 3 23:27:04.531328 kubelet[3426]: E0903 23:27:04.531286 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58"
Sep 3 23:27:04.531462 kubelet[3426]: E0903 23:27:04.531321 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-776b7f98d6-vxp58_calico-apiserver(de505837-0b0f-4898-99c8-73a171069dca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-776b7f98d6-vxp58_calico-apiserver(de505837-0b0f-4898-99c8-73a171069dca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4d5713ec0e4cc978b2721f85376fb714ad352818af86495288f7eaa4e8dc6d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58" podUID="de505837-0b0f-4898-99c8-73a171069dca"
Sep 3 23:27:04.533642 containerd[1869]: time="2025-09-03T23:27:04.533598830Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-8s4zv,Uid:03fb5889-accd-4288-86d3-895ee8b136f5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.533970 kubelet[3426]: E0903 23:27:04.533858 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.533970 kubelet[3426]: E0903 23:27:04.533891 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv"
Sep 3 23:27:04.533970 kubelet[3426]: E0903 23:27:04.533904 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv"
Sep 3 23:27:04.534094 kubelet[3426]: E0903 23:27:04.533933 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c8645d4dc-8s4zv_calico-apiserver(03fb5889-accd-4288-86d3-895ee8b136f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c8645d4dc-8s4zv_calico-apiserver(03fb5889-accd-4288-86d3-895ee8b136f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"42ee39425e91d865422c70938c75d73a21b88d8d0893c09c679fdc213d7296c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv" podUID="03fb5889-accd-4288-86d3-895ee8b136f5"
Sep 3 23:27:04.535548 containerd[1869]: time="2025-09-03T23:27:04.535502878Z" level=error msg="Failed to destroy network for sandbox \"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.538376 containerd[1869]: time="2025-09-03T23:27:04.538345609Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xfwsv,Uid:c4ad7e75-cfe6-44c8-9079-3c41c5e29937,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.538608 kubelet[3426]: E0903 23:27:04.538466 3426 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 3 23:27:04.538608 kubelet[3426]: E0903 23:27:04.538495 3426 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xfwsv"
Sep 3 23:27:04.538608 kubelet[3426]: E0903 23:27:04.538511 3426 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xfwsv"
Sep 3 23:27:04.538763 kubelet[3426]: E0903 23:27:04.538745 3426 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-xfwsv_calico-system(c4ad7e75-cfe6-44c8-9079-3c41c5e29937)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-xfwsv_calico-system(c4ad7e75-cfe6-44c8-9079-3c41c5e29937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7961f75b7d1a4df8ffddb41ddf5286601490e5d2b9169ff9c1e2108da2feb14e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-xfwsv" podUID="c4ad7e75-cfe6-44c8-9079-3c41c5e29937"
Sep 3 23:27:08.253666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1738256904.mount: Deactivated successfully.
Sep 3 23:27:09.256557 containerd[1869]: time="2025-09-03T23:27:09.256131642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:09.309975 containerd[1869]: time="2025-09-03T23:27:09.309820113Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457"
Sep 3 23:27:09.547403 containerd[1869]: time="2025-09-03T23:27:09.547145300Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:09.555572 containerd[1869]: time="2025-09-03T23:27:09.555512568Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:09.555973 containerd[1869]: time="2025-09-03T23:27:09.555950965Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 5.042630423s"
Sep 3 23:27:09.556021 containerd[1869]: time="2025-09-03T23:27:09.555977334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\""
Sep 3 23:27:09.573945 containerd[1869]: time="2025-09-03T23:27:09.573920026Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Sep 3 23:27:09.616761 containerd[1869]: time="2025-09-03T23:27:09.616600711Z" level=info msg="Container f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:27:09.620786 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3952910261.mount: Deactivated successfully.
Sep 3 23:27:09.643677 containerd[1869]: time="2025-09-03T23:27:09.643644108Z" level=info msg="CreateContainer within sandbox \"614038eb0a9e406f09524285896fe5d00a1338d5cc659e84672f325005660cbc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\""
Sep 3 23:27:09.645290 containerd[1869]: time="2025-09-03T23:27:09.645242954Z" level=info msg="StartContainer for \"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\""
Sep 3 23:27:09.646323 containerd[1869]: time="2025-09-03T23:27:09.646284889Z" level=info msg="connecting to shim f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1" address="unix:///run/containerd/s/52bb0e325766c42473d7c40ca26f301cfe030fbb1e241ded0892e6ebc149fda5" protocol=ttrpc version=3
Sep 3 23:27:09.664705 systemd[1]: Started cri-containerd-f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1.scope - libcontainer container f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1.
Sep 3 23:27:09.699036 containerd[1869]: time="2025-09-03T23:27:09.699012371Z" level=info msg="StartContainer for \"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" returns successfully"
Sep 3 23:27:10.071652 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Sep 3 23:27:10.071766 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Sep 3 23:27:10.283636 kubelet[3426]: I0903 23:27:10.283514 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzrj\" (UniqueName: \"kubernetes.io/projected/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-kube-api-access-dnzrj\") pod \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") "
Sep 3 23:27:10.283636 kubelet[3426]: I0903 23:27:10.283643 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-ca-bundle\") pod \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") "
Sep 3 23:27:10.283995 kubelet[3426]: I0903 23:27:10.283661 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-backend-key-pair\") pod \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\" (UID: \"d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d\") "
Sep 3 23:27:10.289606 kubelet[3426]: I0903 23:27:10.289520 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d" (UID: "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Sep 3 23:27:10.291006 kubelet[3426]: I0903 23:27:10.290978 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d" (UID: "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Sep 3 23:27:10.291592 kubelet[3426]: I0903 23:27:10.291521 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-kube-api-access-dnzrj" (OuterVolumeSpecName: "kube-api-access-dnzrj") pod "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d" (UID: "d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d"). InnerVolumeSpecName "kube-api-access-dnzrj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Sep 3 23:27:10.384683 kubelet[3426]: I0903 23:27:10.384593 3426 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-backend-key-pair\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\""
Sep 3 23:27:10.384683 kubelet[3426]: I0903 23:27:10.384622 3426 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnzrj\" (UniqueName: \"kubernetes.io/projected/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-kube-api-access-dnzrj\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\""
Sep 3 23:27:10.384683 kubelet[3426]: I0903 23:27:10.384631 3426 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d-whisker-ca-bundle\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\""
Sep 3 23:27:10.419359 systemd[1]: Removed slice kubepods-besteffort-podd14a611a_4e62_4f6e_bad2_5b7a8bb81f0d.slice - libcontainer container kubepods-besteffort-podd14a611a_4e62_4f6e_bad2_5b7a8bb81f0d.slice.
Sep 3 23:27:10.551319 kubelet[3426]: I0903 23:27:10.551268 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-g2jc7" podStartSLOduration=1.627896813 podStartE2EDuration="16.551253154s" podCreationTimestamp="2025-09-03 23:26:54 +0000 UTC" firstStartedPulling="2025-09-03 23:26:54.633226411 +0000 UTC m=+18.391741050" lastFinishedPulling="2025-09-03 23:27:09.556582752 +0000 UTC m=+33.315097391" observedRunningTime="2025-09-03 23:27:10.540509624 +0000 UTC m=+34.299024263" watchObservedRunningTime="2025-09-03 23:27:10.551253154 +0000 UTC m=+34.309767793"
Sep 3 23:27:10.562604 systemd[1]: var-lib-kubelet-pods-d14a611a\x2d4e62\x2d4f6e\x2dbad2\x2d5b7a8bb81f0d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddnzrj.mount: Deactivated successfully.
Sep 3 23:27:10.562687 systemd[1]: var-lib-kubelet-pods-d14a611a\x2d4e62\x2d4f6e\x2dbad2\x2d5b7a8bb81f0d-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully.
Sep 3 23:27:10.610022 systemd[1]: Created slice kubepods-besteffort-pod6a4baa2e_5d42_4596_ab9d_18fda5c8cfa7.slice - libcontainer container kubepods-besteffort-pod6a4baa2e_5d42_4596_ab9d_18fda5c8cfa7.slice.
Sep 3 23:27:10.686731 kubelet[3426]: I0903 23:27:10.686638 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rps\" (UniqueName: \"kubernetes.io/projected/6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7-kube-api-access-h5rps\") pod \"whisker-54fd6d9bd7-8z6q9\" (UID: \"6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7\") " pod="calico-system/whisker-54fd6d9bd7-8z6q9"
Sep 3 23:27:10.687102 kubelet[3426]: I0903 23:27:10.687031 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7-whisker-backend-key-pair\") pod \"whisker-54fd6d9bd7-8z6q9\" (UID: \"6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7\") " pod="calico-system/whisker-54fd6d9bd7-8z6q9"
Sep 3 23:27:10.687102 kubelet[3426]: I0903 23:27:10.687055 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7-whisker-ca-bundle\") pod \"whisker-54fd6d9bd7-8z6q9\" (UID: \"6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7\") " pod="calico-system/whisker-54fd6d9bd7-8z6q9"
Sep 3 23:27:10.913702 containerd[1869]: time="2025-09-03T23:27:10.913639613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54fd6d9bd7-8z6q9,Uid:6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7,Namespace:calico-system,Attempt:0,}"
Sep 3 23:27:11.059388 systemd-networkd[1697]: cali2b6721cc4f2: Link UP
Sep 3 23:27:11.060134 systemd-networkd[1697]: cali2b6721cc4f2: Gained carrier
Sep 3 23:27:11.079796 containerd[1869]: 2025-09-03 23:27:10.939 [INFO][4525] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 3 23:27:11.079796 containerd[1869]: 2025-09-03 23:27:10.970 [INFO][4525] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0 whisker-54fd6d9bd7- calico-system 6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7 896 0 2025-09-03 23:27:10 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:54fd6d9bd7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 whisker-54fd6d9bd7-8z6q9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2b6721cc4f2 [] [] }} ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-"
Sep 3 23:27:11.079796 containerd[1869]: 2025-09-03 23:27:10.970 [INFO][4525] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.079796 containerd[1869]: 2025-09-03 23:27:10.987 [INFO][4533] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" HandleID="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Workload="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.987 [INFO][4533] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" HandleID="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Workload="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"whisker-54fd6d9bd7-8z6q9", "timestamp":"2025-09-03 23:27:10.987267097 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.987 [INFO][4533] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.987 [INFO][4533] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.987 [INFO][4533] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05'
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.992 [INFO][4533] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.995 [INFO][4533] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.997 [INFO][4533] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:10.998 [INFO][4533] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.079975 containerd[1869]: 2025-09-03 23:27:11.001 [INFO][4533] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.001 [INFO][4533] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.002 [INFO][4533] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.006 [INFO][4533] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.014 [INFO][4533] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.70.1/26] block=192.168.70.0/26 handle="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.014 [INFO][4533] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.1/26] handle="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" host="ci-4372.1.0-n-989a023a05"
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.014 [INFO][4533] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 3 23:27:11.080105 containerd[1869]: 2025-09-03 23:27:11.014 [INFO][4533] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.1/26] IPv6=[] ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" HandleID="k8s-pod-network.148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Workload="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.080194 containerd[1869]: 2025-09-03 23:27:11.017 [INFO][4525] cni-plugin/k8s.go 418: Populated endpoint ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0", GenerateName:"whisker-54fd6d9bd7-", Namespace:"calico-system", SelfLink:"", UID:"6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54fd6d9bd7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"whisker-54fd6d9bd7-8z6q9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2b6721cc4f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 3 23:27:11.080194 containerd[1869]: 2025-09-03 23:27:11.017 [INFO][4525] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.1/32] ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.080271 containerd[1869]: 2025-09-03 23:27:11.017 [INFO][4525] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b6721cc4f2 ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.080271 containerd[1869]: 2025-09-03 23:27:11.060 [INFO][4525] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.080305 containerd[1869]: 2025-09-03 23:27:11.061 [INFO][4525] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0", GenerateName:"whisker-54fd6d9bd7-", Namespace:"calico-system", SelfLink:"", UID:"6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54fd6d9bd7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a", Pod:"whisker-54fd6d9bd7-8z6q9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2b6721cc4f2", MAC:"ae:6d:81:fc:47:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 3 23:27:11.080360 containerd[1869]: 2025-09-03 23:27:11.074 [INFO][4525] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" Namespace="calico-system" Pod="whisker-54fd6d9bd7-8z6q9" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-whisker--54fd6d9bd7--8z6q9-eth0"
Sep 3 23:27:11.128854 containerd[1869]: time="2025-09-03T23:27:11.128821371Z" level=info msg="connecting to shim 148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a" address="unix:///run/containerd/s/7e9524e95cc900cf9a7edd898fd1e7804edf9a5a50995eec162c71de4a644a26" namespace=k8s.io protocol=ttrpc version=3
Sep 3 23:27:11.143659 systemd[1]: Started cri-containerd-148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a.scope - libcontainer container 148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a.
Sep 3 23:27:11.170971 containerd[1869]: time="2025-09-03T23:27:11.170938495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54fd6d9bd7-8z6q9,Uid:6a4baa2e-5d42-4596-ab9d-18fda5c8cfa7,Namespace:calico-system,Attempt:0,} returns sandbox id \"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a\""
Sep 3 23:27:11.173022 containerd[1869]: time="2025-09-03T23:27:11.172419507Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\""
Sep 3 23:27:11.642786 kubelet[3426]: I0903 23:27:11.642751 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:27:11.713002 containerd[1869]: time="2025-09-03T23:27:11.712963844Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"b27d1e1ee5833e33c3ad6ec689247cdd0e4c34146ab97f80c1d63d637edd79cf\" pid:4718 exit_status:1 exited_at:{seconds:1756942031 nanos:712727979}"
Sep 3 23:27:11.774697 containerd[1869]: time="2025-09-03T23:27:11.774670546Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"52a7335a5567cd1e8f7f43014627fcbd7aa6196c3824e57c40fa878396a21ee8\" pid:4750 exit_status:1 exited_at:{seconds:1756942031 nanos:774459738}"
Sep 3 23:27:11.933167 systemd-networkd[1697]: vxlan.calico: Link UP
Sep 3 23:27:11.933173 systemd-networkd[1697]: vxlan.calico: Gained carrier
Sep 3 23:27:12.416863 kubelet[3426]: I0903 23:27:12.416720 3426 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d" path="/var/lib/kubelet/pods/d14a611a-4e62-4f6e-bad2-5b7a8bb81f0d/volumes"
Sep 3 23:27:12.459513 containerd[1869]: time="2025-09-03T23:27:12.459473603Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:12.465637 containerd[1869]: time="2025-09-03T23:27:12.465610091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606"
Sep 3 23:27:12.470119 containerd[1869]: time="2025-09-03T23:27:12.470094528Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:12.474819 containerd[1869]: time="2025-09-03T23:27:12.474783156Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:12.475363 containerd[1869]: time="2025-09-03T23:27:12.475326510Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.302871691s"
Sep 3 23:27:12.475502 containerd[1869]: time="2025-09-03T23:27:12.475349742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\""
Sep 3 23:27:12.482166 containerd[1869]: time="2025-09-03T23:27:12.481484294Z" level=info msg="CreateContainer within sandbox \"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Sep 3 23:27:12.496504 containerd[1869]: time="2025-09-03T23:27:12.496480191Z" level=info msg="Container c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:27:12.515182 containerd[1869]: time="2025-09-03T23:27:12.515142921Z" level=info msg="CreateContainer within sandbox \"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1\""
Sep 3 23:27:12.517033 containerd[1869]: time="2025-09-03T23:27:12.515576770Z" level=info msg="StartContainer for \"c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1\""
Sep 3 23:27:12.517236 containerd[1869]: time="2025-09-03T23:27:12.517214175Z" level=info msg="connecting to shim c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1" address="unix:///run/containerd/s/7e9524e95cc900cf9a7edd898fd1e7804edf9a5a50995eec162c71de4a644a26" protocol=ttrpc version=3
Sep 3 23:27:12.537647 systemd[1]: Started cri-containerd-c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1.scope - libcontainer container c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1.
Sep 3 23:27:12.568705 containerd[1869]: time="2025-09-03T23:27:12.568682242Z" level=info msg="StartContainer for \"c2a7dd7bd95c75806c2b08c24eac4f86e5bcdc40a283746f8a296fc0ae6738c1\" returns successfully"
Sep 3 23:27:12.569722 containerd[1869]: time="2025-09-03T23:27:12.569691652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 3 23:27:12.739658 systemd-networkd[1697]: cali2b6721cc4f2: Gained IPv6LL
Sep 3 23:27:13.827696 systemd-networkd[1697]: vxlan.calico: Gained IPv6LL
Sep 3 23:27:14.144052 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3383448337.mount: Deactivated successfully.
Sep 3 23:27:14.192862 containerd[1869]: time="2025-09-03T23:27:14.192821939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:14.195888 containerd[1869]: time="2025-09-03T23:27:14.195766347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 3 23:27:14.198481 containerd[1869]: time="2025-09-03T23:27:14.198453826Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:14.201957 containerd[1869]: time="2025-09-03T23:27:14.201523171Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:14.201957 containerd[1869]: time="2025-09-03T23:27:14.201858292Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.631908143s" Sep 3 23:27:14.201957 containerd[1869]: time="2025-09-03T23:27:14.201882852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 3 23:27:14.207782 containerd[1869]: time="2025-09-03T23:27:14.207761444Z" level=info msg="CreateContainer within sandbox \"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 3 23:27:14.226541 
containerd[1869]: time="2025-09-03T23:27:14.226177429Z" level=info msg="Container 86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:14.229233 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount586994203.mount: Deactivated successfully. Sep 3 23:27:14.242256 containerd[1869]: time="2025-09-03T23:27:14.242228654Z" level=info msg="CreateContainer within sandbox \"148417a6e3c3d688ba94cb4002d409dd1d7df9c32af6210b781852186108d30a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a\"" Sep 3 23:27:14.243202 containerd[1869]: time="2025-09-03T23:27:14.243162923Z" level=info msg="StartContainer for \"86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a\"" Sep 3 23:27:14.244173 containerd[1869]: time="2025-09-03T23:27:14.244042054Z" level=info msg="connecting to shim 86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a" address="unix:///run/containerd/s/7e9524e95cc900cf9a7edd898fd1e7804edf9a5a50995eec162c71de4a644a26" protocol=ttrpc version=3 Sep 3 23:27:14.262647 systemd[1]: Started cri-containerd-86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a.scope - libcontainer container 86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a. 
Sep 3 23:27:14.351649 containerd[1869]: time="2025-09-03T23:27:14.351615152Z" level=info msg="StartContainer for \"86189908395508cd4a69e4510c48474e8547ee2664f19b1a70d0122df908260a\" returns successfully" Sep 3 23:27:14.549788 kubelet[3426]: I0903 23:27:14.549691 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-54fd6d9bd7-8z6q9" podStartSLOduration=1.5192930919999998 podStartE2EDuration="4.549674781s" podCreationTimestamp="2025-09-03 23:27:10 +0000 UTC" firstStartedPulling="2025-09-03 23:27:11.17218806 +0000 UTC m=+34.930702699" lastFinishedPulling="2025-09-03 23:27:14.202569749 +0000 UTC m=+37.961084388" observedRunningTime="2025-09-03 23:27:14.548506417 +0000 UTC m=+38.307021056" watchObservedRunningTime="2025-09-03 23:27:14.549674781 +0000 UTC m=+38.308189420" Sep 3 23:27:15.415519 containerd[1869]: time="2025-09-03T23:27:15.415480427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xfwsv,Uid:c4ad7e75-cfe6-44c8-9079-3c41c5e29937,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:15.493250 systemd-networkd[1697]: cali24e74f334d4: Link UP Sep 3 23:27:15.494122 systemd-networkd[1697]: cali24e74f334d4: Gained carrier Sep 3 23:27:15.510000 containerd[1869]: 2025-09-03 23:27:15.443 [INFO][4903] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0 goldmane-54d579b49d- calico-system c4ad7e75-cfe6-44c8-9079-3c41c5e29937 832 0 2025-09-03 23:26:54 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 goldmane-54d579b49d-xfwsv eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali24e74f334d4 [] [] }} 
ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-" Sep 3 23:27:15.510000 containerd[1869]: 2025-09-03 23:27:15.443 [INFO][4903] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510000 containerd[1869]: 2025-09-03 23:27:15.459 [INFO][4915] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" HandleID="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Workload="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.460 [INFO][4915] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" HandleID="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Workload="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b120), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"goldmane-54d579b49d-xfwsv", "timestamp":"2025-09-03 23:27:15.459926075 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.460 [INFO][4915] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.460 [INFO][4915] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.460 [INFO][4915] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.465 [INFO][4915] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.468 [INFO][4915] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.470 [INFO][4915] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.471 [INFO][4915] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510168 containerd[1869]: 2025-09-03 23:27:15.473 [INFO][4915] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.473 [INFO][4915] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.474 [INFO][4915] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2 Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.478 [INFO][4915] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" 
host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.486 [INFO][4915] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.70.2/26] block=192.168.70.0/26 handle="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.486 [INFO][4915] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.2/26] handle="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.486 [INFO][4915] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:15.510303 containerd[1869]: 2025-09-03 23:27:15.486 [INFO][4915] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.2/26] IPv6=[] ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" HandleID="k8s-pod-network.943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Workload="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510440 containerd[1869]: 2025-09-03 23:27:15.488 [INFO][4903] cni-plugin/k8s.go 418: Populated endpoint ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"c4ad7e75-cfe6-44c8-9079-3c41c5e29937", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"goldmane-54d579b49d-xfwsv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali24e74f334d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:15.510440 containerd[1869]: 2025-09-03 23:27:15.488 [INFO][4903] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.2/32] ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510492 containerd[1869]: 2025-09-03 23:27:15.488 [INFO][4903] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24e74f334d4 ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510492 containerd[1869]: 2025-09-03 23:27:15.494 [INFO][4903] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.510521 containerd[1869]: 2025-09-03 23:27:15.494 [INFO][4903] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"c4ad7e75-cfe6-44c8-9079-3c41c5e29937", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2", Pod:"goldmane-54d579b49d-xfwsv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali24e74f334d4", MAC:"f6:3a:ed:f8:cb:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:15.510609 
containerd[1869]: 2025-09-03 23:27:15.506 [INFO][4903] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" Namespace="calico-system" Pod="goldmane-54d579b49d-xfwsv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-goldmane--54d579b49d--xfwsv-eth0" Sep 3 23:27:15.573519 containerd[1869]: time="2025-09-03T23:27:15.573452023Z" level=info msg="connecting to shim 943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2" address="unix:///run/containerd/s/9cd520a65b10b5091b79883b1801f70631c34d0713ae23a66cc9f15936471a23" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:15.589678 systemd[1]: Started cri-containerd-943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2.scope - libcontainer container 943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2. Sep 3 23:27:15.619588 containerd[1869]: time="2025-09-03T23:27:15.619561370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xfwsv,Uid:c4ad7e75-cfe6-44c8-9079-3c41c5e29937,Namespace:calico-system,Attempt:0,} returns sandbox id \"943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2\"" Sep 3 23:27:15.621595 containerd[1869]: time="2025-09-03T23:27:15.621578673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 3 23:27:16.416019 containerd[1869]: time="2025-09-03T23:27:16.415526398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-sh2fm,Uid:31860c45-46a5-4bc4-8d23-e0d3f50c8c6b,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:16.416414 containerd[1869]: time="2025-09-03T23:27:16.416389441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58676dd4cc-b8ngc,Uid:dd28fb5e-4963-4ee1-9d8f-b3d73564708c,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:16.416831 containerd[1869]: time="2025-09-03T23:27:16.416583759Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-4m6vn,Uid:0ee4a8e2-5e1b-4954-b402-9fae5592724a,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:16.579784 systemd-networkd[1697]: cali24e74f334d4: Gained IPv6LL Sep 3 23:27:16.627383 systemd-networkd[1697]: calicf99d3e5a37: Link UP Sep 3 23:27:16.628349 systemd-networkd[1697]: calicf99d3e5a37: Gained carrier Sep 3 23:27:16.641036 containerd[1869]: 2025-09-03 23:27:16.540 [INFO][5000] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0 calico-apiserver-5c8645d4dc- calico-apiserver 0ee4a8e2-5e1b-4954-b402-9fae5592724a 829 0 2025-09-03 23:26:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c8645d4dc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 calico-apiserver-5c8645d4dc-4m6vn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicf99d3e5a37 [] [] }} ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-" Sep 3 23:27:16.641036 containerd[1869]: 2025-09-03 23:27:16.558 [INFO][5000] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.641036 containerd[1869]: 2025-09-03 23:27:16.589 [INFO][5018] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.589 [INFO][5018] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-989a023a05", "pod":"calico-apiserver-5c8645d4dc-4m6vn", "timestamp":"2025-09-03 23:27:16.588983017 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.589 [INFO][5018] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.589 [INFO][5018] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.589 [INFO][5018] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.595 [INFO][5018] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.598 [INFO][5018] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.601 [INFO][5018] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.602 [INFO][5018] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641224 containerd[1869]: 2025-09-03 23:27:16.604 [INFO][5018] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.604 [INFO][5018] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.605 [INFO][5018] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30 Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.609 [INFO][5018] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.618 [INFO][5018] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.3/26] block=192.168.70.0/26 handle="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.618 [INFO][5018] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.3/26] handle="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.618 [INFO][5018] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:16.641361 containerd[1869]: 2025-09-03 23:27:16.618 [INFO][5018] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.3/26] IPv6=[] ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.641453 containerd[1869]: 2025-09-03 23:27:16.620 [INFO][5000] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0", GenerateName:"calico-apiserver-5c8645d4dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ee4a8e2-5e1b-4954-b402-9fae5592724a", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5c8645d4dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"calico-apiserver-5c8645d4dc-4m6vn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicf99d3e5a37", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.641496 containerd[1869]: 2025-09-03 23:27:16.621 [INFO][5000] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.3/32] ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.641496 containerd[1869]: 2025-09-03 23:27:16.621 [INFO][5000] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicf99d3e5a37 ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.641496 containerd[1869]: 2025-09-03 23:27:16.628 [INFO][5000] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.642580 containerd[1869]: 2025-09-03 23:27:16.629 [INFO][5000] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0", GenerateName:"calico-apiserver-5c8645d4dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ee4a8e2-5e1b-4954-b402-9fae5592724a", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c8645d4dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30", Pod:"calico-apiserver-5c8645d4dc-4m6vn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicf99d3e5a37", MAC:"2a:94:89:8a:45:3e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.642644 containerd[1869]: 2025-09-03 23:27:16.638 [INFO][5000] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-4m6vn" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:16.735731 systemd-networkd[1697]: cali3e5a0fd43e3: Link UP Sep 3 23:27:16.737008 systemd-networkd[1697]: cali3e5a0fd43e3: Gained carrier Sep 3 23:27:16.757939 containerd[1869]: 2025-09-03 23:27:16.480 [INFO][4977] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0 coredns-674b8bbfcf- kube-system 31860c45-46a5-4bc4-8d23-e0d3f50c8c6b 825 0 2025-09-03 23:26:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 coredns-674b8bbfcf-sh2fm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3e5a0fd43e3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-" Sep 3 23:27:16.757939 containerd[1869]: 2025-09-03 23:27:16.558 [INFO][4977] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.757939 containerd[1869]: 2025-09-03 
23:27:16.591 [INFO][5016] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" HandleID="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.592 [INFO][5016] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" HandleID="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"coredns-674b8bbfcf-sh2fm", "timestamp":"2025-09-03 23:27:16.591564833 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.592 [INFO][5016] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.619 [INFO][5016] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.619 [INFO][5016] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.696 [INFO][5016] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.701 [INFO][5016] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.708 [INFO][5016] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.710 [INFO][5016] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758088 containerd[1869]: 2025-09-03 23:27:16.712 [INFO][5016] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.712 [INFO][5016] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.715 [INFO][5016] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1 Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.720 [INFO][5016] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.729 [INFO][5016] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.4/26] block=192.168.70.0/26 handle="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.730 [INFO][5016] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.4/26] handle="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.730 [INFO][5016] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:16.758235 containerd[1869]: 2025-09-03 23:27:16.730 [INFO][5016] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.4/26] IPv6=[] ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" HandleID="k8s-pod-network.d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.733 [INFO][4977] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"31860c45-46a5-4bc4-8d23-e0d3f50c8c6b", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"coredns-674b8bbfcf-sh2fm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e5a0fd43e3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.733 [INFO][4977] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.4/32] ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.733 [INFO][4977] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e5a0fd43e3 ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.742 [INFO][4977] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.743 [INFO][4977] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"31860c45-46a5-4bc4-8d23-e0d3f50c8c6b", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1", Pod:"coredns-674b8bbfcf-sh2fm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e5a0fd43e3", MAC:"9a:19:e8:6b:39:9f", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.758334 containerd[1869]: 2025-09-03 23:27:16.755 [INFO][4977] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" Namespace="kube-system" Pod="coredns-674b8bbfcf-sh2fm" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--sh2fm-eth0" Sep 3 23:27:16.803269 containerd[1869]: time="2025-09-03T23:27:16.803030724Z" level=info msg="connecting to shim fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" address="unix:///run/containerd/s/07a4528e5b3a84600c009f12e04329c46c49a68501ae45f4bd26e90eb40c5b6f" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:16.823225 containerd[1869]: time="2025-09-03T23:27:16.823194757Z" level=info msg="connecting to shim d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1" address="unix:///run/containerd/s/467b18a669257393a42efaf077573a30e5ad92bb5660e002489716960f46a6d8" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:16.840822 systemd[1]: Started cri-containerd-fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30.scope - libcontainer container fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30. Sep 3 23:27:16.848980 systemd[1]: Started cri-containerd-d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1.scope - libcontainer container d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1. 
Sep 3 23:27:16.865086 systemd-networkd[1697]: cali3de85f9942f: Link UP Sep 3 23:27:16.865819 systemd-networkd[1697]: cali3de85f9942f: Gained carrier Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.520 [INFO][4981] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0 calico-kube-controllers-58676dd4cc- calico-system dd28fb5e-4963-4ee1-9d8f-b3d73564708c 827 0 2025-09-03 23:26:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58676dd4cc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 calico-kube-controllers-58676dd4cc-b8ngc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3de85f9942f [] [] }} ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.559 [INFO][4981] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.596 [INFO][5026] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" HandleID="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" 
Workload="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.597 [INFO][5026] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" HandleID="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3650), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"calico-kube-controllers-58676dd4cc-b8ngc", "timestamp":"2025-09-03 23:27:16.596834756 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.597 [INFO][5026] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.730 [INFO][5026] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.730 [INFO][5026] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.796 [INFO][5026] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.801 [INFO][5026] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.816 [INFO][5026] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.819 [INFO][5026] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.821 [INFO][5026] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.821 [INFO][5026] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.823 [INFO][5026] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847 Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.831 [INFO][5026] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.856 [INFO][5026] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.5/26] block=192.168.70.0/26 handle="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.856 [INFO][5026] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.5/26] handle="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.856 [INFO][5026] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:16.906701 containerd[1869]: 2025-09-03 23:27:16.857 [INFO][5026] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.5/26] IPv6=[] ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" HandleID="k8s-pod-network.d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.860 [INFO][4981] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0", GenerateName:"calico-kube-controllers-58676dd4cc-", Namespace:"calico-system", SelfLink:"", UID:"dd28fb5e-4963-4ee1-9d8f-b3d73564708c", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"58676dd4cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"calico-kube-controllers-58676dd4cc-b8ngc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3de85f9942f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.861 [INFO][4981] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.5/32] ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.861 [INFO][4981] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3de85f9942f ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.866 [INFO][4981] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" 
Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.866 [INFO][4981] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0", GenerateName:"calico-kube-controllers-58676dd4cc-", Namespace:"calico-system", SelfLink:"", UID:"dd28fb5e-4963-4ee1-9d8f-b3d73564708c", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58676dd4cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847", Pod:"calico-kube-controllers-58676dd4cc-b8ngc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3de85f9942f", MAC:"b2:df:95:3a:f8:dd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.907713 containerd[1869]: 2025-09-03 23:27:16.899 [INFO][4981] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" Namespace="calico-system" Pod="calico-kube-controllers-58676dd4cc-b8ngc" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--kube--controllers--58676dd4cc--b8ngc-eth0" Sep 3 23:27:16.938066 containerd[1869]: time="2025-09-03T23:27:16.937982647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-4m6vn,Uid:0ee4a8e2-5e1b-4954-b402-9fae5592724a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\"" Sep 3 23:27:16.946110 containerd[1869]: time="2025-09-03T23:27:16.946085714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-sh2fm,Uid:31860c45-46a5-4bc4-8d23-e0d3f50c8c6b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1\"" Sep 3 23:27:16.951744 containerd[1869]: time="2025-09-03T23:27:16.951715784Z" level=info msg="CreateContainer within sandbox \"d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 3 23:27:16.979922 containerd[1869]: time="2025-09-03T23:27:16.979894136Z" level=info msg="Container 7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:16.999225 containerd[1869]: time="2025-09-03T23:27:16.999074274Z" level=info msg="CreateContainer within sandbox \"d3d55fea7a1bd97e886dd2f3b1c50f33204798274f4d070730c21c11c9efbaf1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns 
container id \"7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805\"" Sep 3 23:27:17.001367 containerd[1869]: time="2025-09-03T23:27:17.001339328Z" level=info msg="StartContainer for \"7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805\"" Sep 3 23:27:17.001699 containerd[1869]: time="2025-09-03T23:27:17.001648466Z" level=info msg="connecting to shim d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847" address="unix:///run/containerd/s/c75da0673a56bc4c45c98ba247b66094fa800a55f2bcda52a9d1ae9a1a0f72d4" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:17.002546 containerd[1869]: time="2025-09-03T23:27:17.002437634Z" level=info msg="connecting to shim 7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805" address="unix:///run/containerd/s/467b18a669257393a42efaf077573a30e5ad92bb5660e002489716960f46a6d8" protocol=ttrpc version=3 Sep 3 23:27:17.020789 systemd[1]: Started cri-containerd-7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805.scope - libcontainer container 7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805. Sep 3 23:27:17.021601 systemd[1]: Started cri-containerd-d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847.scope - libcontainer container d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847. 
Sep 3 23:27:17.083077 containerd[1869]: time="2025-09-03T23:27:17.083031874Z" level=info msg="StartContainer for \"7fe0927633a36b4a865b102419a2637464c29016054a72d2f84b92bb8b573805\" returns successfully" Sep 3 23:27:17.099907 containerd[1869]: time="2025-09-03T23:27:17.099714782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58676dd4cc-b8ngc,Uid:dd28fb5e-4963-4ee1-9d8f-b3d73564708c,Namespace:calico-system,Attempt:0,} returns sandbox id \"d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847\"" Sep 3 23:27:17.576824 kubelet[3426]: I0903 23:27:17.576629 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-sh2fm" podStartSLOduration=36.576375468 podStartE2EDuration="36.576375468s" podCreationTimestamp="2025-09-03 23:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:27:17.573794796 +0000 UTC m=+41.332309435" watchObservedRunningTime="2025-09-03 23:27:17.576375468 +0000 UTC m=+41.334890115" Sep 3 23:27:17.872180 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2651428413.mount: Deactivated successfully. 
Sep 3 23:27:18.228641 containerd[1869]: time="2025-09-03T23:27:18.228368718Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:18.234630 containerd[1869]: time="2025-09-03T23:27:18.234589831Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 3 23:27:18.237857 containerd[1869]: time="2025-09-03T23:27:18.237802754Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:18.241467 containerd[1869]: time="2025-09-03T23:27:18.241425859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:18.242000 containerd[1869]: time="2025-09-03T23:27:18.241894265Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.620104986s" Sep 3 23:27:18.242000 containerd[1869]: time="2025-09-03T23:27:18.241921274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 3 23:27:18.243901 containerd[1869]: time="2025-09-03T23:27:18.243862070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 3 23:27:18.248780 containerd[1869]: time="2025-09-03T23:27:18.248759982Z" level=info msg="CreateContainer within sandbox \"943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 3 23:27:18.266414 containerd[1869]: time="2025-09-03T23:27:18.265611696Z" level=info msg="Container cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:18.287492 containerd[1869]: time="2025-09-03T23:27:18.287463684Z" level=info msg="CreateContainer within sandbox \"943037f736ae40a86801014ebc0355f2b33e586e4278e1eb350add3f4a55bed2\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\"" Sep 3 23:27:18.288277 containerd[1869]: time="2025-09-03T23:27:18.288250716Z" level=info msg="StartContainer for \"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\"" Sep 3 23:27:18.294951 containerd[1869]: time="2025-09-03T23:27:18.294927099Z" level=info msg="connecting to shim cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d" address="unix:///run/containerd/s/9cd520a65b10b5091b79883b1801f70631c34d0713ae23a66cc9f15936471a23" protocol=ttrpc version=3 Sep 3 23:27:18.318642 systemd[1]: Started cri-containerd-cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d.scope - libcontainer container cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d. 
Sep 3 23:27:18.349939 containerd[1869]: time="2025-09-03T23:27:18.349915882Z" level=info msg="StartContainer for \"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" returns successfully" Sep 3 23:27:18.415188 containerd[1869]: time="2025-09-03T23:27:18.414972544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4tqph,Uid:7320caad-e77e-46f7-ae51-16ff07f2bafa,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:18.415340 containerd[1869]: time="2025-09-03T23:27:18.415321227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-8s4zv,Uid:03fb5889-accd-4288-86d3-895ee8b136f5,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:18.434959 systemd-networkd[1697]: cali3de85f9942f: Gained IPv6LL Sep 3 23:27:18.516995 systemd-networkd[1697]: cali488ec1b9ec6: Link UP Sep 3 23:27:18.517711 systemd-networkd[1697]: cali488ec1b9ec6: Gained carrier Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.463 [INFO][5292] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0 coredns-674b8bbfcf- kube-system 7320caad-e77e-46f7-ae51-16ff07f2bafa 826 0 2025-09-03 23:26:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 coredns-674b8bbfcf-4tqph eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali488ec1b9ec6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.463 [INFO][5292] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5318] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" HandleID="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5318] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" HandleID="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"coredns-674b8bbfcf-4tqph", "timestamp":"2025-09-03 23:27:18.486327529 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5318] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5318] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5318] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.491 [INFO][5318] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.494 [INFO][5318] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.497 [INFO][5318] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.498 [INFO][5318] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.499 [INFO][5318] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.499 [INFO][5318] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.500 [INFO][5318] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9 Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.504 [INFO][5318] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.512 [INFO][5318] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.6/26] block=192.168.70.0/26 handle="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.512 [INFO][5318] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.6/26] handle="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.512 [INFO][5318] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:18.536744 containerd[1869]: 2025-09-03 23:27:18.512 [INFO][5318] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.6/26] IPv6=[] ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" HandleID="k8s-pod-network.f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Workload="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.514 [INFO][5292] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7320caad-e77e-46f7-ae51-16ff07f2bafa", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"coredns-674b8bbfcf-4tqph", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali488ec1b9ec6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.514 [INFO][5292] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.6/32] ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.514 [INFO][5292] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali488ec1b9ec6 ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.518 [INFO][5292] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.518 [INFO][5292] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7320caad-e77e-46f7-ae51-16ff07f2bafa", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9", Pod:"coredns-674b8bbfcf-4tqph", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali488ec1b9ec6", MAC:"52:31:7e:9e:bd:b9", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.537798 containerd[1869]: 2025-09-03 23:27:18.533 [INFO][5292] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-4tqph" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-coredns--674b8bbfcf--4tqph-eth0" Sep 3 23:27:18.562642 systemd-networkd[1697]: calicf99d3e5a37: Gained IPv6LL Sep 3 23:27:18.607230 containerd[1869]: time="2025-09-03T23:27:18.607123341Z" level=info msg="connecting to shim f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9" address="unix:///run/containerd/s/f9f08ec4f6b0a18cb267957304ba14d23ae724d70f973dc042e82b36eb4b5785" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:18.626632 systemd-networkd[1697]: cali3e5a0fd43e3: Gained IPv6LL Sep 3 23:27:18.640739 systemd-networkd[1697]: calie95ddc4c419: Link UP Sep 3 23:27:18.640865 systemd-networkd[1697]: calie95ddc4c419: Gained carrier Sep 3 23:27:18.657764 systemd[1]: Started cri-containerd-f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9.scope - libcontainer container f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9. 
Sep 3 23:27:18.660849 kubelet[3426]: I0903 23:27:18.660760 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-xfwsv" podStartSLOduration=22.038618858 podStartE2EDuration="24.660743282s" podCreationTimestamp="2025-09-03 23:26:54 +0000 UTC" firstStartedPulling="2025-09-03 23:27:15.620621139 +0000 UTC m=+39.379135778" lastFinishedPulling="2025-09-03 23:27:18.242745563 +0000 UTC m=+42.001260202" observedRunningTime="2025-09-03 23:27:18.578643452 +0000 UTC m=+42.337158099" watchObservedRunningTime="2025-09-03 23:27:18.660743282 +0000 UTC m=+42.419257921" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.464 [INFO][5303] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0 calico-apiserver-5c8645d4dc- calico-apiserver 03fb5889-accd-4288-86d3-895ee8b136f5 830 0 2025-09-03 23:26:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c8645d4dc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 calico-apiserver-5c8645d4dc-8s4zv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie95ddc4c419 [] [] }} ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.464 [INFO][5303] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.485 [INFO][5316] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5316] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab4a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-989a023a05", "pod":"calico-apiserver-5c8645d4dc-8s4zv", "timestamp":"2025-09-03 23:27:18.485373188 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.486 [INFO][5316] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.513 [INFO][5316] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.513 [INFO][5316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.592 [INFO][5316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.595 [INFO][5316] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.598 [INFO][5316] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.600 [INFO][5316] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.603 [INFO][5316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.603 [INFO][5316] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.604 [INFO][5316] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.613 [INFO][5316] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.624 [INFO][5316] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.7/26] block=192.168.70.0/26 handle="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.624 [INFO][5316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.7/26] handle="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.624 [INFO][5316] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:18.664942 containerd[1869]: 2025-09-03 23:27:18.624 [INFO][5316] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.7/26] IPv6=[] ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.629 [INFO][5303] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0", GenerateName:"calico-apiserver-5c8645d4dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"03fb5889-accd-4288-86d3-895ee8b136f5", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5c8645d4dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"calico-apiserver-5c8645d4dc-8s4zv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie95ddc4c419", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.629 [INFO][5303] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.7/32] ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.629 [INFO][5303] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie95ddc4c419 ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.641 [INFO][5303] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.641 [INFO][5303] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0", GenerateName:"calico-apiserver-5c8645d4dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"03fb5889-accd-4288-86d3-895ee8b136f5", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c8645d4dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c", Pod:"calico-apiserver-5c8645d4dc-8s4zv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie95ddc4c419", MAC:"1a:2c:5b:16:d3:2d", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.666378 containerd[1869]: 2025-09-03 23:27:18.661 [INFO][5303] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Namespace="calico-apiserver" Pod="calico-apiserver-5c8645d4dc-8s4zv" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:18.711396 containerd[1869]: time="2025-09-03T23:27:18.710992989Z" level=info msg="connecting to shim 28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" address="unix:///run/containerd/s/b51157a4f67919d7b2c3d316bfa4da594a7935729a07291afcb77bc13d7cde98" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:18.712509 containerd[1869]: time="2025-09-03T23:27:18.712477931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4tqph,Uid:7320caad-e77e-46f7-ae51-16ff07f2bafa,Namespace:kube-system,Attempt:0,} returns sandbox id \"f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9\"" Sep 3 23:27:18.721686 containerd[1869]: time="2025-09-03T23:27:18.721620870Z" level=info msg="CreateContainer within sandbox \"f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 3 23:27:18.739747 systemd[1]: Started cri-containerd-28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c.scope - libcontainer container 28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c. 
Sep 3 23:27:18.746168 containerd[1869]: time="2025-09-03T23:27:18.746142238Z" level=info msg="Container 937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:18.765099 containerd[1869]: time="2025-09-03T23:27:18.765070144Z" level=info msg="CreateContainer within sandbox \"f3a43620d3a00be037205b3196c769afce886c99618ae5bd515844ac5e8716d9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5\"" Sep 3 23:27:18.766341 containerd[1869]: time="2025-09-03T23:27:18.765545902Z" level=info msg="StartContainer for \"937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5\"" Sep 3 23:27:18.766341 containerd[1869]: time="2025-09-03T23:27:18.766098304Z" level=info msg="connecting to shim 937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5" address="unix:///run/containerd/s/f9f08ec4f6b0a18cb267957304ba14d23ae724d70f973dc042e82b36eb4b5785" protocol=ttrpc version=3 Sep 3 23:27:18.774525 containerd[1869]: time="2025-09-03T23:27:18.774454530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c8645d4dc-8s4zv,Uid:03fb5889-accd-4288-86d3-895ee8b136f5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\"" Sep 3 23:27:18.790645 systemd[1]: Started cri-containerd-937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5.scope - libcontainer container 937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5. 
Sep 3 23:27:18.814517 containerd[1869]: time="2025-09-03T23:27:18.814477913Z" level=info msg="StartContainer for \"937a9a47d64583e26be77cd3fcd67b95c95e4dd7b3992f137570bef41100b6a5\" returns successfully" Sep 3 23:27:19.418839 containerd[1869]: time="2025-09-03T23:27:19.418795319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vxp58,Uid:de505837-0b0f-4898-99c8-73a171069dca,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:19.419311 containerd[1869]: time="2025-09-03T23:27:19.418798272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6x5gb,Uid:3257a910-0dfb-493a-b923-e6215d245226,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:19.554450 systemd-networkd[1697]: calif445cf78916: Link UP Sep 3 23:27:19.556454 systemd-networkd[1697]: calif445cf78916: Gained carrier Sep 3 23:27:19.572319 kubelet[3426]: I0903 23:27:19.572298 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.465 [INFO][5484] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0 csi-node-driver- calico-system 3257a910-0dfb-493a-b923-e6215d245226 713 0 2025-09-03 23:26:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 csi-node-driver-6x5gb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif445cf78916 [] [] }} ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-" Sep 3 
23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.466 [INFO][5484] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.491 [INFO][5501] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" HandleID="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Workload="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.491 [INFO][5501] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" HandleID="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Workload="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-989a023a05", "pod":"csi-node-driver-6x5gb", "timestamp":"2025-09-03 23:27:19.491611734 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.491 [INFO][5501] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.491 [INFO][5501] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.491 [INFO][5501] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.500 [INFO][5501] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.508 [INFO][5501] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.515 [INFO][5501] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.519 [INFO][5501] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.521 [INFO][5501] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.521 [INFO][5501] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.525 [INFO][5501] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8 Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.530 [INFO][5501] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.547 [INFO][5501] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.8/26] block=192.168.70.0/26 handle="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.548 [INFO][5501] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.8/26] handle="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.548 [INFO][5501] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:19.581850 containerd[1869]: 2025-09-03 23:27:19.548 [INFO][5501] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.8/26] IPv6=[] ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" HandleID="k8s-pod-network.d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Workload="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.550 [INFO][5484] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3257a910-0dfb-493a-b923-e6215d245226", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"csi-node-driver-6x5gb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif445cf78916", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.550 [INFO][5484] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.8/32] ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.550 [INFO][5484] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif445cf78916 ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.556 [INFO][5484] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.557 
[INFO][5484] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3257a910-0dfb-493a-b923-e6215d245226", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8", Pod:"csi-node-driver-6x5gb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif445cf78916", MAC:"e6:30:cb:12:fd:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:19.582221 containerd[1869]: 2025-09-03 23:27:19.579 [INFO][5484] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" Namespace="calico-system" Pod="csi-node-driver-6x5gb" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-csi--node--driver--6x5gb-eth0" Sep 3 23:27:19.593510 kubelet[3426]: I0903 23:27:19.593419 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-4tqph" podStartSLOduration=38.593235768 podStartE2EDuration="38.593235768s" podCreationTimestamp="2025-09-03 23:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:27:19.592660575 +0000 UTC m=+43.351175222" watchObservedRunningTime="2025-09-03 23:27:19.593235768 +0000 UTC m=+43.351750407" Sep 3 23:27:19.640938 containerd[1869]: time="2025-09-03T23:27:19.640892656Z" level=info msg="connecting to shim d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8" address="unix:///run/containerd/s/979d7319117204d60f6aff4e3227449ffd42ff1cfa05d7b30dec5d58cb1873b9" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:19.675695 systemd[1]: Started cri-containerd-d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8.scope - libcontainer container d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8. 
Sep 3 23:27:19.684429 systemd-networkd[1697]: cali59200d94103: Link UP Sep 3 23:27:19.685602 systemd-networkd[1697]: cali59200d94103: Gained carrier Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.471 [INFO][5475] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0 calico-apiserver-776b7f98d6- calico-apiserver de505837-0b0f-4898-99c8-73a171069dca 833 0 2025-09-03 23:26:52 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776b7f98d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 calico-apiserver-776b7f98d6-vxp58 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali59200d94103 [] [] }} ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.471 [INFO][5475] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.498 [INFO][5506] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" HandleID="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.704263 
containerd[1869]: 2025-09-03 23:27:19.499 [INFO][5506] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" HandleID="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b040), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-989a023a05", "pod":"calico-apiserver-776b7f98d6-vxp58", "timestamp":"2025-09-03 23:27:19.498486595 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.500 [INFO][5506] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.548 [INFO][5506] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.548 [INFO][5506] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.604 [INFO][5506] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.612 [INFO][5506] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.633 [INFO][5506] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.635 [INFO][5506] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.642 [INFO][5506] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.643 [INFO][5506] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.646 [INFO][5506] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.668 [INFO][5506] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.678 [INFO][5506] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.9/26] block=192.168.70.0/26 handle="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.678 [INFO][5506] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.9/26] handle="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.678 [INFO][5506] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:19.704263 containerd[1869]: 2025-09-03 23:27:19.678 [INFO][5506] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.9/26] IPv6=[] ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" HandleID="k8s-pod-network.718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.680 [INFO][5475] cni-plugin/k8s.go 418: Populated endpoint ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0", GenerateName:"calico-apiserver-776b7f98d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"de505837-0b0f-4898-99c8-73a171069dca", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"776b7f98d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"calico-apiserver-776b7f98d6-vxp58", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali59200d94103", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.680 [INFO][5475] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.9/32] ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.680 [INFO][5475] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59200d94103 ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.685 [INFO][5475] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.687 [INFO][5475] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0", GenerateName:"calico-apiserver-776b7f98d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"de505837-0b0f-4898-99c8-73a171069dca", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776b7f98d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd", Pod:"calico-apiserver-776b7f98d6-vxp58", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali59200d94103", MAC:"ae:b2:63:92:89:c9", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:19.705015 containerd[1869]: 2025-09-03 23:27:19.701 [INFO][5475] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vxp58" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vxp58-eth0" Sep 3 23:27:19.710572 containerd[1869]: time="2025-09-03T23:27:19.710540791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6x5gb,Uid:3257a910-0dfb-493a-b923-e6215d245226,Namespace:calico-system,Attempt:0,} returns sandbox id \"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8\"" Sep 3 23:27:19.714632 systemd-networkd[1697]: cali488ec1b9ec6: Gained IPv6LL Sep 3 23:27:19.752908 containerd[1869]: time="2025-09-03T23:27:19.752869211Z" level=info msg="connecting to shim 718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd" address="unix:///run/containerd/s/fea3e50ac1118fc30477b91c40a086471be70e169b70fcbd6d79f468100ff134" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:19.776637 systemd[1]: Started cri-containerd-718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd.scope - libcontainer container 718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd. 
Sep 3 23:27:19.810439 containerd[1869]: time="2025-09-03T23:27:19.810382875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vxp58,Uid:de505837-0b0f-4898-99c8-73a171069dca,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd\"" Sep 3 23:27:19.842635 systemd-networkd[1697]: calie95ddc4c419: Gained IPv6LL Sep 3 23:27:20.866643 systemd-networkd[1697]: calif445cf78916: Gained IPv6LL Sep 3 23:27:21.378697 systemd-networkd[1697]: cali59200d94103: Gained IPv6LL Sep 3 23:27:21.544880 containerd[1869]: time="2025-09-03T23:27:21.544722025Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:21.547095 containerd[1869]: time="2025-09-03T23:27:21.547068588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 3 23:27:21.551355 containerd[1869]: time="2025-09-03T23:27:21.551319348Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:21.555073 containerd[1869]: time="2025-09-03T23:27:21.555036036Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:21.555503 containerd[1869]: time="2025-09-03T23:27:21.555471131Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.311583843s" Sep 3 
23:27:21.555503 containerd[1869]: time="2025-09-03T23:27:21.555499603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 3 23:27:21.556751 containerd[1869]: time="2025-09-03T23:27:21.556578987Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 3 23:27:21.562418 containerd[1869]: time="2025-09-03T23:27:21.562393699Z" level=info msg="CreateContainer within sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 3 23:27:21.591096 containerd[1869]: time="2025-09-03T23:27:21.589740446Z" level=info msg="Container edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:21.592788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount129063090.mount: Deactivated successfully. 
Sep 3 23:27:21.613122 containerd[1869]: time="2025-09-03T23:27:21.613050300Z" level=info msg="CreateContainer within sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\"" Sep 3 23:27:21.614213 containerd[1869]: time="2025-09-03T23:27:21.613338073Z" level=info msg="StartContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\"" Sep 3 23:27:21.614465 containerd[1869]: time="2025-09-03T23:27:21.614439921Z" level=info msg="connecting to shim edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334" address="unix:///run/containerd/s/07a4528e5b3a84600c009f12e04329c46c49a68501ae45f4bd26e90eb40c5b6f" protocol=ttrpc version=3 Sep 3 23:27:21.643639 systemd[1]: Started cri-containerd-edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334.scope - libcontainer container edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334. 
Sep 3 23:27:21.673949 containerd[1869]: time="2025-09-03T23:27:21.673926959Z" level=info msg="StartContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" returns successfully" Sep 3 23:27:22.610552 kubelet[3426]: I0903 23:27:22.610351 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c8645d4dc-4m6vn" podStartSLOduration=26.993145673 podStartE2EDuration="31.610336563s" podCreationTimestamp="2025-09-03 23:26:51 +0000 UTC" firstStartedPulling="2025-09-03 23:27:16.938846841 +0000 UTC m=+40.697361480" lastFinishedPulling="2025-09-03 23:27:21.556037731 +0000 UTC m=+45.314552370" observedRunningTime="2025-09-03 23:27:22.60999547 +0000 UTC m=+46.368510133" watchObservedRunningTime="2025-09-03 23:27:22.610336563 +0000 UTC m=+46.368851202" Sep 3 23:27:24.730052 containerd[1869]: time="2025-09-03T23:27:24.729980405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:24.737810 containerd[1869]: time="2025-09-03T23:27:24.737776764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 3 23:27:24.753776 containerd[1869]: time="2025-09-03T23:27:24.753745862Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:24.763967 containerd[1869]: time="2025-09-03T23:27:24.763923260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:24.764545 containerd[1869]: time="2025-09-03T23:27:24.764258206Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id 
\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.207654643s" Sep 3 23:27:24.764545 containerd[1869]: time="2025-09-03T23:27:24.764283910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 3 23:27:24.764995 containerd[1869]: time="2025-09-03T23:27:24.764977443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 3 23:27:24.777966 containerd[1869]: time="2025-09-03T23:27:24.777938924Z" level=info msg="CreateContainer within sandbox \"d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 3 23:27:24.795740 containerd[1869]: time="2025-09-03T23:27:24.795711147Z" level=info msg="Container f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:24.815225 containerd[1869]: time="2025-09-03T23:27:24.815191365Z" level=info msg="CreateContainer within sandbox \"d3d60511b429fb85200db364ceeafc89dde9c7959b94e548daf9bbbfca672847\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\"" Sep 3 23:27:24.816061 containerd[1869]: time="2025-09-03T23:27:24.815609025Z" level=info msg="StartContainer for \"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\"" Sep 3 23:27:24.816639 containerd[1869]: time="2025-09-03T23:27:24.816619543Z" level=info msg="connecting to shim f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070" 
address="unix:///run/containerd/s/c75da0673a56bc4c45c98ba247b66094fa800a55f2bcda52a9d1ae9a1a0f72d4" protocol=ttrpc version=3 Sep 3 23:27:24.842705 systemd[1]: Started cri-containerd-f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070.scope - libcontainer container f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070. Sep 3 23:27:24.893203 containerd[1869]: time="2025-09-03T23:27:24.893155909Z" level=info msg="StartContainer for \"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" returns successfully" Sep 3 23:27:25.106570 containerd[1869]: time="2025-09-03T23:27:25.106195013Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:25.110630 containerd[1869]: time="2025-09-03T23:27:25.110592583Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 3 23:27:25.112166 containerd[1869]: time="2025-09-03T23:27:25.112061763Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 346.992564ms" Sep 3 23:27:25.112166 containerd[1869]: time="2025-09-03T23:27:25.112089548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 3 23:27:25.113223 containerd[1869]: time="2025-09-03T23:27:25.113007207Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 3 23:27:25.118426 containerd[1869]: time="2025-09-03T23:27:25.118402327Z" level=info msg="CreateContainer within sandbox 
\"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 3 23:27:25.138943 containerd[1869]: time="2025-09-03T23:27:25.138915823Z" level=info msg="Container 00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:25.170934 containerd[1869]: time="2025-09-03T23:27:25.170900116Z" level=info msg="CreateContainer within sandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\"" Sep 3 23:27:25.171673 containerd[1869]: time="2025-09-03T23:27:25.171652162Z" level=info msg="StartContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\"" Sep 3 23:27:25.173031 containerd[1869]: time="2025-09-03T23:27:25.173005739Z" level=info msg="connecting to shim 00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270" address="unix:///run/containerd/s/b51157a4f67919d7b2c3d316bfa4da594a7935729a07291afcb77bc13d7cde98" protocol=ttrpc version=3 Sep 3 23:27:25.189667 systemd[1]: Started cri-containerd-00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270.scope - libcontainer container 00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270. 
Sep 3 23:27:25.222412 containerd[1869]: time="2025-09-03T23:27:25.222381547Z" level=info msg="StartContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" returns successfully" Sep 3 23:27:25.616157 kubelet[3426]: I0903 23:27:25.615820 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58676dd4cc-b8ngc" podStartSLOduration=23.953649428 podStartE2EDuration="31.615804697s" podCreationTimestamp="2025-09-03 23:26:54 +0000 UTC" firstStartedPulling="2025-09-03 23:27:17.10274302 +0000 UTC m=+40.861257667" lastFinishedPulling="2025-09-03 23:27:24.764898297 +0000 UTC m=+48.523412936" observedRunningTime="2025-09-03 23:27:25.615708022 +0000 UTC m=+49.374222661" watchObservedRunningTime="2025-09-03 23:27:25.615804697 +0000 UTC m=+49.374319336" Sep 3 23:27:25.679606 containerd[1869]: time="2025-09-03T23:27:25.679487666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"483c20977b54628988d7742376cf8ec970edd22f2f5990df6c7b5e7e752c8a9b\" pid:5775 exited_at:{seconds:1756942045 nanos:672434201}" Sep 3 23:27:25.697181 kubelet[3426]: I0903 23:27:25.696888 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c8645d4dc-8s4zv" podStartSLOduration=28.360313871 podStartE2EDuration="34.696874174s" podCreationTimestamp="2025-09-03 23:26:51 +0000 UTC" firstStartedPulling="2025-09-03 23:27:18.776319564 +0000 UTC m=+42.534834203" lastFinishedPulling="2025-09-03 23:27:25.112879867 +0000 UTC m=+48.871394506" observedRunningTime="2025-09-03 23:27:25.633061113 +0000 UTC m=+49.391575752" watchObservedRunningTime="2025-09-03 23:27:25.696874174 +0000 UTC m=+49.455388813" Sep 3 23:27:26.604564 kubelet[3426]: I0903 23:27:26.604511 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:27.031565 containerd[1869]: 
time="2025-09-03T23:27:27.031433256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:27.033978 containerd[1869]: time="2025-09-03T23:27:27.033944707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 3 23:27:27.036679 containerd[1869]: time="2025-09-03T23:27:27.036641723Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:27.040729 containerd[1869]: time="2025-09-03T23:27:27.040687619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:27.041191 containerd[1869]: time="2025-09-03T23:27:27.041059630Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.928026486s" Sep 3 23:27:27.041191 containerd[1869]: time="2025-09-03T23:27:27.041079574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 3 23:27:27.048125 containerd[1869]: time="2025-09-03T23:27:27.048087006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 3 23:27:27.049572 containerd[1869]: time="2025-09-03T23:27:27.048813820Z" level=info msg="CreateContainer within sandbox \"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 3 
23:27:27.083751 containerd[1869]: time="2025-09-03T23:27:27.083721599Z" level=info msg="Container ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:27.086266 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1124519678.mount: Deactivated successfully. Sep 3 23:27:27.101626 containerd[1869]: time="2025-09-03T23:27:27.101583657Z" level=info msg="CreateContainer within sandbox \"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9\"" Sep 3 23:27:27.102183 containerd[1869]: time="2025-09-03T23:27:27.102164898Z" level=info msg="StartContainer for \"ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9\"" Sep 3 23:27:27.104654 containerd[1869]: time="2025-09-03T23:27:27.104633716Z" level=info msg="connecting to shim ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9" address="unix:///run/containerd/s/979d7319117204d60f6aff4e3227449ffd42ff1cfa05d7b30dec5d58cb1873b9" protocol=ttrpc version=3 Sep 3 23:27:27.127642 systemd[1]: Started cri-containerd-ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9.scope - libcontainer container ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9. 
Sep 3 23:27:27.156260 containerd[1869]: time="2025-09-03T23:27:27.156227334Z" level=info msg="StartContainer for \"ce666bf73457081e8466e2304dff10ee3ef4e6d135c8a91793b3b513174026b9\" returns successfully" Sep 3 23:27:27.397930 containerd[1869]: time="2025-09-03T23:27:27.397342662Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:27.399782 containerd[1869]: time="2025-09-03T23:27:27.399748429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 3 23:27:27.403046 containerd[1869]: time="2025-09-03T23:27:27.403020654Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 354.893439ms" Sep 3 23:27:27.403178 containerd[1869]: time="2025-09-03T23:27:27.403160235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 3 23:27:27.404355 containerd[1869]: time="2025-09-03T23:27:27.404310845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 3 23:27:27.410966 containerd[1869]: time="2025-09-03T23:27:27.410913745Z" level=info msg="CreateContainer within sandbox \"718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 3 23:27:27.427993 containerd[1869]: time="2025-09-03T23:27:27.426565649Z" level=info msg="Container 9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:27.459770 containerd[1869]: 
time="2025-09-03T23:27:27.459714576Z" level=info msg="CreateContainer within sandbox \"718ca5f9284adff2c78936635dadb131a54505e9a7e93a7796a830b3de0f0bbd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be\"" Sep 3 23:27:27.460606 containerd[1869]: time="2025-09-03T23:27:27.460469519Z" level=info msg="StartContainer for \"9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be\"" Sep 3 23:27:27.461787 containerd[1869]: time="2025-09-03T23:27:27.461768229Z" level=info msg="connecting to shim 9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be" address="unix:///run/containerd/s/fea3e50ac1118fc30477b91c40a086471be70e169b70fcbd6d79f468100ff134" protocol=ttrpc version=3 Sep 3 23:27:27.477658 systemd[1]: Started cri-containerd-9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be.scope - libcontainer container 9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be. 
Sep 3 23:27:27.519050 containerd[1869]: time="2025-09-03T23:27:27.518980206Z" level=info msg="StartContainer for \"9af3301f19af492c830267845ed3853f6fede52f4880951f4d2c07a6437af7be\" returns successfully" Sep 3 23:27:27.622824 kubelet[3426]: I0903 23:27:27.622776 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776b7f98d6-vxp58" podStartSLOduration=28.030121578 podStartE2EDuration="35.622762381s" podCreationTimestamp="2025-09-03 23:26:52 +0000 UTC" firstStartedPulling="2025-09-03 23:27:19.811424707 +0000 UTC m=+43.569939354" lastFinishedPulling="2025-09-03 23:27:27.404065518 +0000 UTC m=+51.162580157" observedRunningTime="2025-09-03 23:27:27.622044999 +0000 UTC m=+51.380559638" watchObservedRunningTime="2025-09-03 23:27:27.622762381 +0000 UTC m=+51.381277028" Sep 3 23:27:28.615490 kubelet[3426]: I0903 23:27:28.615168 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:29.299280 containerd[1869]: time="2025-09-03T23:27:29.298830677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:29.301433 containerd[1869]: time="2025-09-03T23:27:29.301407844Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 3 23:27:29.304140 containerd[1869]: time="2025-09-03T23:27:29.304118797Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:29.308588 containerd[1869]: time="2025-09-03T23:27:29.308560992Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:29.309027 
containerd[1869]: time="2025-09-03T23:27:29.308842196Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.904505655s" Sep 3 23:27:29.309027 containerd[1869]: time="2025-09-03T23:27:29.308871204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 3 23:27:29.317099 containerd[1869]: time="2025-09-03T23:27:29.317069328Z" level=info msg="CreateContainer within sandbox \"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 3 23:27:29.342558 containerd[1869]: time="2025-09-03T23:27:29.342272475Z" level=info msg="Container 4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:29.360408 containerd[1869]: time="2025-09-03T23:27:29.360377651Z" level=info msg="CreateContainer within sandbox \"d8229c73ac9d9712a694bf396d01dd16e24f1559db46f4e0ad9bc2f2b8484ed8\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1\"" Sep 3 23:27:29.361165 containerd[1869]: time="2025-09-03T23:27:29.361137734Z" level=info msg="StartContainer for \"4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1\"" Sep 3 23:27:29.363098 containerd[1869]: time="2025-09-03T23:27:29.363066619Z" level=info msg="connecting to shim 4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1" 
address="unix:///run/containerd/s/979d7319117204d60f6aff4e3227449ffd42ff1cfa05d7b30dec5d58cb1873b9" protocol=ttrpc version=3 Sep 3 23:27:29.382655 systemd[1]: Started cri-containerd-4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1.scope - libcontainer container 4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1. Sep 3 23:27:29.414302 containerd[1869]: time="2025-09-03T23:27:29.414151531Z" level=info msg="StartContainer for \"4c00cb4a82d236d7487209129a3b726de5f128ecc00993a9b037d67623213de1\" returns successfully" Sep 3 23:27:29.487975 kubelet[3426]: I0903 23:27:29.487900 3426 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 3 23:27:29.492490 kubelet[3426]: I0903 23:27:29.492472 3426 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 3 23:27:29.557918 kubelet[3426]: I0903 23:27:29.557816 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:29.605390 containerd[1869]: time="2025-09-03T23:27:29.605358310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"d6cf502c70ccecf03bda322fdd643bc4e013622be4176232c6c622ae907e7ada\" pid:5912 exit_status:1 exited_at:{seconds:1756942049 nanos:604413648}" Sep 3 23:27:29.639955 kubelet[3426]: I0903 23:27:29.639594 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-6x5gb" podStartSLOduration=26.042461293 podStartE2EDuration="35.639580673s" podCreationTimestamp="2025-09-03 23:26:54 +0000 UTC" firstStartedPulling="2025-09-03 23:27:19.712512676 +0000 UTC m=+43.471027315" lastFinishedPulling="2025-09-03 23:27:29.309632056 +0000 UTC m=+53.068146695" observedRunningTime="2025-09-03 23:27:29.63857989 +0000 UTC 
m=+53.397094537" watchObservedRunningTime="2025-09-03 23:27:29.639580673 +0000 UTC m=+53.398095312" Sep 3 23:27:29.662004 containerd[1869]: time="2025-09-03T23:27:29.661973633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"0657c95b32188f4164a6dd709d4a57605e1bb6ecedbfc731a43e5d0def6ef57a\" pid:5936 exit_status:1 exited_at:{seconds:1756942049 nanos:661488786}" Sep 3 23:27:35.593245 kubelet[3426]: I0903 23:27:35.593026 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:41.773560 containerd[1869]: time="2025-09-03T23:27:41.773515310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"9282a3c55061abd099d591c235f656b4689c2f0182b86717b75165f07350022e\" pid:5975 exited_at:{seconds:1756942061 nanos:773242223}" Sep 3 23:27:45.936481 containerd[1869]: time="2025-09-03T23:27:45.936440285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"66d2414e47cf2840e4f79c67bcc02f16366f29feb4960d7a93f35bfe03e7d707\" pid:6002 exited_at:{seconds:1756942065 nanos:936256809}" Sep 3 23:27:50.098500 kubelet[3426]: I0903 23:27:50.098452 3426 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:50.151782 containerd[1869]: time="2025-09-03T23:27:50.151739308Z" level=info msg="StopContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" with timeout 30 (s)" Sep 3 23:27:50.152516 containerd[1869]: time="2025-09-03T23:27:50.152129830Z" level=info msg="Stop container \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" with signal terminated" Sep 3 23:27:50.173395 systemd[1]: cri-containerd-00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270.scope: Deactivated successfully. 
Sep 3 23:27:50.175836 containerd[1869]: time="2025-09-03T23:27:50.175797820Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" id:\"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" pid:5742 exit_status:1 exited_at:{seconds:1756942070 nanos:175444707}" Sep 3 23:27:50.176066 containerd[1869]: time="2025-09-03T23:27:50.176008825Z" level=info msg="received exit event container_id:\"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" id:\"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" pid:5742 exit_status:1 exited_at:{seconds:1756942070 nanos:175444707}" Sep 3 23:27:50.204368 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270-rootfs.mount: Deactivated successfully. Sep 3 23:27:50.278318 systemd[1]: Created slice kubepods-besteffort-podf1afd8a8_9be6_444e_9e50_609dbafc67d7.slice - libcontainer container kubepods-besteffort-podf1afd8a8_9be6_444e_9e50_609dbafc67d7.slice. 
Sep 3 23:27:50.328005 kubelet[3426]: I0903 23:27:50.327973 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f1afd8a8-9be6-444e-9e50-609dbafc67d7-calico-apiserver-certs\") pod \"calico-apiserver-776b7f98d6-vd6sw\" (UID: \"f1afd8a8-9be6-444e-9e50-609dbafc67d7\") " pod="calico-apiserver/calico-apiserver-776b7f98d6-vd6sw" Sep 3 23:27:50.353219 kubelet[3426]: I0903 23:27:50.328018 3426 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmglp\" (UniqueName: \"kubernetes.io/projected/f1afd8a8-9be6-444e-9e50-609dbafc67d7-kube-api-access-qmglp\") pod \"calico-apiserver-776b7f98d6-vd6sw\" (UID: \"f1afd8a8-9be6-444e-9e50-609dbafc67d7\") " pod="calico-apiserver/calico-apiserver-776b7f98d6-vd6sw" Sep 3 23:27:50.609104 containerd[1869]: time="2025-09-03T23:27:50.608479417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vd6sw,Uid:f1afd8a8-9be6-444e-9e50-609dbafc67d7,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:51.126554 containerd[1869]: time="2025-09-03T23:27:51.126113243Z" level=info msg="StopContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" returns successfully" Sep 3 23:27:51.128463 containerd[1869]: time="2025-09-03T23:27:51.128358998Z" level=info msg="StopPodSandbox for \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\"" Sep 3 23:27:51.128762 containerd[1869]: time="2025-09-03T23:27:51.128665871Z" level=info msg="Container to stop \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 3 23:27:51.143368 systemd[1]: cri-containerd-28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c.scope: Deactivated successfully. 
Sep 3 23:27:51.147366 containerd[1869]: time="2025-09-03T23:27:51.147001768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" id:\"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" pid:5431 exit_status:137 exited_at:{seconds:1756942071 nanos:145761575}" Sep 3 23:27:51.188573 containerd[1869]: time="2025-09-03T23:27:51.187556187Z" level=info msg="received exit event sandbox_id:\"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" exit_status:137 exited_at:{seconds:1756942071 nanos:145761575}" Sep 3 23:27:51.189431 containerd[1869]: time="2025-09-03T23:27:51.189156221Z" level=info msg="shim disconnected" id=28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c namespace=k8s.io Sep 3 23:27:51.189690 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c-rootfs.mount: Deactivated successfully. Sep 3 23:27:51.195695 containerd[1869]: time="2025-09-03T23:27:51.189429621Z" level=warning msg="cleaning up after shim disconnected" id=28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c namespace=k8s.io Sep 3 23:27:51.195695 containerd[1869]: time="2025-09-03T23:27:51.195616330Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 3 23:27:51.203668 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c-shm.mount: Deactivated successfully. 
Sep 3 23:27:51.226974 systemd-networkd[1697]: cali8ab3a8561b8: Link UP Sep 3 23:27:51.227081 systemd-networkd[1697]: cali8ab3a8561b8: Gained carrier Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.124 [INFO][6037] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0 calico-apiserver-776b7f98d6- calico-apiserver f1afd8a8-9be6-444e-9e50-609dbafc67d7 1146 0 2025-09-03 23:27:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776b7f98d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-989a023a05 calico-apiserver-776b7f98d6-vd6sw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8ab3a8561b8 [] [] }} ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.124 [INFO][6037] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.161 [INFO][6051] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" HandleID="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.247357 
containerd[1869]: 2025-09-03 23:27:51.162 [INFO][6051] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" HandleID="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3820), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-989a023a05", "pod":"calico-apiserver-776b7f98d6-vd6sw", "timestamp":"2025-09-03 23:27:51.161911558 +0000 UTC"}, Hostname:"ci-4372.1.0-n-989a023a05", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.162 [INFO][6051] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.162 [INFO][6051] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.162 [INFO][6051] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-989a023a05' Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.171 [INFO][6051] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.174 [INFO][6051] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.181 [INFO][6051] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.183 [INFO][6051] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.189 [INFO][6051] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.191 [INFO][6051] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.193 [INFO][6051] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0 Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.205 [INFO][6051] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.216 [INFO][6051] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.10/26] block=192.168.70.0/26 handle="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.216 [INFO][6051] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.10/26] handle="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" host="ci-4372.1.0-n-989a023a05" Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.217 [INFO][6051] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:51.247357 containerd[1869]: 2025-09-03 23:27:51.217 [INFO][6051] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.10/26] IPv6=[] ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" HandleID="k8s-pod-network.c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.220 [INFO][6037] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0", GenerateName:"calico-apiserver-776b7f98d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1afd8a8-9be6-444e-9e50-609dbafc67d7", ResourceVersion:"1146", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"776b7f98d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"", Pod:"calico-apiserver-776b7f98d6-vd6sw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8ab3a8561b8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.224 [INFO][6037] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.10/32] ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.225 [INFO][6037] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8ab3a8561b8 ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.227 [INFO][6037] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" 
WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.228 [INFO][6037] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0", GenerateName:"calico-apiserver-776b7f98d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1afd8a8-9be6-444e-9e50-609dbafc67d7", ResourceVersion:"1146", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776b7f98d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-989a023a05", ContainerID:"c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0", Pod:"calico-apiserver-776b7f98d6-vd6sw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8ab3a8561b8", MAC:"aa:3a:2d:53:7a:8c", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:51.249449 containerd[1869]: 2025-09-03 23:27:51.242 [INFO][6037] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" Namespace="calico-apiserver" Pod="calico-apiserver-776b7f98d6-vd6sw" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--776b7f98d6--vd6sw-eth0" Sep 3 23:27:51.269106 systemd-networkd[1697]: calie95ddc4c419: Link DOWN Sep 3 23:27:51.269111 systemd-networkd[1697]: calie95ddc4c419: Lost carrier Sep 3 23:27:51.302482 containerd[1869]: time="2025-09-03T23:27:51.301396305Z" level=info msg="connecting to shim c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0" address="unix:///run/containerd/s/a77b5c4d95eaae7b10da949a3873b230ae306991b816344e3cecb4fcf05648df" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:51.325732 systemd[1]: Started cri-containerd-c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0.scope - libcontainer container c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0. 
Sep 3 23:27:51.378698 containerd[1869]: time="2025-09-03T23:27:51.377854322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776b7f98d6-vd6sw,Uid:f1afd8a8-9be6-444e-9e50-609dbafc67d7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0\"" Sep 3 23:27:51.387580 containerd[1869]: time="2025-09-03T23:27:51.387553253Z" level=info msg="CreateContainer within sandbox \"c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.265 [INFO][6096] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.265 [INFO][6096] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" iface="eth0" netns="/var/run/netns/cni-be7ac63c-9a48-7ab3-4e1d-510177c0286e" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.265 [INFO][6096] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" iface="eth0" netns="/var/run/netns/cni-be7ac63c-9a48-7ab3-4e1d-510177c0286e" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.278 [INFO][6096] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" after=13.208337ms iface="eth0" netns="/var/run/netns/cni-be7ac63c-9a48-7ab3-4e1d-510177c0286e" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.278 [INFO][6096] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.278 [INFO][6096] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.300 [INFO][6125] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.302 [INFO][6125] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.302 [INFO][6125] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.391 [INFO][6125] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.391 [INFO][6125] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.393 [INFO][6125] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:51.398361 containerd[1869]: 2025-09-03 23:27:51.396 [INFO][6096] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:27:51.402469 systemd[1]: run-netns-cni\x2dbe7ac63c\x2d9a48\x2d7ab3\x2d4e1d\x2d510177c0286e.mount: Deactivated successfully. 
Sep 3 23:27:51.402675 containerd[1869]: time="2025-09-03T23:27:51.402652128Z" level=info msg="TearDown network for sandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" successfully" Sep 3 23:27:51.402750 containerd[1869]: time="2025-09-03T23:27:51.402735843Z" level=info msg="StopPodSandbox for \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" returns successfully" Sep 3 23:27:51.412134 containerd[1869]: time="2025-09-03T23:27:51.412107381Z" level=info msg="Container 027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:51.428873 containerd[1869]: time="2025-09-03T23:27:51.428831427Z" level=info msg="CreateContainer within sandbox \"c56542b63f9378ab4db9c72f3207336a8cfadc2893525460534520b2685180b0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757\"" Sep 3 23:27:51.429281 containerd[1869]: time="2025-09-03T23:27:51.429247190Z" level=info msg="StartContainer for \"027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757\"" Sep 3 23:27:51.430004 containerd[1869]: time="2025-09-03T23:27:51.429964881Z" level=info msg="connecting to shim 027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757" address="unix:///run/containerd/s/a77b5c4d95eaae7b10da949a3873b230ae306991b816344e3cecb4fcf05648df" protocol=ttrpc version=3 Sep 3 23:27:51.452974 systemd[1]: Started cri-containerd-027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757.scope - libcontainer container 027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757. 
Sep 3 23:27:51.521246 containerd[1869]: time="2025-09-03T23:27:51.521146067Z" level=info msg="StartContainer for \"027ccfddb993905a77447bd4e051eb6b93ddc3b65e1560a391dc3b1f2b4d2757\" returns successfully" Sep 3 23:27:51.534466 kubelet[3426]: I0903 23:27:51.534429 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03fb5889-accd-4288-86d3-895ee8b136f5-calico-apiserver-certs\") pod \"03fb5889-accd-4288-86d3-895ee8b136f5\" (UID: \"03fb5889-accd-4288-86d3-895ee8b136f5\") " Sep 3 23:27:51.534466 kubelet[3426]: I0903 23:27:51.534464 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4xs\" (UniqueName: \"kubernetes.io/projected/03fb5889-accd-4288-86d3-895ee8b136f5-kube-api-access-qg4xs\") pod \"03fb5889-accd-4288-86d3-895ee8b136f5\" (UID: \"03fb5889-accd-4288-86d3-895ee8b136f5\") " Sep 3 23:27:51.539718 kubelet[3426]: I0903 23:27:51.539681 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fb5889-accd-4288-86d3-895ee8b136f5-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "03fb5889-accd-4288-86d3-895ee8b136f5" (UID: "03fb5889-accd-4288-86d3-895ee8b136f5"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 3 23:27:51.539920 kubelet[3426]: I0903 23:27:51.539735 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fb5889-accd-4288-86d3-895ee8b136f5-kube-api-access-qg4xs" (OuterVolumeSpecName: "kube-api-access-qg4xs") pod "03fb5889-accd-4288-86d3-895ee8b136f5" (UID: "03fb5889-accd-4288-86d3-895ee8b136f5"). InnerVolumeSpecName "kube-api-access-qg4xs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 3 23:27:51.635159 kubelet[3426]: I0903 23:27:51.635052 3426 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03fb5889-accd-4288-86d3-895ee8b136f5-calico-apiserver-certs\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\"" Sep 3 23:27:51.635159 kubelet[3426]: I0903 23:27:51.635081 3426 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qg4xs\" (UniqueName: \"kubernetes.io/projected/03fb5889-accd-4288-86d3-895ee8b136f5-kube-api-access-qg4xs\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\"" Sep 3 23:27:51.672919 kubelet[3426]: I0903 23:27:51.672896 3426 scope.go:117] "RemoveContainer" containerID="00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270" Sep 3 23:27:51.678348 containerd[1869]: time="2025-09-03T23:27:51.678306686Z" level=info msg="RemoveContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\"" Sep 3 23:27:51.686507 systemd[1]: Removed slice kubepods-besteffort-pod03fb5889_accd_4288_86d3_895ee8b136f5.slice - libcontainer container kubepods-besteffort-pod03fb5889_accd_4288_86d3_895ee8b136f5.slice. 
Sep 3 23:27:51.713364 containerd[1869]: time="2025-09-03T23:27:51.713333950Z" level=info msg="RemoveContainer for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" returns successfully" Sep 3 23:27:51.716538 kubelet[3426]: I0903 23:27:51.716489 3426 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776b7f98d6-vd6sw" podStartSLOduration=1.716475835 podStartE2EDuration="1.716475835s" podCreationTimestamp="2025-09-03 23:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:27:51.696775203 +0000 UTC m=+75.455289850" watchObservedRunningTime="2025-09-03 23:27:51.716475835 +0000 UTC m=+75.474990586" Sep 3 23:27:51.718942 kubelet[3426]: I0903 23:27:51.718924 3426 scope.go:117] "RemoveContainer" containerID="00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270" Sep 3 23:27:51.719268 containerd[1869]: time="2025-09-03T23:27:51.719238600Z" level=error msg="ContainerStatus for \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\": not found" Sep 3 23:27:51.719396 kubelet[3426]: E0903 23:27:51.719378 3426 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\": not found" containerID="00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270" Sep 3 23:27:51.719464 kubelet[3426]: I0903 23:27:51.719425 3426 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270"} err="failed to get container status 
\"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\": rpc error: code = NotFound desc = an error occurred when try to find container \"00c507d419d56fa0f7d9ef4d5c234e875d91fda3aa239e97f647f6e355c0d270\": not found" Sep 3 23:27:52.203413 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2757123378.mount: Deactivated successfully. Sep 3 23:27:52.203508 systemd[1]: var-lib-kubelet-pods-03fb5889\x2daccd\x2d4288\x2d86d3\x2d895ee8b136f5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqg4xs.mount: Deactivated successfully. Sep 3 23:27:52.203566 systemd[1]: var-lib-kubelet-pods-03fb5889\x2daccd\x2d4288\x2d86d3\x2d895ee8b136f5-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 3 23:27:52.416383 kubelet[3426]: I0903 23:27:52.416304 3426 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fb5889-accd-4288-86d3-895ee8b136f5" path="/var/lib/kubelet/pods/03fb5889-accd-4288-86d3-895ee8b136f5/volumes" Sep 3 23:27:52.748272 containerd[1869]: time="2025-09-03T23:27:52.748184053Z" level=info msg="StopContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" with timeout 30 (s)" Sep 3 23:27:52.749049 containerd[1869]: time="2025-09-03T23:27:52.749010999Z" level=info msg="Stop container \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" with signal terminated" Sep 3 23:27:52.811621 systemd[1]: cri-containerd-edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334.scope: Deactivated successfully. 
Sep 3 23:27:52.818705 containerd[1869]: time="2025-09-03T23:27:52.818668895Z" level=info msg="received exit event container_id:\"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" id:\"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" pid:5649 exit_status:1 exited_at:{seconds:1756942072 nanos:818263102}" Sep 3 23:27:52.818981 containerd[1869]: time="2025-09-03T23:27:52.818900908Z" level=info msg="TaskExit event in podsandbox handler container_id:\"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" id:\"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" pid:5649 exit_status:1 exited_at:{seconds:1756942072 nanos:818263102}" Sep 3 23:27:52.839056 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334-rootfs.mount: Deactivated successfully. Sep 3 23:27:52.873547 containerd[1869]: time="2025-09-03T23:27:52.873363519Z" level=info msg="StopContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" returns successfully" Sep 3 23:27:52.874769 containerd[1869]: time="2025-09-03T23:27:52.874712301Z" level=info msg="StopPodSandbox for \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\"" Sep 3 23:27:52.874837 containerd[1869]: time="2025-09-03T23:27:52.874792751Z" level=info msg="Container to stop \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 3 23:27:52.880669 systemd[1]: cri-containerd-fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30.scope: Deactivated successfully. 
Sep 3 23:27:52.887247 containerd[1869]: time="2025-09-03T23:27:52.886835759Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" id:\"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" pid:5122 exit_status:137 exited_at:{seconds:1756942072 nanos:883781004}" Sep 3 23:27:52.914277 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30-rootfs.mount: Deactivated successfully. Sep 3 23:27:52.915297 containerd[1869]: time="2025-09-03T23:27:52.915271463Z" level=info msg="shim disconnected" id=fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30 namespace=k8s.io Sep 3 23:27:52.915794 containerd[1869]: time="2025-09-03T23:27:52.915556661Z" level=warning msg="cleaning up after shim disconnected" id=fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30 namespace=k8s.io Sep 3 23:27:52.917059 containerd[1869]: time="2025-09-03T23:27:52.916834121Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 3 23:27:52.941567 containerd[1869]: time="2025-09-03T23:27:52.941326474Z" level=info msg="received exit event sandbox_id:\"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" exit_status:137 exited_at:{seconds:1756942072 nanos:883781004}" Sep 3 23:27:52.945052 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30-shm.mount: Deactivated successfully. 
Sep 3 23:27:52.991081 systemd-networkd[1697]: calicf99d3e5a37: Link DOWN Sep 3 23:27:52.991353 systemd-networkd[1697]: calicf99d3e5a37: Lost carrier Sep 3 23:27:52.995748 systemd-networkd[1697]: cali8ab3a8561b8: Gained IPv6LL Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.989 [INFO][6299] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.989 [INFO][6299] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" iface="eth0" netns="/var/run/netns/cni-81793c2a-79c9-802c-4181-f082214f125d" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.989 [INFO][6299] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" iface="eth0" netns="/var/run/netns/cni-81793c2a-79c9-802c-4181-f082214f125d" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.997 [INFO][6299] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" after=8.159323ms iface="eth0" netns="/var/run/netns/cni-81793c2a-79c9-802c-4181-f082214f125d" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.997 [INFO][6299] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:52.997 [INFO][6299] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.020 [INFO][6306] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.020 [INFO][6306] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.020 [INFO][6306] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.061 [INFO][6306] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.061 [INFO][6306] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0" Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.062 [INFO][6306] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:53.066323 containerd[1869]: 2025-09-03 23:27:53.063 [INFO][6299] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Sep 3 23:27:53.069120 systemd[1]: run-netns-cni\x2d81793c2a\x2d79c9\x2d802c\x2d4181\x2df082214f125d.mount: Deactivated successfully. 
Sep 3 23:27:53.071008 containerd[1869]: time="2025-09-03T23:27:53.070836987Z" level=info msg="TearDown network for sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" successfully" Sep 3 23:27:53.071008 containerd[1869]: time="2025-09-03T23:27:53.070891413Z" level=info msg="StopPodSandbox for \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" returns successfully" Sep 3 23:27:53.146567 kubelet[3426]: I0903 23:27:53.146258 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxm8d\" (UniqueName: \"kubernetes.io/projected/0ee4a8e2-5e1b-4954-b402-9fae5592724a-kube-api-access-jxm8d\") pod \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\" (UID: \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\") " Sep 3 23:27:53.146567 kubelet[3426]: I0903 23:27:53.146297 3426 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0ee4a8e2-5e1b-4954-b402-9fae5592724a-calico-apiserver-certs\") pod \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\" (UID: \"0ee4a8e2-5e1b-4954-b402-9fae5592724a\") " Sep 3 23:27:53.149213 kubelet[3426]: I0903 23:27:53.149176 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee4a8e2-5e1b-4954-b402-9fae5592724a-kube-api-access-jxm8d" (OuterVolumeSpecName: "kube-api-access-jxm8d") pod "0ee4a8e2-5e1b-4954-b402-9fae5592724a" (UID: "0ee4a8e2-5e1b-4954-b402-9fae5592724a"). InnerVolumeSpecName "kube-api-access-jxm8d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 3 23:27:53.149850 kubelet[3426]: I0903 23:27:53.149822 3426 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee4a8e2-5e1b-4954-b402-9fae5592724a-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "0ee4a8e2-5e1b-4954-b402-9fae5592724a" (UID: "0ee4a8e2-5e1b-4954-b402-9fae5592724a"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 3 23:27:53.204516 systemd[1]: var-lib-kubelet-pods-0ee4a8e2\x2d5e1b\x2d4954\x2db402\x2d9fae5592724a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djxm8d.mount: Deactivated successfully. Sep 3 23:27:53.204625 systemd[1]: var-lib-kubelet-pods-0ee4a8e2\x2d5e1b\x2d4954\x2db402\x2d9fae5592724a-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 3 23:27:53.246584 kubelet[3426]: I0903 23:27:53.246541 3426 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxm8d\" (UniqueName: \"kubernetes.io/projected/0ee4a8e2-5e1b-4954-b402-9fae5592724a-kube-api-access-jxm8d\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\"" Sep 3 23:27:53.246584 kubelet[3426]: I0903 23:27:53.246570 3426 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0ee4a8e2-5e1b-4954-b402-9fae5592724a-calico-apiserver-certs\") on node \"ci-4372.1.0-n-989a023a05\" DevicePath \"\"" Sep 3 23:27:53.683671 kubelet[3426]: I0903 23:27:53.683645 3426 scope.go:117] "RemoveContainer" containerID="edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334" Sep 3 23:27:53.688622 containerd[1869]: time="2025-09-03T23:27:53.688171578Z" level=info msg="RemoveContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\"" Sep 3 23:27:53.691504 systemd[1]: Removed slice kubepods-besteffort-pod0ee4a8e2_5e1b_4954_b402_9fae5592724a.slice - libcontainer container kubepods-besteffort-pod0ee4a8e2_5e1b_4954_b402_9fae5592724a.slice. 
Sep 3 23:27:53.698539 containerd[1869]: time="2025-09-03T23:27:53.698466540Z" level=info msg="RemoveContainer for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" returns successfully" Sep 3 23:27:53.698726 kubelet[3426]: I0903 23:27:53.698699 3426 scope.go:117] "RemoveContainer" containerID="edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334" Sep 3 23:27:53.698969 containerd[1869]: time="2025-09-03T23:27:53.698947591Z" level=error msg="ContainerStatus for \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\": not found" Sep 3 23:27:53.699188 kubelet[3426]: E0903 23:27:53.699163 3426 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\": not found" containerID="edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334" Sep 3 23:27:53.699188 kubelet[3426]: I0903 23:27:53.699184 3426 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334"} err="failed to get container status \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\": rpc error: code = NotFound desc = an error occurred when try to find container \"edf01fee74e3b3b5425871e98154cb30f65819ba76c58204d825f03704539334\": not found" Sep 3 23:27:54.417636 kubelet[3426]: I0903 23:27:54.417326 3426 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee4a8e2-5e1b-4954-b402-9fae5592724a" path="/var/lib/kubelet/pods/0ee4a8e2-5e1b-4954-b402-9fae5592724a/volumes" Sep 3 23:27:55.620046 containerd[1869]: time="2025-09-03T23:27:55.619838947Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"9801ff7b2887ae7ea0329f9d821cebb7131817cfe9e8c002e840d30939992770\" pid:6334 exited_at:{seconds:1756942075 nanos:619656527}" Sep 3 23:27:59.658166 containerd[1869]: time="2025-09-03T23:27:59.658006091Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"7d605fe071930855d7a8c90dc25bac552f95e24c18283e3b1cd27831d8ebfbc8\" pid:6357 exited_at:{seconds:1756942079 nanos:657806326}" Sep 3 23:28:02.359686 containerd[1869]: time="2025-09-03T23:28:02.359648130Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"e62e6ec5528ef015c41b597804a377dba806d0fcb5c3f242671b838953cdb23a\" pid:6382 exited_at:{seconds:1756942082 nanos:359379172}" Sep 3 23:28:11.786624 containerd[1869]: time="2025-09-03T23:28:11.786577364Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"8f0142eaf7d17722fda94666b188506b4d4a37c3316b7fed81a84fe6288b6b0f\" pid:6407 exited_at:{seconds:1756942091 nanos:786302301}" Sep 3 23:28:25.631944 containerd[1869]: time="2025-09-03T23:28:25.631737344Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"d87d3585cf18ed09108e51a0582db55601764ae8982218241712b2ac282aba46\" pid:6438 exited_at:{seconds:1756942105 nanos:631561700}" Sep 3 23:28:29.652105 containerd[1869]: time="2025-09-03T23:28:29.652054049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"c2f86457344838ff31d9e910771d3d9b9dd97ce8c09f1d29ff849cc436cf2d9d\" pid:6457 exited_at:{seconds:1756942109 nanos:651644639}" Sep 3 23:28:36.412720 containerd[1869]: time="2025-09-03T23:28:36.412645754Z" 
level=info msg="StopPodSandbox for \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\"" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.439 [WARNING][6482] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.439 [INFO][6482] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.439 [INFO][6482] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" iface="eth0" netns="" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.439 [INFO][6482] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.440 [INFO][6482] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.453 [INFO][6491] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.454 [INFO][6491] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.454 [INFO][6491] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.458 [WARNING][6491] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.458 [INFO][6491] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.460 [INFO][6491] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:28:36.464701 containerd[1869]: 2025-09-03 23:28:36.461 [INFO][6482] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.465953 containerd[1869]: time="2025-09-03T23:28:36.465113241Z" level=info msg="TearDown network for sandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" successfully" Sep 3 23:28:36.465953 containerd[1869]: time="2025-09-03T23:28:36.465139289Z" level=info msg="StopPodSandbox for \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" returns successfully" Sep 3 23:28:36.467061 containerd[1869]: time="2025-09-03T23:28:36.466351571Z" level=info msg="RemovePodSandbox for \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\"" Sep 3 23:28:36.471800 containerd[1869]: time="2025-09-03T23:28:36.471780054Z" level=info msg="Forcibly stopping sandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\"" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.495 [WARNING][6507] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.495 [INFO][6507] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.495 [INFO][6507] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" iface="eth0" netns="" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.495 [INFO][6507] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.495 [INFO][6507] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.509 [INFO][6515] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.509 [INFO][6515] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.509 [INFO][6515] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.514 [WARNING][6515] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.514 [INFO][6515] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" HandleID="k8s-pod-network.28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--8s4zv-eth0" Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.516 [INFO][6515] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:28:36.520071 containerd[1869]: 2025-09-03 23:28:36.518 [INFO][6507] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c" Sep 3 23:28:36.520071 containerd[1869]: time="2025-09-03T23:28:36.519802302Z" level=info msg="TearDown network for sandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" successfully" Sep 3 23:28:36.521021 containerd[1869]: time="2025-09-03T23:28:36.520998896Z" level=info msg="Ensure that sandbox 28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c in task-service has been cleanup successfully" Sep 3 23:28:36.529939 containerd[1869]: time="2025-09-03T23:28:36.529905748Z" level=info msg="RemovePodSandbox \"28eb4bfb5e11e048a6f48053189be294398e2bfff216ab0ebc1caf7f1455574c\" returns successfully" Sep 3 23:28:36.530350 containerd[1869]: time="2025-09-03T23:28:36.530326789Z" level=info msg="StopPodSandbox for \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\"" Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.551 [WARNING][6529] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean 
up ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.551 [INFO][6529] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.551 [INFO][6529] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" iface="eth0" netns=""
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.551 [INFO][6529] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.551 [INFO][6529] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.565 [INFO][6537] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.565 [INFO][6537] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.565 [INFO][6537] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.571 [WARNING][6537] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.571 [INFO][6537] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.572 [INFO][6537] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 3 23:28:36.575221 containerd[1869]: 2025-09-03 23:28:36.573 [INFO][6529] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.576194 containerd[1869]: time="2025-09-03T23:28:36.575253964Z" level=info msg="TearDown network for sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" successfully"
Sep 3 23:28:36.576194 containerd[1869]: time="2025-09-03T23:28:36.575268508Z" level=info msg="StopPodSandbox for \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" returns successfully"
Sep 3 23:28:36.576194 containerd[1869]: time="2025-09-03T23:28:36.576074334Z" level=info msg="RemovePodSandbox for \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\""
Sep 3 23:28:36.576194 containerd[1869]: time="2025-09-03T23:28:36.576095238Z" level=info msg="Forcibly stopping sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\""
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.599 [WARNING][6552] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" WorkloadEndpoint="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.599 [INFO][6552] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.599 [INFO][6552] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" iface="eth0" netns=""
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.599 [INFO][6552] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.599 [INFO][6552] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.612 [INFO][6559] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.613 [INFO][6559] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.613 [INFO][6559] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.617 [WARNING][6559] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.617 [INFO][6559] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" HandleID="k8s-pod-network.fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30" Workload="ci--4372.1.0--n--989a023a05-k8s-calico--apiserver--5c8645d4dc--4m6vn-eth0"
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.618 [INFO][6559] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 3 23:28:36.620961 containerd[1869]: 2025-09-03 23:28:36.619 [INFO][6552] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30"
Sep 3 23:28:36.621277 containerd[1869]: time="2025-09-03T23:28:36.621029997Z" level=info msg="TearDown network for sandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" successfully"
Sep 3 23:28:36.622469 containerd[1869]: time="2025-09-03T23:28:36.622445643Z" level=info msg="Ensure that sandbox fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30 in task-service has been cleanup successfully"
Sep 3 23:28:36.633447 containerd[1869]: time="2025-09-03T23:28:36.633420284Z" level=info msg="RemovePodSandbox \"fabc9f7c81cb6bf488dd19db6421838519780d13a81812fed9b92b8779248e30\" returns successfully"
Sep 3 23:28:41.773103 containerd[1869]: time="2025-09-03T23:28:41.773061462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"c0db50acc31a666db10f43431a885996fa600e16a2e41fbbbe8e0b272e9aefb1\" pid:6579 exited_at:{seconds:1756942121 nanos:772784000}"
Sep 3 23:28:45.930476 containerd[1869]: time="2025-09-03T23:28:45.930437829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"3368505896c5667484cb047c84cb261f2dff85575c4503aab9e49b80da672f5e\" pid:6605 exited_at:{seconds:1756942125 nanos:930120062}"
Sep 3 23:28:52.870563 systemd[1]: Started sshd@7-10.200.20.13:22-10.200.16.10:55182.service - OpenSSH per-connection server daemon (10.200.16.10:55182).
Sep 3 23:28:53.378835 sshd[6641]: Accepted publickey for core from 10.200.16.10 port 55182 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:28:53.380910 sshd-session[6641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:28:53.388085 systemd-logind[1851]: New session 10 of user core.
Sep 3 23:28:53.390691 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 3 23:28:53.784494 sshd[6643]: Connection closed by 10.200.16.10 port 55182
Sep 3 23:28:53.785087 sshd-session[6641]: pam_unix(sshd:session): session closed for user core
Sep 3 23:28:53.788276 systemd[1]: sshd@7-10.200.20.13:22-10.200.16.10:55182.service: Deactivated successfully.
Sep 3 23:28:53.790038 systemd[1]: session-10.scope: Deactivated successfully.
Sep 3 23:28:53.791392 systemd-logind[1851]: Session 10 logged out. Waiting for processes to exit.
Sep 3 23:28:53.792414 systemd-logind[1851]: Removed session 10.
Sep 3 23:28:55.633496 containerd[1869]: time="2025-09-03T23:28:55.633457488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"a55703cda20006bc97a6784356043b9c1061be81b51ee1379dccfef4665d19ad\" pid:6667 exited_at:{seconds:1756942135 nanos:633026286}"
Sep 3 23:28:58.878365 systemd[1]: Started sshd@8-10.200.20.13:22-10.200.16.10:55190.service - OpenSSH per-connection server daemon (10.200.16.10:55190).
Sep 3 23:28:59.369739 sshd[6677]: Accepted publickey for core from 10.200.16.10 port 55190 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:28:59.370917 sshd-session[6677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:28:59.374674 systemd-logind[1851]: New session 11 of user core.
Sep 3 23:28:59.381834 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 3 23:28:59.657879 containerd[1869]: time="2025-09-03T23:28:59.657283667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"a32a7fd975bcb82d7100586bffb34474d297574a68fdf34db0fc49347fa39502\" pid:6692 exited_at:{seconds:1756942139 nanos:656984702}"
Sep 3 23:28:59.786238 sshd[6679]: Connection closed by 10.200.16.10 port 55190
Sep 3 23:28:59.786796 sshd-session[6677]: pam_unix(sshd:session): session closed for user core
Sep 3 23:28:59.790105 systemd[1]: sshd@8-10.200.20.13:22-10.200.16.10:55190.service: Deactivated successfully.
Sep 3 23:28:59.792148 systemd[1]: session-11.scope: Deactivated successfully.
Sep 3 23:28:59.792957 systemd-logind[1851]: Session 11 logged out. Waiting for processes to exit.
Sep 3 23:28:59.794033 systemd-logind[1851]: Removed session 11.
Sep 3 23:29:02.331049 containerd[1869]: time="2025-09-03T23:29:02.330859638Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"f1769f8b49159c9d4805e18cbc77e516e8d05d6929a40c23b3b84d939fe9b5cc\" pid:6725 exited_at:{seconds:1756942142 nanos:330642801}"
Sep 3 23:29:04.869179 systemd[1]: Started sshd@9-10.200.20.13:22-10.200.16.10:36996.service - OpenSSH per-connection server daemon (10.200.16.10:36996).
Sep 3 23:29:05.325956 sshd[6736]: Accepted publickey for core from 10.200.16.10 port 36996 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:05.327722 sshd-session[6736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:05.331780 systemd-logind[1851]: New session 12 of user core.
Sep 3 23:29:05.339818 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 3 23:29:05.691105 sshd[6738]: Connection closed by 10.200.16.10 port 36996
Sep 3 23:29:05.690939 sshd-session[6736]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:05.693413 systemd[1]: sshd@9-10.200.20.13:22-10.200.16.10:36996.service: Deactivated successfully.
Sep 3 23:29:05.695175 systemd[1]: session-12.scope: Deactivated successfully.
Sep 3 23:29:05.697077 systemd-logind[1851]: Session 12 logged out. Waiting for processes to exit.
Sep 3 23:29:05.698220 systemd-logind[1851]: Removed session 12.
Sep 3 23:29:05.789027 systemd[1]: Started sshd@10-10.200.20.13:22-10.200.16.10:37008.service - OpenSSH per-connection server daemon (10.200.16.10:37008).
Sep 3 23:29:06.278248 sshd[6750]: Accepted publickey for core from 10.200.16.10 port 37008 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:06.279267 sshd-session[6750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:06.282826 systemd-logind[1851]: New session 13 of user core.
Sep 3 23:29:06.295650 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 3 23:29:06.704650 sshd[6752]: Connection closed by 10.200.16.10 port 37008
Sep 3 23:29:06.704866 sshd-session[6750]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:06.708229 systemd[1]: sshd@10-10.200.20.13:22-10.200.16.10:37008.service: Deactivated successfully.
Sep 3 23:29:06.710058 systemd[1]: session-13.scope: Deactivated successfully.
Sep 3 23:29:06.710791 systemd-logind[1851]: Session 13 logged out. Waiting for processes to exit.
Sep 3 23:29:06.711904 systemd-logind[1851]: Removed session 13.
Sep 3 23:29:06.787733 systemd[1]: Started sshd@11-10.200.20.13:22-10.200.16.10:37012.service - OpenSSH per-connection server daemon (10.200.16.10:37012).
Sep 3 23:29:07.245036 sshd[6761]: Accepted publickey for core from 10.200.16.10 port 37012 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:07.246456 sshd-session[6761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:07.250314 systemd-logind[1851]: New session 14 of user core.
Sep 3 23:29:07.259666 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 3 23:29:07.632578 sshd[6763]: Connection closed by 10.200.16.10 port 37012
Sep 3 23:29:07.633086 sshd-session[6761]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:07.636106 systemd-logind[1851]: Session 14 logged out. Waiting for processes to exit.
Sep 3 23:29:07.636895 systemd[1]: sshd@11-10.200.20.13:22-10.200.16.10:37012.service: Deactivated successfully.
Sep 3 23:29:07.639121 systemd[1]: session-14.scope: Deactivated successfully.
Sep 3 23:29:07.641102 systemd-logind[1851]: Removed session 14.
Sep 3 23:29:11.780903 containerd[1869]: time="2025-09-03T23:29:11.780863643Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"493453043508c4b7e257c1a230f4caf44370e7a7e98bb38152611bd2ed88f8b2\" pid:6790 exit_status:1 exited_at:{seconds:1756942151 nanos:780559453}"
Sep 3 23:29:12.721693 systemd[1]: Started sshd@12-10.200.20.13:22-10.200.16.10:53398.service - OpenSSH per-connection server daemon (10.200.16.10:53398).
Sep 3 23:29:13.216476 sshd[6804]: Accepted publickey for core from 10.200.16.10 port 53398 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:13.218037 sshd-session[6804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:13.222567 systemd-logind[1851]: New session 15 of user core.
Sep 3 23:29:13.227643 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 3 23:29:13.608329 sshd[6808]: Connection closed by 10.200.16.10 port 53398
Sep 3 23:29:13.608813 sshd-session[6804]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:13.611907 systemd[1]: sshd@12-10.200.20.13:22-10.200.16.10:53398.service: Deactivated successfully.
Sep 3 23:29:13.613866 systemd[1]: session-15.scope: Deactivated successfully.
Sep 3 23:29:13.615241 systemd-logind[1851]: Session 15 logged out. Waiting for processes to exit.
Sep 3 23:29:13.616903 systemd-logind[1851]: Removed session 15.
Sep 3 23:29:18.687944 systemd[1]: Started sshd@13-10.200.20.13:22-10.200.16.10:53414.service - OpenSSH per-connection server daemon (10.200.16.10:53414).
Sep 3 23:29:19.141790 sshd[6820]: Accepted publickey for core from 10.200.16.10 port 53414 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:19.142906 sshd-session[6820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:19.147488 systemd-logind[1851]: New session 16 of user core.
Sep 3 23:29:19.154676 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 3 23:29:19.508484 sshd[6822]: Connection closed by 10.200.16.10 port 53414
Sep 3 23:29:19.509005 sshd-session[6820]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:19.512401 systemd[1]: sshd@13-10.200.20.13:22-10.200.16.10:53414.service: Deactivated successfully.
Sep 3 23:29:19.514198 systemd[1]: session-16.scope: Deactivated successfully.
Sep 3 23:29:19.515182 systemd-logind[1851]: Session 16 logged out. Waiting for processes to exit.
Sep 3 23:29:19.517642 systemd-logind[1851]: Removed session 16.
Sep 3 23:29:24.605652 systemd[1]: Started sshd@14-10.200.20.13:22-10.200.16.10:36954.service - OpenSSH per-connection server daemon (10.200.16.10:36954).
Sep 3 23:29:25.098316 sshd[6834]: Accepted publickey for core from 10.200.16.10 port 36954 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:25.101930 sshd-session[6834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:25.106692 systemd-logind[1851]: New session 17 of user core.
Sep 3 23:29:25.112676 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 3 23:29:25.503888 sshd[6836]: Connection closed by 10.200.16.10 port 36954
Sep 3 23:29:25.504305 sshd-session[6834]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:25.507855 systemd[1]: sshd@14-10.200.20.13:22-10.200.16.10:36954.service: Deactivated successfully.
Sep 3 23:29:25.509957 systemd[1]: session-17.scope: Deactivated successfully.
Sep 3 23:29:25.511333 systemd-logind[1851]: Session 17 logged out. Waiting for processes to exit.
Sep 3 23:29:25.513721 systemd-logind[1851]: Removed session 17.
Sep 3 23:29:25.621669 containerd[1869]: time="2025-09-03T23:29:25.621631876Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"69dd97a7af633852331c5ec4b732246e0665e6889718c27265d2d64136300c5a\" pid:6859 exited_at:{seconds:1756942165 nanos:621353030}"
Sep 3 23:29:29.654923 containerd[1869]: time="2025-09-03T23:29:29.654831221Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"c495a726396b547adb95691f6979d3a7afdd9b000dd550f082f688bb915b5f5d\" pid:6881 exited_at:{seconds:1756942169 nanos:654387565}"
Sep 3 23:29:30.591139 systemd[1]: Started sshd@15-10.200.20.13:22-10.200.16.10:53502.service - OpenSSH per-connection server daemon (10.200.16.10:53502).
Sep 3 23:29:31.074699 sshd[6892]: Accepted publickey for core from 10.200.16.10 port 53502 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:31.075853 sshd-session[6892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:31.079195 systemd-logind[1851]: New session 18 of user core.
Sep 3 23:29:31.084655 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 3 23:29:31.458552 sshd[6894]: Connection closed by 10.200.16.10 port 53502
Sep 3 23:29:31.457951 sshd-session[6892]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:31.460934 systemd[1]: sshd@15-10.200.20.13:22-10.200.16.10:53502.service: Deactivated successfully.
Sep 3 23:29:31.464233 systemd[1]: session-18.scope: Deactivated successfully.
Sep 3 23:29:31.464908 systemd-logind[1851]: Session 18 logged out. Waiting for processes to exit.
Sep 3 23:29:31.466179 systemd-logind[1851]: Removed session 18.
Sep 3 23:29:31.539989 systemd[1]: Started sshd@16-10.200.20.13:22-10.200.16.10:53510.service - OpenSSH per-connection server daemon (10.200.16.10:53510).
Sep 3 23:29:32.001428 sshd[6906]: Accepted publickey for core from 10.200.16.10 port 53510 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:32.002612 sshd-session[6906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:32.006307 systemd-logind[1851]: New session 19 of user core.
Sep 3 23:29:32.015645 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 3 23:29:32.536930 sshd[6908]: Connection closed by 10.200.16.10 port 53510
Sep 3 23:29:32.537878 sshd-session[6906]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:32.541022 systemd[1]: sshd@16-10.200.20.13:22-10.200.16.10:53510.service: Deactivated successfully.
Sep 3 23:29:32.544265 systemd[1]: session-19.scope: Deactivated successfully.
Sep 3 23:29:32.545899 systemd-logind[1851]: Session 19 logged out. Waiting for processes to exit.
Sep 3 23:29:32.547525 systemd-logind[1851]: Removed session 19.
Sep 3 23:29:32.627735 systemd[1]: Started sshd@17-10.200.20.13:22-10.200.16.10:53524.service - OpenSSH per-connection server daemon (10.200.16.10:53524).
Sep 3 23:29:33.126543 sshd[6917]: Accepted publickey for core from 10.200.16.10 port 53524 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:33.127736 sshd-session[6917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:33.132863 systemd-logind[1851]: New session 20 of user core.
Sep 3 23:29:33.138639 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 3 23:29:33.936743 sshd[6919]: Connection closed by 10.200.16.10 port 53524
Sep 3 23:29:33.937338 sshd-session[6917]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:33.940806 systemd[1]: sshd@17-10.200.20.13:22-10.200.16.10:53524.service: Deactivated successfully.
Sep 3 23:29:33.942844 systemd[1]: session-20.scope: Deactivated successfully.
Sep 3 23:29:33.943572 systemd-logind[1851]: Session 20 logged out. Waiting for processes to exit.
Sep 3 23:29:33.945174 systemd-logind[1851]: Removed session 20.
Sep 3 23:29:34.041390 systemd[1]: Started sshd@18-10.200.20.13:22-10.200.16.10:53528.service - OpenSSH per-connection server daemon (10.200.16.10:53528).
Sep 3 23:29:34.532093 sshd[6939]: Accepted publickey for core from 10.200.16.10 port 53528 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:34.533157 sshd-session[6939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:34.537128 systemd-logind[1851]: New session 21 of user core.
Sep 3 23:29:34.543648 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 3 23:29:34.998118 sshd[6941]: Connection closed by 10.200.16.10 port 53528
Sep 3 23:29:34.997245 sshd-session[6939]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:34.999890 systemd[1]: sshd@18-10.200.20.13:22-10.200.16.10:53528.service: Deactivated successfully.
Sep 3 23:29:35.001749 systemd[1]: session-21.scope: Deactivated successfully.
Sep 3 23:29:35.003684 systemd-logind[1851]: Session 21 logged out. Waiting for processes to exit.
Sep 3 23:29:35.005068 systemd-logind[1851]: Removed session 21.
Sep 3 23:29:35.079608 systemd[1]: Started sshd@19-10.200.20.13:22-10.200.16.10:53538.service - OpenSSH per-connection server daemon (10.200.16.10:53538).
Sep 3 23:29:35.534747 sshd[6951]: Accepted publickey for core from 10.200.16.10 port 53538 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:35.535812 sshd-session[6951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:35.539343 systemd-logind[1851]: New session 22 of user core.
Sep 3 23:29:35.547636 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 3 23:29:35.914611 sshd[6953]: Connection closed by 10.200.16.10 port 53538
Sep 3 23:29:35.915100 sshd-session[6951]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:35.918393 systemd[1]: sshd@19-10.200.20.13:22-10.200.16.10:53538.service: Deactivated successfully.
Sep 3 23:29:35.920914 systemd[1]: session-22.scope: Deactivated successfully.
Sep 3 23:29:35.921984 systemd-logind[1851]: Session 22 logged out. Waiting for processes to exit.
Sep 3 23:29:35.923276 systemd-logind[1851]: Removed session 22.
Sep 3 23:29:41.004775 systemd[1]: Started sshd@20-10.200.20.13:22-10.200.16.10:48574.service - OpenSSH per-connection server daemon (10.200.16.10:48574).
Sep 3 23:29:41.500688 sshd[6968]: Accepted publickey for core from 10.200.16.10 port 48574 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:41.502310 sshd-session[6968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:41.506841 systemd-logind[1851]: New session 23 of user core.
Sep 3 23:29:41.511665 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 3 23:29:41.777414 containerd[1869]: time="2025-09-03T23:29:41.777379057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f431bb561a90becbf7512cfb3124261995d34d4b5e836d63f3a70a1a4ac36fb1\" id:\"e26d58e5975ebfd84423f9348f29176fca077c45a4602716738c1cf907019363\" pid:6984 exited_at:{seconds:1756942181 nanos:777103764}"
Sep 3 23:29:41.908298 sshd[6970]: Connection closed by 10.200.16.10 port 48574
Sep 3 23:29:41.908754 sshd-session[6968]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:41.911448 systemd[1]: sshd@20-10.200.20.13:22-10.200.16.10:48574.service: Deactivated successfully.
Sep 3 23:29:41.915067 systemd[1]: session-23.scope: Deactivated successfully.
Sep 3 23:29:41.917175 systemd-logind[1851]: Session 23 logged out. Waiting for processes to exit.
Sep 3 23:29:41.918197 systemd-logind[1851]: Removed session 23.
Sep 3 23:29:45.941249 containerd[1869]: time="2025-09-03T23:29:45.941206509Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"a81be448f97b5d9a394b2b149573ba6a67b7df0211ca79a684008b7773232152\" pid:7024 exited_at:{seconds:1756942185 nanos:940931423}"
Sep 3 23:29:46.990220 systemd[1]: Started sshd@21-10.200.20.13:22-10.200.16.10:48578.service - OpenSSH per-connection server daemon (10.200.16.10:48578).
Sep 3 23:29:47.455821 sshd[7036]: Accepted publickey for core from 10.200.16.10 port 48578 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:47.457609 sshd-session[7036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:47.462119 systemd-logind[1851]: New session 24 of user core.
Sep 3 23:29:47.466657 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 3 23:29:47.836353 sshd[7038]: Connection closed by 10.200.16.10 port 48578
Sep 3 23:29:47.836854 sshd-session[7036]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:47.840370 systemd[1]: sshd@21-10.200.20.13:22-10.200.16.10:48578.service: Deactivated successfully.
Sep 3 23:29:47.843156 systemd[1]: session-24.scope: Deactivated successfully.
Sep 3 23:29:47.843983 systemd-logind[1851]: Session 24 logged out. Waiting for processes to exit.
Sep 3 23:29:47.845452 systemd-logind[1851]: Removed session 24.
Sep 3 23:29:52.916294 systemd[1]: Started sshd@22-10.200.20.13:22-10.200.16.10:58432.service - OpenSSH per-connection server daemon (10.200.16.10:58432).
Sep 3 23:29:53.367277 sshd[7055]: Accepted publickey for core from 10.200.16.10 port 58432 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:53.368419 sshd-session[7055]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:53.372107 systemd-logind[1851]: New session 25 of user core.
Sep 3 23:29:53.378659 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 3 23:29:53.732754 sshd[7057]: Connection closed by 10.200.16.10 port 58432
Sep 3 23:29:53.733347 sshd-session[7055]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:53.736087 systemd-logind[1851]: Session 25 logged out. Waiting for processes to exit.
Sep 3 23:29:53.736207 systemd[1]: sshd@22-10.200.20.13:22-10.200.16.10:58432.service: Deactivated successfully.
Sep 3 23:29:53.738155 systemd[1]: session-25.scope: Deactivated successfully.
Sep 3 23:29:53.740088 systemd-logind[1851]: Removed session 25.
Sep 3 23:29:55.622225 containerd[1869]: time="2025-09-03T23:29:55.621943529Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f638645aa7dbe1bb2bffefb7390afbc38d568e6e9c3702cf0b542f6bf0a03070\" id:\"8193acc1b4a8c45f573fca3218e391d2389efcd4ebef251ef0fd614fb07cf136\" pid:7080 exited_at:{seconds:1756942195 nanos:621620235}"
Sep 3 23:29:58.824150 systemd[1]: Started sshd@23-10.200.20.13:22-10.200.16.10:58438.service - OpenSSH per-connection server daemon (10.200.16.10:58438).
Sep 3 23:29:59.280125 sshd[7089]: Accepted publickey for core from 10.200.16.10 port 58438 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:59.280563 sshd-session[7089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:59.284372 systemd-logind[1851]: New session 26 of user core.
Sep 3 23:29:59.288643 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 3 23:29:59.642495 sshd[7091]: Connection closed by 10.200.16.10 port 58438
Sep 3 23:29:59.643189 sshd-session[7089]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:59.645812 systemd-logind[1851]: Session 26 logged out. Waiting for processes to exit.
Sep 3 23:29:59.646331 systemd[1]: sshd@23-10.200.20.13:22-10.200.16.10:58438.service: Deactivated successfully.
Sep 3 23:29:59.649133 systemd[1]: session-26.scope: Deactivated successfully.
Sep 3 23:29:59.653939 systemd-logind[1851]: Removed session 26.
Sep 3 23:29:59.661321 containerd[1869]: time="2025-09-03T23:29:59.661258674Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"d0bc502ee8d808cad1ca2a59b0008ebb699a5c99fafe1d696812e09e9f9bfa5a\" pid:7112 exited_at:{seconds:1756942199 nanos:660807193}"
Sep 3 23:30:02.331825 containerd[1869]: time="2025-09-03T23:30:02.331686654Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfcf6feb708844bc6c6bac2da9506f8e6cf254ddd49f1fe60b2098477ed7724d\" id:\"ceaa8adfdecdbd44224e3e3368e0e01c3e031384fe6b823cb3e63dc6ac202b5a\" pid:7139 exited_at:{seconds:1756942202 nanos:331113908}"
Sep 3 23:30:04.725566 systemd[1]: Started sshd@24-10.200.20.13:22-10.200.16.10:34966.service - OpenSSH per-connection server daemon (10.200.16.10:34966).
Sep 3 23:30:05.195333 sshd[7149]: Accepted publickey for core from 10.200.16.10 port 34966 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:30:05.196448 sshd-session[7149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:30:05.200153 systemd-logind[1851]: New session 27 of user core.
Sep 3 23:30:05.205671 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 3 23:30:05.572566 sshd[7152]: Connection closed by 10.200.16.10 port 34966
Sep 3 23:30:05.573090 sshd-session[7149]: pam_unix(sshd:session): session closed for user core
Sep 3 23:30:05.576251 systemd[1]: sshd@24-10.200.20.13:22-10.200.16.10:34966.service: Deactivated successfully.
Sep 3 23:30:05.578802 systemd[1]: session-27.scope: Deactivated successfully.
Sep 3 23:30:05.579477 systemd-logind[1851]: Session 27 logged out. Waiting for processes to exit.
Sep 3 23:30:05.580489 systemd-logind[1851]: Removed session 27.