Aug 19 00:12:46.996796 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Aug 19 00:12:46.996814 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Mon Aug 18 22:15:14 -00 2025
Aug 19 00:12:46.996820 kernel: KASLR enabled
Aug 19 00:12:46.996824 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Aug 19 00:12:46.996829 kernel: printk: legacy bootconsole [pl11] enabled
Aug 19 00:12:46.996833 kernel: efi: EFI v2.7 by EDK II
Aug 19 00:12:46.996838 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20d018 RNG=0x3fd5f998 MEMRESERVE=0x3e471598
Aug 19 00:12:46.996842 kernel: random: crng init done
Aug 19 00:12:46.996846 kernel: secureboot: Secure boot disabled
Aug 19 00:12:46.996850 kernel: ACPI: Early table checksum verification disabled
Aug 19 00:12:46.996854 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Aug 19 00:12:46.996858 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996861 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996866 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Aug 19 00:12:46.996871 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996876 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996880 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996884 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996889 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996893 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996897 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Aug 19 00:12:46.996901 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 00:12:46.996906 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Aug 19 00:12:46.996910 kernel: ACPI: Use ACPI SPCR as default console: Yes
Aug 19 00:12:46.996914 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Aug 19 00:12:46.996918 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Aug 19 00:12:46.996922 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Aug 19 00:12:46.996927 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Aug 19 00:12:46.996931 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Aug 19 00:12:46.996936 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Aug 19 00:12:46.996940 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Aug 19 00:12:46.996944 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Aug 19 00:12:46.996948 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Aug 19 00:12:46.996952 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Aug 19 00:12:46.996956 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Aug 19 00:12:46.996960 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Aug 19 00:12:46.996965 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Aug 19 00:12:46.996969 kernel: NODE_DATA(0) allocated [mem 0x1bf7fca00-0x1bf803fff]
Aug 19 00:12:46.996973 kernel: Zone ranges:
Aug 19 00:12:46.996977 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Aug 19 00:12:46.996984 kernel: DMA32 empty
Aug 19 00:12:46.996988 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Aug 19 00:12:46.996993 kernel: Device empty
Aug 19 00:12:46.996997 kernel: Movable zone start for each node
Aug 19 00:12:46.997001 kernel: Early memory node ranges
Aug 19 00:12:46.997006 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Aug 19 00:12:46.997011 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Aug 19 00:12:46.997015 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Aug 19 00:12:46.997020 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Aug 19 00:12:46.997024 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Aug 19 00:12:46.997028 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Aug 19 00:12:46.997033 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Aug 19 00:12:46.997037 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Aug 19 00:12:46.997041 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Aug 19 00:12:46.997046 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Aug 19 00:12:46.997050 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Aug 19 00:12:46.997054 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Aug 19 00:12:46.997059 kernel: psci: probing for conduit method from ACPI.
Aug 19 00:12:46.997064 kernel: psci: PSCIv1.1 detected in firmware.
Aug 19 00:12:46.997068 kernel: psci: Using standard PSCI v0.2 function IDs
Aug 19 00:12:46.997072 kernel: psci: MIGRATE_INFO_TYPE not supported.
Aug 19 00:12:46.997077 kernel: psci: SMC Calling Convention v1.4
Aug 19 00:12:46.997081 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Aug 19 00:12:46.997085 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Aug 19 00:12:46.997090 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Aug 19 00:12:46.997094 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Aug 19 00:12:46.997099 kernel: pcpu-alloc: [0] 0 [0] 1
Aug 19 00:12:46.997103 kernel: Detected PIPT I-cache on CPU0
Aug 19 00:12:46.997108 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Aug 19 00:12:46.997113 kernel: CPU features: detected: GIC system register CPU interface
Aug 19 00:12:46.997117 kernel: CPU features: detected: Spectre-v4
Aug 19 00:12:46.997121 kernel: CPU features: detected: Spectre-BHB
Aug 19 00:12:46.997126 kernel: CPU features: kernel page table isolation forced ON by KASLR
Aug 19 00:12:46.997130 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Aug 19 00:12:46.997135 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Aug 19 00:12:46.997139 kernel: CPU features: detected: SSBS not fully self-synchronizing
Aug 19 00:12:46.997143 kernel: alternatives: applying boot alternatives
Aug 19 00:12:46.997148 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:12:46.997153 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 19 00:12:46.997158 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Aug 19 00:12:46.997163 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 19 00:12:46.997167 kernel: Fallback order for Node 0: 0
Aug 19 00:12:46.997172 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Aug 19 00:12:46.997176 kernel: Policy zone: Normal
Aug 19 00:12:46.997180 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 19 00:12:46.997184 kernel: software IO TLB: area num 2.
Aug 19 00:12:46.997189 kernel: software IO TLB: mapped [mem 0x0000000036290000-0x000000003a290000] (64MB)
Aug 19 00:12:46.997193 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 19 00:12:46.997198 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 19 00:12:46.997203 kernel: rcu: RCU event tracing is enabled.
Aug 19 00:12:46.997208 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 19 00:12:46.997213 kernel: Trampoline variant of Tasks RCU enabled.
Aug 19 00:12:46.997226 kernel: Tracing variant of Tasks RCU enabled.
Aug 19 00:12:46.997231 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 19 00:12:46.997235 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 19 00:12:46.997240 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 00:12:46.997244 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 00:12:46.997249 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Aug 19 00:12:46.997253 kernel: GICv3: 960 SPIs implemented
Aug 19 00:12:46.997257 kernel: GICv3: 0 Extended SPIs implemented
Aug 19 00:12:46.997261 kernel: Root IRQ handler: gic_handle_irq
Aug 19 00:12:46.997266 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Aug 19 00:12:46.997271 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Aug 19 00:12:46.997276 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Aug 19 00:12:46.997280 kernel: ITS: No ITS available, not enabling LPIs
Aug 19 00:12:46.997285 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 19 00:12:46.997289 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Aug 19 00:12:46.997294 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Aug 19 00:12:46.997298 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Aug 19 00:12:46.997303 kernel: Console: colour dummy device 80x25
Aug 19 00:12:46.997307 kernel: printk: legacy console [tty1] enabled
Aug 19 00:12:46.997312 kernel: ACPI: Core revision 20240827
Aug 19 00:12:46.997317 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Aug 19 00:12:46.997322 kernel: pid_max: default: 32768 minimum: 301
Aug 19 00:12:46.997327 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 19 00:12:46.997331 kernel: landlock: Up and running.
Aug 19 00:12:46.997336 kernel: SELinux: Initializing.
Aug 19 00:12:46.997340 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:12:46.997348 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:12:46.997354 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Aug 19 00:12:46.997359 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Aug 19 00:12:46.997363 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Aug 19 00:12:46.997368 kernel: rcu: Hierarchical SRCU implementation.
Aug 19 00:12:46.997373 kernel: rcu: Max phase no-delay instances is 400.
Aug 19 00:12:46.997378 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 19 00:12:46.997383 kernel: Remapping and enabling EFI services.
Aug 19 00:12:46.997388 kernel: smp: Bringing up secondary CPUs ...
Aug 19 00:12:46.997392 kernel: Detected PIPT I-cache on CPU1
Aug 19 00:12:46.997397 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Aug 19 00:12:46.997403 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Aug 19 00:12:46.997408 kernel: smp: Brought up 1 node, 2 CPUs
Aug 19 00:12:46.997412 kernel: SMP: Total of 2 processors activated.
Aug 19 00:12:46.997417 kernel: CPU: All CPU(s) started at EL1
Aug 19 00:12:46.997422 kernel: CPU features: detected: 32-bit EL0 Support
Aug 19 00:12:46.997427 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Aug 19 00:12:46.997431 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Aug 19 00:12:46.997436 kernel: CPU features: detected: Common not Private translations
Aug 19 00:12:46.997441 kernel: CPU features: detected: CRC32 instructions
Aug 19 00:12:46.997446 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Aug 19 00:12:46.997451 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Aug 19 00:12:46.997456 kernel: CPU features: detected: LSE atomic instructions
Aug 19 00:12:46.997461 kernel: CPU features: detected: Privileged Access Never
Aug 19 00:12:46.997465 kernel: CPU features: detected: Speculation barrier (SB)
Aug 19 00:12:46.997470 kernel: CPU features: detected: TLB range maintenance instructions
Aug 19 00:12:46.997475 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Aug 19 00:12:46.997480 kernel: CPU features: detected: Scalable Vector Extension
Aug 19 00:12:46.997484 kernel: alternatives: applying system-wide alternatives
Aug 19 00:12:46.997490 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Aug 19 00:12:46.997495 kernel: SVE: maximum available vector length 16 bytes per vector
Aug 19 00:12:46.997499 kernel: SVE: default vector length 16 bytes per vector
Aug 19 00:12:46.997504 kernel: Memory: 3959664K/4194160K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 213308K reserved, 16384K cma-reserved)
Aug 19 00:12:46.997509 kernel: devtmpfs: initialized
Aug 19 00:12:46.997514 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 19 00:12:46.997519 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 19 00:12:46.997523 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Aug 19 00:12:46.997528 kernel: 0 pages in range for non-PLT usage
Aug 19 00:12:46.997534 kernel: 508576 pages in range for PLT usage
Aug 19 00:12:46.997538 kernel: pinctrl core: initialized pinctrl subsystem
Aug 19 00:12:46.997543 kernel: SMBIOS 3.1.0 present.
Aug 19 00:12:46.997548 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Aug 19 00:12:46.997553 kernel: DMI: Memory slots populated: 2/2
Aug 19 00:12:46.997557 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 19 00:12:46.997562 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Aug 19 00:12:46.997567 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Aug 19 00:12:46.997572 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Aug 19 00:12:46.997577 kernel: audit: initializing netlink subsys (disabled)
Aug 19 00:12:46.997582 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Aug 19 00:12:46.997586 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 19 00:12:46.997591 kernel: cpuidle: using governor menu
Aug 19 00:12:46.997596 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Aug 19 00:12:46.997601 kernel: ASID allocator initialised with 32768 entries
Aug 19 00:12:46.997605 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 19 00:12:46.997610 kernel: Serial: AMBA PL011 UART driver
Aug 19 00:12:46.997615 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 19 00:12:46.997620 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Aug 19 00:12:46.997625 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Aug 19 00:12:46.997630 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Aug 19 00:12:46.997635 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 19 00:12:46.997639 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Aug 19 00:12:46.997644 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Aug 19 00:12:46.997649 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Aug 19 00:12:46.997653 kernel: ACPI: Added _OSI(Module Device)
Aug 19 00:12:46.997658 kernel: ACPI: Added _OSI(Processor Device)
Aug 19 00:12:46.997664 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 19 00:12:46.997668 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 19 00:12:46.997673 kernel: ACPI: Interpreter enabled
Aug 19 00:12:46.997678 kernel: ACPI: Using GIC for interrupt routing
Aug 19 00:12:46.997682 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Aug 19 00:12:46.997687 kernel: printk: legacy console [ttyAMA0] enabled
Aug 19 00:12:46.997692 kernel: printk: legacy bootconsole [pl11] disabled
Aug 19 00:12:46.997697 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Aug 19 00:12:46.997701 kernel: ACPI: CPU0 has been hot-added
Aug 19 00:12:46.997707 kernel: ACPI: CPU1 has been hot-added
Aug 19 00:12:46.997711 kernel: iommu: Default domain type: Translated
Aug 19 00:12:46.997716 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Aug 19 00:12:46.997721 kernel: efivars: Registered efivars operations
Aug 19 00:12:46.997726 kernel: vgaarb: loaded
Aug 19 00:12:46.997730 kernel: clocksource: Switched to clocksource arch_sys_counter
Aug 19 00:12:46.997735 kernel: VFS: Disk quotas dquot_6.6.0
Aug 19 00:12:46.997740 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 19 00:12:46.997744 kernel: pnp: PnP ACPI init
Aug 19 00:12:46.997750 kernel: pnp: PnP ACPI: found 0 devices
Aug 19 00:12:46.997755 kernel: NET: Registered PF_INET protocol family
Aug 19 00:12:46.997759 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 19 00:12:46.997764 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Aug 19 00:12:46.997769 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 19 00:12:46.997774 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 19 00:12:46.997779 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Aug 19 00:12:46.997783 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Aug 19 00:12:46.997788 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Aug 19 00:12:46.997794 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Aug 19 00:12:46.997798 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 19 00:12:46.997803 kernel: PCI: CLS 0 bytes, default 64
Aug 19 00:12:46.997808 kernel: kvm [1]: HYP mode not available
Aug 19 00:12:46.997812 kernel: Initialise system trusted keyrings
Aug 19 00:12:46.997817 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Aug 19 00:12:46.997822 kernel: Key type asymmetric registered
Aug 19 00:12:46.997826 kernel: Asymmetric key parser 'x509' registered
Aug 19 00:12:46.997831 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Aug 19 00:12:46.997837 kernel: io scheduler mq-deadline registered
Aug 19 00:12:46.997845 kernel: io scheduler kyber registered
Aug 19 00:12:46.997849 kernel: io scheduler bfq registered
Aug 19 00:12:46.997854 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 19 00:12:46.997859 kernel: thunder_xcv, ver 1.0
Aug 19 00:12:46.997863 kernel: thunder_bgx, ver 1.0
Aug 19 00:12:46.997868 kernel: nicpf, ver 1.0
Aug 19 00:12:46.997873 kernel: nicvf, ver 1.0
Aug 19 00:12:46.997977 kernel: rtc-efi rtc-efi.0: registered as rtc0
Aug 19 00:12:46.998030 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-08-19T00:12:46 UTC (1755562366)
Aug 19 00:12:46.998036 kernel: efifb: probing for efifb
Aug 19 00:12:46.998041 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Aug 19 00:12:46.998046 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Aug 19 00:12:46.998051 kernel: efifb: scrolling: redraw
Aug 19 00:12:46.998055 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Aug 19 00:12:46.998060 kernel: Console: switching to colour frame buffer device 128x48
Aug 19 00:12:46.998065 kernel: fb0: EFI VGA frame buffer device
Aug 19 00:12:46.998071 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Aug 19 00:12:46.998075 kernel: hid: raw HID events driver (C) Jiri Kosina
Aug 19 00:12:46.998080 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Aug 19 00:12:46.998085 kernel: watchdog: NMI not fully supported
Aug 19 00:12:46.998090 kernel: watchdog: Hard watchdog permanently disabled
Aug 19 00:12:46.998094 kernel: NET: Registered PF_INET6 protocol family
Aug 19 00:12:46.998099 kernel: Segment Routing with IPv6
Aug 19 00:12:46.998104 kernel: In-situ OAM (IOAM) with IPv6
Aug 19 00:12:46.998108 kernel: NET: Registered PF_PACKET protocol family
Aug 19 00:12:46.998114 kernel: Key type dns_resolver registered
Aug 19 00:12:46.998119 kernel: registered taskstats version 1
Aug 19 00:12:46.998124 kernel: Loading compiled-in X.509 certificates
Aug 19 00:12:46.998129 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: becc5a61d1c5dcbcd174f4649c64b863031dbaa8'
Aug 19 00:12:46.998133 kernel: Demotion targets for Node 0: null
Aug 19 00:12:46.998138 kernel: Key type .fscrypt registered
Aug 19 00:12:46.998143 kernel: Key type fscrypt-provisioning registered
Aug 19 00:12:46.998147 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 19 00:12:46.998152 kernel: ima: Allocated hash algorithm: sha1
Aug 19 00:12:46.998158 kernel: ima: No architecture policies found
Aug 19 00:12:46.998162 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Aug 19 00:12:46.998167 kernel: clk: Disabling unused clocks
Aug 19 00:12:46.998172 kernel: PM: genpd: Disabling unused power domains
Aug 19 00:12:46.998177 kernel: Warning: unable to open an initial console.
Aug 19 00:12:46.998181 kernel: Freeing unused kernel memory: 38912K
Aug 19 00:12:46.998186 kernel: Run /init as init process
Aug 19 00:12:46.998191 kernel: with arguments:
Aug 19 00:12:46.998195 kernel: /init
Aug 19 00:12:46.998201 kernel: with environment:
Aug 19 00:12:46.998206 kernel: HOME=/
Aug 19 00:12:46.998210 kernel: TERM=linux
Aug 19 00:12:46.998221 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 19 00:12:46.998227 systemd[1]: Successfully made /usr/ read-only.
Aug 19 00:12:46.998234 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 00:12:46.998239 systemd[1]: Detected virtualization microsoft.
Aug 19 00:12:46.998245 systemd[1]: Detected architecture arm64.
Aug 19 00:12:46.998250 systemd[1]: Running in initrd.
Aug 19 00:12:46.998255 systemd[1]: No hostname configured, using default hostname.
Aug 19 00:12:46.998261 systemd[1]: Hostname set to .
Aug 19 00:12:46.998266 systemd[1]: Initializing machine ID from random generator.
Aug 19 00:12:46.998271 systemd[1]: Queued start job for default target initrd.target.
Aug 19 00:12:46.998276 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 00:12:46.998281 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 00:12:46.998287 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 19 00:12:46.998293 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 00:12:46.998298 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 19 00:12:46.998304 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 19 00:12:46.998309 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 19 00:12:46.998315 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 19 00:12:46.998320 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 00:12:46.998326 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 00:12:46.998331 systemd[1]: Reached target paths.target - Path Units.
Aug 19 00:12:46.998336 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 00:12:46.998341 systemd[1]: Reached target swap.target - Swaps.
Aug 19 00:12:46.998347 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 00:12:46.998352 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 00:12:46.998357 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 00:12:46.998362 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 19 00:12:46.998367 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 19 00:12:46.998373 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 00:12:46.998378 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 00:12:46.998384 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 00:12:46.998389 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 00:12:46.998394 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 19 00:12:46.998399 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 00:12:46.998404 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 19 00:12:46.998410 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 19 00:12:46.998416 systemd[1]: Starting systemd-fsck-usr.service...
Aug 19 00:12:46.998421 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 00:12:46.998426 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 00:12:46.998442 systemd-journald[225]: Collecting audit messages is disabled.
Aug 19 00:12:46.998456 systemd-journald[225]: Journal started
Aug 19 00:12:46.998470 systemd-journald[225]: Runtime Journal (/run/log/journal/7004dc1599e4425886e28633c5ba66c5) is 8M, max 78.5M, 70.5M free.
Aug 19 00:12:47.002246 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:12:47.006909 systemd-modules-load[227]: Inserted module 'overlay'
Aug 19 00:12:47.024225 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 19 00:12:47.024253 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 00:12:47.031420 kernel: Bridge firewalling registered
Aug 19 00:12:47.031483 systemd-modules-load[227]: Inserted module 'br_netfilter'
Aug 19 00:12:47.041668 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 19 00:12:47.046079 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 00:12:47.051483 systemd[1]: Finished systemd-fsck-usr.service.
Aug 19 00:12:47.063252 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 19 00:12:47.070722 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:12:47.081360 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 19 00:12:47.089976 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 19 00:12:47.106809 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 19 00:12:47.117325 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 00:12:47.131258 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 19 00:12:47.140519 systemd-tmpfiles[246]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 19 00:12:47.143323 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 19 00:12:47.153242 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 19 00:12:47.163055 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 00:12:47.174091 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 19 00:12:47.187214 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 00:12:47.196786 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 19 00:12:47.216027 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:12:47.211403 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 00:12:47.253906 systemd-resolved[261]: Positive Trust Anchors:
Aug 19 00:12:47.253913 systemd-resolved[261]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 00:12:47.253938 systemd-resolved[261]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 00:12:47.255613 systemd-resolved[261]: Defaulting to hostname 'linux'.
Aug 19 00:12:47.256827 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 00:12:47.261105 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:12:47.320232 kernel: SCSI subsystem initialized
Aug 19 00:12:47.327232 kernel: Loading iSCSI transport class v2.0-870.
Aug 19 00:12:47.333251 kernel: iscsi: registered transport (tcp)
Aug 19 00:12:47.345551 kernel: iscsi: registered transport (qla4xxx)
Aug 19 00:12:47.345590 kernel: QLogic iSCSI HBA Driver
Aug 19 00:12:47.358840 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 19 00:12:47.382534 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 00:12:47.388189 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 19 00:12:47.432540 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 19 00:12:47.438329 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 19 00:12:47.504232 kernel: raid6: neonx8 gen() 18561 MB/s
Aug 19 00:12:47.518223 kernel: raid6: neonx4 gen() 18553 MB/s
Aug 19 00:12:47.537222 kernel: raid6: neonx2 gen() 17088 MB/s
Aug 19 00:12:47.559235 kernel: raid6: neonx1 gen() 14997 MB/s
Aug 19 00:12:47.574226 kernel: raid6: int64x8 gen() 10473 MB/s
Aug 19 00:12:47.593223 kernel: raid6: int64x4 gen() 10611 MB/s
Aug 19 00:12:47.613223 kernel: raid6: int64x2 gen() 8975 MB/s
Aug 19 00:12:47.634046 kernel: raid6: int64x1 gen() 6996 MB/s
Aug 19 00:12:47.634054 kernel: raid6: using algorithm neonx8 gen() 18561 MB/s
Aug 19 00:12:47.655318 kernel: raid6: .... xor() 14904 MB/s, rmw enabled
Aug 19 00:12:47.655383 kernel: raid6: using neon recovery algorithm
Aug 19 00:12:47.663618 kernel: xor: measuring software checksum speed
Aug 19 00:12:47.663627 kernel: 8regs : 28605 MB/sec
Aug 19 00:12:47.665928 kernel: 32regs : 28793 MB/sec
Aug 19 00:12:47.668182 kernel: arm64_neon : 37525 MB/sec
Aug 19 00:12:47.671154 kernel: xor: using function: arm64_neon (37525 MB/sec)
Aug 19 00:12:47.709241 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 19 00:12:47.715259 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 19 00:12:47.724360 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 00:12:47.751997 systemd-udevd[472]: Using default interface naming scheme 'v255'.
Aug 19 00:12:47.755775 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 00:12:47.767134 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 19 00:12:47.802376 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
Aug 19 00:12:47.822263 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 19 00:12:47.835356 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 19 00:12:47.867821 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 00:12:47.879675 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 19 00:12:47.937259 kernel: hv_vmbus: Vmbus version:5.3
Aug 19 00:12:47.941233 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:12:47.954612 kernel: pps_core: LinuxPPS API ver. 1 registered
Aug 19 00:12:47.954631 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Aug 19 00:12:47.941328 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:12:47.967331 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:12:47.978873 kernel: PTP clock support registered
Aug 19 00:12:47.978891 kernel: hv_vmbus: registering driver hv_netvsc
Aug 19 00:12:47.978898 kernel: hv_vmbus: registering driver hyperv_keyboard
Aug 19 00:12:47.979019 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:12:48.017678 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Aug 19 00:12:48.017706 kernel: hv_vmbus: registering driver hid_hyperv
Aug 19 00:12:48.017714 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Aug 19 00:12:48.017720 kernel: hv_vmbus: registering driver hv_storvsc
Aug 19 00:12:48.017726 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Aug 19 00:12:48.020383 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:12:48.035833 kernel: hv_utils: Registering HyperV Utility Driver
Aug 19 00:12:48.035857 kernel: hv_vmbus: registering driver hv_utils
Aug 19 00:12:48.020466 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:12:48.042888 kernel: hv_utils: Heartbeat IC version 3.0
Aug 19 00:12:48.042903 kernel: hv_utils: Shutdown IC version 3.2
Aug 19 00:12:48.033449 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 19 00:12:47.921854 kernel: hv_utils: TimeSync IC version 4.0
Aug 19 00:12:47.926187 kernel: scsi host0: storvsc_host_t
Aug 19 00:12:47.926310 kernel: scsi host1: storvsc_host_t
Aug 19 00:12:47.926379 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Aug 19 00:12:47.926395 systemd-journald[225]: Time jumped backwards, rotating.
Aug 19 00:12:47.926420 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Aug 19 00:12:48.035590 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:12:47.940343 kernel: hv_netvsc 000d3ac4-8e1e-000d-3ac4-8e1e000d3ac4 eth0: VF slot 1 added
Aug 19 00:12:47.907521 systemd-resolved[261]: Clock change detected. Flushing caches.
Aug 19 00:12:47.949061 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:12:47.971167 kernel: hv_vmbus: registering driver hv_pci
Aug 19 00:12:47.971181 kernel: hv_pci b6842335-28f3-444a-b2ec-dd38e325a1eb: PCI VMBus probing: Using version 0x10004
Aug 19 00:12:47.971314 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Aug 19 00:12:47.974170 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Aug 19 00:12:47.974308 kernel: hv_pci b6842335-28f3-444a-b2ec-dd38e325a1eb: PCI host bridge to bus 28f3:00
Aug 19 00:12:47.980741 kernel: sd 0:0:0:0: [sda] Write Protect is off
Aug 19 00:12:47.980879 kernel: pci_bus 28f3:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Aug 19 00:12:47.980955 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Aug 19 00:12:47.988151 kernel: pci_bus 28f3:00: No busn resource found for root bus, will use [bus 00-ff]
Aug 19 00:12:47.992534 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Aug 19 00:12:47.992625 kernel: pci 28f3:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Aug 19 00:12:48.001412 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#6 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Aug 19 00:12:48.007325 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#13 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Aug 19 00:12:48.007464 kernel: pci 28f3:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Aug 19 00:12:48.016251 kernel: pci 28f3:00:02.0: enabling Extended Tags
Aug 19 00:12:48.028312 kernel: pci 28f3:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 28f3:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Aug 19 00:12:48.036230 kernel: pci_bus 28f3:00: busn_res: [bus 00-ff] end is updated to 00
Aug 19 00:12:48.036350 kernel: pci 28f3:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Aug 19 00:12:48.046859 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 19 00:12:48.046882 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Aug 19 00:12:48.054425 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Aug 19 00:12:48.054580 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Aug 19 00:12:48.055528 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Aug 19 00:12:48.072265 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#206 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Aug 19 00:12:48.096277 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#244 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Aug 19 00:12:48.110381 kernel: mlx5_core 28f3:00:02.0: enabling device (0000 -> 0002)
Aug 19 00:12:48.117395 kernel: mlx5_core 28f3:00:02.0: PTM is not supported by PCIe
Aug 19 00:12:48.117530 kernel: mlx5_core 28f3:00:02.0: firmware version: 16.30.5006
Aug 19 00:12:48.289276 kernel: hv_netvsc 000d3ac4-8e1e-000d-3ac4-8e1e000d3ac4 eth0: VF registering: eth1
Aug 19 00:12:48.289479 kernel: mlx5_core 28f3:00:02.0 eth1: joined to eth0
Aug 19 00:12:48.294125 kernel: mlx5_core 28f3:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Aug 19 00:12:48.304249 kernel: mlx5_core 28f3:00:02.0 enP10483s1: renamed from eth1
Aug 19 00:12:48.668159 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Aug 19 00:12:48.692050 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Aug 19 00:12:48.702092 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Aug 19 00:12:48.709167 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Aug 19 00:12:48.715718 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 19 00:12:48.741166 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Aug 19 00:12:48.750293 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 19 00:12:48.764453 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#193 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Aug 19 00:12:48.770340 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 19 00:12:48.775062 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 19 00:12:48.784671 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 19 00:12:48.805332 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 19 00:12:48.789386 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 19 00:12:48.823856 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 19 00:12:49.815621 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#224 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Aug 19 00:12:49.831285 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 19 00:12:49.831318 disk-uuid[657]: The operation has completed successfully.
Aug 19 00:12:49.897475 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 19 00:12:49.897569 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 19 00:12:49.922042 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 19 00:12:49.940267 sh[822]: Success
Aug 19 00:12:49.974277 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 19 00:12:49.974308 kernel: device-mapper: uevent: version 1.0.3
Aug 19 00:12:49.978683 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Aug 19 00:12:49.987243 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Aug 19 00:12:50.328730 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 19 00:12:50.335547 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 19 00:12:50.351272 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 19 00:12:50.367242 kernel: BTRFS: device fsid 1e492084-d287-4a43-8dc6-ad086a072625 devid 1 transid 45 /dev/mapper/usr (254:0) scanned by mount (840)
Aug 19 00:12:50.377239 kernel: BTRFS info (device dm-0): first mount of filesystem 1e492084-d287-4a43-8dc6-ad086a072625
Aug 19 00:12:50.377265 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:12:50.377272 kernel: BTRFS info (device dm-0): using free-space-tree
Aug 19 00:12:50.771648 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 19 00:12:50.778124 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Aug 19 00:12:50.786020 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 19 00:12:50.786758 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 19 00:12:50.809149 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 19 00:12:50.839396 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (867)
Aug 19 00:12:50.839423 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:12:50.843381 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:12:50.846299 kernel: BTRFS info (device sda6): using free-space-tree
Aug 19 00:12:50.899473 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 19 00:12:50.908599 kernel: BTRFS info (device sda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:12:50.909175 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 19 00:12:50.916869 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 19 00:12:50.924338 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 19 00:12:50.946673 systemd-networkd[1007]: lo: Link UP
Aug 19 00:12:50.946682 systemd-networkd[1007]: lo: Gained carrier
Aug 19 00:12:50.947363 systemd-networkd[1007]: Enumeration completed
Aug 19 00:12:50.948881 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 19 00:12:50.953022 systemd-networkd[1007]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:12:50.953025 systemd-networkd[1007]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 00:12:50.953437 systemd[1]: Reached target network.target - Network.
Aug 19 00:12:51.019243 kernel: mlx5_core 28f3:00:02.0 enP10483s1: Link up
Aug 19 00:12:51.051245 kernel: hv_netvsc 000d3ac4-8e1e-000d-3ac4-8e1e000d3ac4 eth0: Data path switched to VF: enP10483s1
Aug 19 00:12:51.051350 systemd-networkd[1007]: enP10483s1: Link UP
Aug 19 00:12:51.051400 systemd-networkd[1007]: eth0: Link UP
Aug 19 00:12:51.051461 systemd-networkd[1007]: eth0: Gained carrier
Aug 19 00:12:51.051469 systemd-networkd[1007]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:12:51.069384 systemd-networkd[1007]: enP10483s1: Gained carrier
Aug 19 00:12:51.077274 systemd-networkd[1007]: eth0: DHCPv4 address 10.200.20.41/24, gateway 10.200.20.1 acquired from 168.63.129.16
Aug 19 00:12:52.132054 ignition[1010]: Ignition 2.21.0
Aug 19 00:12:52.133877 ignition[1010]: Stage: fetch-offline
Aug 19 00:12:52.133976 ignition[1010]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:52.135967 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 19 00:12:52.133982 ignition[1010]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:52.142237 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 19 00:12:52.134083 ignition[1010]: parsed url from cmdline: ""
Aug 19 00:12:52.134085 ignition[1010]: no config URL provided
Aug 19 00:12:52.134088 ignition[1010]: reading system config file "/usr/lib/ignition/user.ign"
Aug 19 00:12:52.134094 ignition[1010]: no config at "/usr/lib/ignition/user.ign"
Aug 19 00:12:52.134098 ignition[1010]: failed to fetch config: resource requires networking
Aug 19 00:12:52.134379 ignition[1010]: Ignition finished successfully
Aug 19 00:12:52.170189 ignition[1020]: Ignition 2.21.0
Aug 19 00:12:52.170193 ignition[1020]: Stage: fetch
Aug 19 00:12:52.170387 ignition[1020]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:52.170394 ignition[1020]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:52.170463 ignition[1020]: parsed url from cmdline: ""
Aug 19 00:12:52.170466 ignition[1020]: no config URL provided
Aug 19 00:12:52.170469 ignition[1020]: reading system config file "/usr/lib/ignition/user.ign"
Aug 19 00:12:52.170474 ignition[1020]: no config at "/usr/lib/ignition/user.ign"
Aug 19 00:12:52.170505 ignition[1020]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Aug 19 00:12:52.269794 ignition[1020]: GET result: OK
Aug 19 00:12:52.269854 ignition[1020]: config has been read from IMDS userdata
Aug 19 00:12:52.269875 ignition[1020]: parsing config with SHA512: 08d7f7b324699c94c9e4a0e2b4af8831bb41287010c4d44c7f56a511821590c98d33b3c952731235f8660925c106eaec4977427b7f5bac40c646f358587eab4d
Aug 19 00:12:52.276422 unknown[1020]: fetched base config from "system"
Aug 19 00:12:52.276430 unknown[1020]: fetched base config from "system"
Aug 19 00:12:52.276674 ignition[1020]: fetch: fetch complete
Aug 19 00:12:52.276434 unknown[1020]: fetched user config from "azure"
Aug 19 00:12:52.276678 ignition[1020]: fetch: fetch passed
Aug 19 00:12:52.282334 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 19 00:12:52.276714 ignition[1020]: Ignition finished successfully
Aug 19 00:12:52.287466 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 19 00:12:52.315014 ignition[1027]: Ignition 2.21.0
Aug 19 00:12:52.315028 ignition[1027]: Stage: kargs
Aug 19 00:12:52.315180 ignition[1027]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:52.319526 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 19 00:12:52.315190 ignition[1027]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:52.324307 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 19 00:12:52.315870 ignition[1027]: kargs: kargs passed
Aug 19 00:12:52.315905 ignition[1027]: Ignition finished successfully
Aug 19 00:12:52.350204 ignition[1034]: Ignition 2.21.0
Aug 19 00:12:52.350217 ignition[1034]: Stage: disks
Aug 19 00:12:52.354421 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 19 00:12:52.350432 ignition[1034]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:52.359113 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 19 00:12:52.350440 ignition[1034]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:52.366362 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 19 00:12:52.351498 ignition[1034]: disks: disks passed
Aug 19 00:12:52.373612 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 19 00:12:52.351564 ignition[1034]: Ignition finished successfully
Aug 19 00:12:52.381031 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 19 00:12:52.388967 systemd[1]: Reached target basic.target - Basic System.
Aug 19 00:12:52.397507 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 19 00:12:52.475871 systemd-fsck[1042]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Aug 19 00:12:52.485078 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 19 00:12:52.490689 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 19 00:12:52.805358 systemd-networkd[1007]: eth0: Gained IPv6LL
Aug 19 00:12:54.732242 kernel: EXT4-fs (sda9): mounted filesystem 593a9299-85f8-44ab-a00f-cf95b7233713 r/w with ordered data mode. Quota mode: none.
Aug 19 00:12:54.732571 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 19 00:12:54.736057 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 19 00:12:54.774029 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 00:12:54.790753 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 19 00:12:54.805439 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Aug 19 00:12:54.819590 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1056)
Aug 19 00:12:54.819611 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:12:54.819926 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 19 00:12:54.838899 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:12:54.838915 kernel: BTRFS info (device sda6): using free-space-tree
Aug 19 00:12:54.819954 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 19 00:12:54.825108 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 19 00:12:54.850163 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 19 00:12:54.857440 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 00:12:55.375597 coreos-metadata[1058]: Aug 19 00:12:55.375 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Aug 19 00:12:55.383200 coreos-metadata[1058]: Aug 19 00:12:55.383 INFO Fetch successful
Aug 19 00:12:55.386748 coreos-metadata[1058]: Aug 19 00:12:55.386 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Aug 19 00:12:55.394178 coreos-metadata[1058]: Aug 19 00:12:55.394 INFO Fetch successful
Aug 19 00:12:55.408428 coreos-metadata[1058]: Aug 19 00:12:55.408 INFO wrote hostname ci-4426.0.0-a-440c7464d3 to /sysroot/etc/hostname
Aug 19 00:12:55.414991 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 19 00:12:55.832279 initrd-setup-root[1086]: cut: /sysroot/etc/passwd: No such file or directory
Aug 19 00:12:55.897794 initrd-setup-root[1093]: cut: /sysroot/etc/group: No such file or directory
Aug 19 00:12:55.919176 initrd-setup-root[1100]: cut: /sysroot/etc/shadow: No such file or directory
Aug 19 00:12:55.923390 initrd-setup-root[1107]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 19 00:12:57.460213 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 19 00:12:57.466186 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 19 00:12:57.482780 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 19 00:12:57.488053 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 19 00:12:57.499241 kernel: BTRFS info (device sda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:12:57.511960 ignition[1175]: INFO : Ignition 2.21.0
Aug 19 00:12:57.511960 ignition[1175]: INFO : Stage: mount
Aug 19 00:12:57.518443 ignition[1175]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:57.518443 ignition[1175]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:57.518443 ignition[1175]: INFO : mount: mount passed
Aug 19 00:12:57.518443 ignition[1175]: INFO : Ignition finished successfully
Aug 19 00:12:57.519280 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 19 00:12:57.532679 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 19 00:12:57.552452 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 19 00:12:57.560314 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 00:12:57.590411 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1186)
Aug 19 00:12:57.590440 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:12:57.594627 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:12:57.597410 kernel: BTRFS info (device sda6): using free-space-tree
Aug 19 00:12:57.601010 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 00:12:57.622677 ignition[1203]: INFO : Ignition 2.21.0
Aug 19 00:12:57.625394 ignition[1203]: INFO : Stage: files
Aug 19 00:12:57.625394 ignition[1203]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:57.625394 ignition[1203]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:57.635712 ignition[1203]: DEBUG : files: compiled without relabeling support, skipping
Aug 19 00:12:57.643234 ignition[1203]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 19 00:12:57.648809 ignition[1203]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 19 00:12:57.736247 ignition[1203]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 19 00:12:57.741683 ignition[1203]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 19 00:12:57.741683 ignition[1203]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 19 00:12:57.736657 unknown[1203]: wrote ssh authorized keys file for user: core
Aug 19 00:12:57.783497 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 19 00:12:57.790717 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Aug 19 00:12:57.817731 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 19 00:12:57.921373 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 00:12:57.929101 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:12:58.000610 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:12:58.000610 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:12:58.000610 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Aug 19 00:12:58.718190 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 19 00:12:58.950084 ignition[1203]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:12:58.950084 ignition[1203]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 19 00:12:58.981147 ignition[1203]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 19 00:12:58.988335 ignition[1203]: INFO : files: files passed
Aug 19 00:12:58.988335 ignition[1203]: INFO : Ignition finished successfully
Aug 19 00:12:58.989359 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 19 00:12:59.000164 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 19 00:12:59.025750 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 19 00:12:59.041399 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 19 00:12:59.045309 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 19 00:12:59.068915 initrd-setup-root-after-ignition[1233]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:12:59.068915 initrd-setup-root-after-ignition[1233]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:12:59.084520 initrd-setup-root-after-ignition[1237]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:12:59.071380 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 19 00:12:59.078937 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 19 00:12:59.089295 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 19 00:12:59.135665 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 19 00:12:59.135757 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 19 00:12:59.143654 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 19 00:12:59.151413 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 19 00:12:59.158665 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 19 00:12:59.159199 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 19 00:12:59.190828 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 19 00:12:59.196447 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 19 00:12:59.222852 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:12:59.227165 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 19 00:12:59.235457 systemd[1]: Stopped target timers.target - Timer Units.
Aug 19 00:12:59.242867 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 19 00:12:59.242947 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 19 00:12:59.253837 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 19 00:12:59.257667 systemd[1]: Stopped target basic.target - Basic System.
Aug 19 00:12:59.264873 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 19 00:12:59.272535 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 19 00:12:59.279862 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 19 00:12:59.287594 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 19 00:12:59.295849 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 19 00:12:59.303676 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 19 00:12:59.311950 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 19 00:12:59.319401 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 19 00:12:59.327364 systemd[1]: Stopped target swap.target - Swaps.
Aug 19 00:12:59.333981 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 19 00:12:59.334068 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 19 00:12:59.343967 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 19 00:12:59.348172 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 00:12:59.355662 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 19 00:12:59.358890 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 00:12:59.363384 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 19 00:12:59.363454 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 19 00:12:59.374463 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 19 00:12:59.374537 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 19 00:12:59.379256 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 19 00:12:59.379321 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 19 00:12:59.439295 ignition[1257]: INFO : Ignition 2.21.0
Aug 19 00:12:59.439295 ignition[1257]: INFO : Stage: umount
Aug 19 00:12:59.439295 ignition[1257]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 00:12:59.439295 ignition[1257]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Aug 19 00:12:59.439295 ignition[1257]: INFO : umount: umount passed
Aug 19 00:12:59.439295 ignition[1257]: INFO : Ignition finished successfully
Aug 19 00:12:59.385947 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 19 00:12:59.386007 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 19 00:12:59.395710 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 19 00:12:59.407482 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 19 00:12:59.407598 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 00:12:59.417455 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 19 00:12:59.435857 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 19 00:12:59.435985 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 00:12:59.441523 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 19 00:12:59.441599 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 19 00:12:59.451796 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 19 00:12:59.451868 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 19 00:12:59.459269 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 19 00:12:59.459345 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 19 00:12:59.468794 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 19 00:12:59.469257 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 19 00:12:59.483262 systemd[1]: ignition-fetch.service: Deactivated successfully.
Aug 19 00:12:59.483308 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Aug 19 00:12:59.493880 systemd[1]: Stopped target network.target - Network.
Aug 19 00:12:59.500419 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 19 00:12:59.500474 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 19 00:12:59.509152 systemd[1]: Stopped target paths.target - Path Units.
Aug 19 00:12:59.519722 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 19 00:12:59.523252 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 00:12:59.527906 systemd[1]: Stopped target slices.target - Slice Units.
Aug 19 00:12:59.535116 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 19 00:12:59.542110 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 19 00:12:59.542150 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 00:12:59.549204 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 19 00:12:59.549238 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 00:12:59.556384 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 19 00:12:59.556429 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 19 00:12:59.563902 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 19 00:12:59.563934 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 19 00:12:59.571179 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 19 00:12:59.577909 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 19 00:12:59.585805 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 19 00:12:59.586282 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 19 00:12:59.586359 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 19 00:12:59.597182 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 19 00:12:59.597305 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 19 00:12:59.608762 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 19 00:12:59.608922 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 19 00:12:59.776374 kernel: hv_netvsc 000d3ac4-8e1e-000d-3ac4-8e1e000d3ac4 eth0: Data path switched from VF: enP10483s1
Aug 19 00:12:59.609015 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 19 00:12:59.619433 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 19 00:12:59.620046 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 19 00:12:59.627183 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 19 00:12:59.627215 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 00:12:59.638465 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 19 00:12:59.652465 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 19 00:12:59.652523 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 19 00:12:59.661061 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 19 00:12:59.661115 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 19 00:12:59.671325 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 19 00:12:59.671368 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 19 00:12:59.675392 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 19 00:12:59.675428 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 00:12:59.686193 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 00:12:59.694466 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 19 00:12:59.694520 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 19 00:12:59.711245 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 19 00:12:59.711376 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 00:12:59.723352 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 19 00:12:59.723431 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 19 00:12:59.730466 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 19 00:12:59.730492 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 00:12:59.737487 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 19 00:12:59.737520 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 19 00:12:59.748914 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 19 00:12:59.748948 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 19 00:12:59.757905 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 19 00:12:59.757946 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 19 00:12:59.777373 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 19 00:12:59.793273 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 19 00:12:59.793329 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 00:12:59.801412 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 19 00:12:59.801451 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 00:12:59.810692 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:12:59.810734 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:12:59.819494 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Aug 19 00:12:59.819533 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Aug 19 00:12:59.819559 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 19 00:12:59.819781 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 19 00:12:59.819891 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 19 00:12:59.826116 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 19 00:12:59.826183 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 19 00:12:59.835517 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 19 00:12:59.835633 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 19 00:12:59.842947 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 19 00:12:59.843040 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 19 00:12:59.850491 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 19 00:12:59.858151 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 19 00:12:59.879810 systemd[1]: Switching root.
Aug 19 00:13:00.249237 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Aug 19 00:13:00.249295 systemd-journald[225]: Journal stopped
Aug 19 00:13:09.648364 kernel: SELinux: policy capability network_peer_controls=1
Aug 19 00:13:09.648383 kernel: SELinux: policy capability open_perms=1
Aug 19 00:13:09.648390 kernel: SELinux: policy capability extended_socket_class=1
Aug 19 00:13:09.648397 kernel: SELinux: policy capability always_check_network=0
Aug 19 00:13:09.648404 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 19 00:13:09.648409 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 19 00:13:09.648415 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 19 00:13:09.648420 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 19 00:13:09.648426 kernel: SELinux: policy capability userspace_initial_context=0
Aug 19 00:13:09.648431 kernel: audit: type=1403 audit(1755562381.327:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 19 00:13:09.648438 systemd[1]: Successfully loaded SELinux policy in 220.288ms.
Aug 19 00:13:09.648445 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.267ms.
Aug 19 00:13:09.648452 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 00:13:09.648458 systemd[1]: Detected virtualization microsoft.
Aug 19 00:13:09.648464 systemd[1]: Detected architecture arm64.
Aug 19 00:13:09.648471 systemd[1]: Detected first boot.
Aug 19 00:13:09.648477 systemd[1]: Hostname set to .
Aug 19 00:13:09.648483 systemd[1]: Initializing machine ID from random generator.
Aug 19 00:13:09.648489 zram_generator::config[1300]: No configuration found.
Aug 19 00:13:09.648495 kernel: NET: Registered PF_VSOCK protocol family
Aug 19 00:13:09.648501 systemd[1]: Populated /etc with preset unit settings.
Aug 19 00:13:09.648507 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Aug 19 00:13:09.648514 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 19 00:13:09.648520 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 19 00:13:09.648526 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 19 00:13:09.648532 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 19 00:13:09.648539 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 19 00:13:09.648544 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 19 00:13:09.648550 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 19 00:13:09.648557 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 19 00:13:09.648563 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 19 00:13:09.648569 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 19 00:13:09.648575 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 19 00:13:09.648581 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 00:13:09.648587 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 00:13:09.648593 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 19 00:13:09.648599 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 19 00:13:09.648605 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 19 00:13:09.648612 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 00:13:09.648618 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Aug 19 00:13:09.648626 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 00:13:09.648632 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 00:13:09.648638 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 19 00:13:09.648644 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 19 00:13:09.648650 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 19 00:13:09.648657 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 19 00:13:09.648664 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 19 00:13:09.648670 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 19 00:13:09.648676 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 00:13:09.648682 systemd[1]: Reached target swap.target - Swaps.
Aug 19 00:13:09.648688 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 19 00:13:09.648694 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 19 00:13:09.648701 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 19 00:13:09.648707 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 00:13:09.648713 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 00:13:09.648720 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 00:13:09.648726 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 19 00:13:09.648732 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 19 00:13:09.648739 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 19 00:13:09.648745 systemd[1]: Mounting media.mount - External Media Directory...
Aug 19 00:13:09.648751 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 19 00:13:09.648757 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 19 00:13:09.648763 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 19 00:13:09.648770 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 19 00:13:09.648776 systemd[1]: Reached target machines.target - Containers.
Aug 19 00:13:09.648782 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 19 00:13:09.648790 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 19 00:13:09.648797 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 00:13:09.648803 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 19 00:13:09.648809 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 19 00:13:09.648815 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 19 00:13:09.648821 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 19 00:13:09.648827 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 19 00:13:09.648834 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 19 00:13:09.648840 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 19 00:13:09.648847 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 19 00:13:09.648853 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 19 00:13:09.648859 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 19 00:13:09.648865 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 19 00:13:09.648872 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 19 00:13:09.648878 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 00:13:09.648884 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 00:13:09.648890 kernel: loop: module loaded
Aug 19 00:13:09.648897 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 19 00:13:09.648903 kernel: fuse: init (API version 7.41)
Aug 19 00:13:09.648909 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 19 00:13:09.648915 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 19 00:13:09.648921 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 19 00:13:09.648927 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 19 00:13:09.648933 systemd[1]: Stopped verity-setup.service.
Aug 19 00:13:09.648940 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 19 00:13:09.648945 kernel: ACPI: bus type drm_connector registered
Aug 19 00:13:09.648952 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 19 00:13:09.648959 systemd[1]: Mounted media.mount - External Media Directory.
Aug 19 00:13:09.648976 systemd-journald[1390]: Collecting audit messages is disabled.
Aug 19 00:13:09.648992 systemd-journald[1390]: Journal started
Aug 19 00:13:09.649006 systemd-journald[1390]: Runtime Journal (/run/log/journal/3163fa07b46f4f4aa0e11d56b2a6867c) is 8M, max 78.5M, 70.5M free.
Aug 19 00:13:08.722624 systemd[1]: Queued start job for default target multi-user.target.
Aug 19 00:13:08.733634 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 19 00:13:08.733980 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 19 00:13:08.734213 systemd[1]: systemd-journald.service: Consumed 2.141s CPU time.
Aug 19 00:13:09.656962 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 00:13:09.657567 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 19 00:13:09.661570 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 19 00:13:09.665638 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 19 00:13:09.669424 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 19 00:13:09.673747 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 00:13:09.678660 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 19 00:13:09.678785 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 19 00:13:09.683084 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 19 00:13:09.683213 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 19 00:13:09.687891 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 19 00:13:09.688017 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 19 00:13:09.692432 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 19 00:13:09.692558 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 19 00:13:09.697204 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 19 00:13:09.697402 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 19 00:13:09.701862 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 19 00:13:09.701969 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 19 00:13:09.706297 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 19 00:13:09.710666 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 00:13:09.715613 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 19 00:13:09.726385 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 19 00:13:09.734039 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 19 00:13:09.741372 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 19 00:13:09.751976 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 19 00:13:09.758722 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 19 00:13:09.758749 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 19 00:13:09.763429 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 19 00:13:09.768880 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 19 00:13:09.773455 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 19 00:13:09.781340 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 19 00:13:09.785958 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 19 00:13:09.790172 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 19 00:13:09.790768 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 19 00:13:09.795130 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 19 00:13:09.795791 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 19 00:13:09.800356 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 19 00:13:09.805926 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 19 00:13:09.811514 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 00:13:09.816738 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 19 00:13:09.821046 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 19 00:13:09.840868 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 19 00:13:09.846066 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 19 00:13:09.850915 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 19 00:13:09.873542 systemd-journald[1390]: Time spent on flushing to /var/log/journal/3163fa07b46f4f4aa0e11d56b2a6867c is 43.102ms for 938 entries.
Aug 19 00:13:09.873542 systemd-journald[1390]: System Journal (/var/log/journal/3163fa07b46f4f4aa0e11d56b2a6867c) is 11.8M, max 2.6G, 2.6G free.
Aug 19 00:13:10.014620 systemd-journald[1390]: Received client request to flush runtime journal.
Aug 19 00:13:10.014679 systemd-journald[1390]: /var/log/journal/3163fa07b46f4f4aa0e11d56b2a6867c/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Aug 19 00:13:10.014698 systemd-journald[1390]: Rotating system journal.
Aug 19 00:13:10.014712 kernel: loop0: detected capacity change from 0 to 100608
Aug 19 00:13:10.015890 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 19 00:13:10.023444 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 19 00:13:10.025277 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 19 00:13:10.029753 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 19 00:13:10.612260 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 19 00:13:10.703249 kernel: loop1: detected capacity change from 0 to 29264
Aug 19 00:13:10.811615 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 19 00:13:10.817724 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 19 00:13:11.037556 systemd-tmpfiles[1457]: ACLs are not supported, ignoring.
Aug 19 00:13:11.037568 systemd-tmpfiles[1457]: ACLs are not supported, ignoring.
Aug 19 00:13:11.054865 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 00:13:11.546247 kernel: loop2: detected capacity change from 0 to 119320
Aug 19 00:13:12.025110 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 19 00:13:12.031082 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 00:13:12.055429 systemd-udevd[1462]: Using default interface naming scheme 'v255'.
Aug 19 00:13:12.187254 kernel: loop3: detected capacity change from 0 to 211168
Aug 19 00:13:12.213253 kernel: loop4: detected capacity change from 0 to 100608
Aug 19 00:13:12.223273 kernel: loop5: detected capacity change from 0 to 29264
Aug 19 00:13:12.233260 kernel: loop6: detected capacity change from 0 to 119320
Aug 19 00:13:12.244255 kernel: loop7: detected capacity change from 0 to 211168
Aug 19 00:13:12.253129 (sd-merge)[1465]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Aug 19 00:13:12.253494 (sd-merge)[1465]: Merged extensions into '/usr'.
Aug 19 00:13:12.255793 systemd[1]: Reload requested from client PID 1438 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 19 00:13:12.255882 systemd[1]: Reloading...
Aug 19 00:13:12.299514 zram_generator::config[1487]: No configuration found.
Aug 19 00:13:12.470344 systemd[1]: Reloading finished in 214 ms.
Aug 19 00:13:12.499317 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 19 00:13:12.514127 systemd[1]: Starting ensure-sysext.service...
Aug 19 00:13:12.518351 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 00:13:12.573782 systemd[1]: Reload requested from client PID 1546 ('systemctl') (unit ensure-sysext.service)...
Aug 19 00:13:12.573913 systemd[1]: Reloading...
Aug 19 00:13:12.598246 systemd-tmpfiles[1547]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 19 00:13:12.613723 systemd-tmpfiles[1547]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 19 00:13:12.628320 zram_generator::config[1581]: No configuration found.
Aug 19 00:13:12.640019 systemd-tmpfiles[1547]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 19 00:13:12.640163 systemd-tmpfiles[1547]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 19 00:13:12.640621 systemd-tmpfiles[1547]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 19 00:13:12.640755 systemd-tmpfiles[1547]: ACLs are not supported, ignoring.
Aug 19 00:13:12.640782 systemd-tmpfiles[1547]: ACLs are not supported, ignoring.
Aug 19 00:13:12.690340 systemd-tmpfiles[1547]: Detected autofs mount point /boot during canonicalization of boot.
Aug 19 00:13:12.690351 systemd-tmpfiles[1547]: Skipping /boot
Aug 19 00:13:12.694809 systemd-tmpfiles[1547]: Detected autofs mount point /boot during canonicalization of boot.
Aug 19 00:13:12.694821 systemd-tmpfiles[1547]: Skipping /boot
Aug 19 00:13:12.753305 systemd[1]: Reloading finished in 179 ms.
Aug 19 00:13:12.768496 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 00:13:12.777176 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 19 00:13:12.813032 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 19 00:13:12.825415 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 19 00:13:12.833405 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 00:13:12.840325 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 19 00:13:12.853076 systemd[1]: Finished ensure-sysext.service.
Aug 19 00:13:12.856535 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Aug 19 00:13:12.861163 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 19 00:13:12.862375 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 19 00:13:12.871396 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 19 00:13:12.880872 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 19 00:13:12.887768 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 19 00:13:12.892299 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 19 00:13:12.892410 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 19 00:13:12.892514 systemd[1]: Reached target time-set.target - System Time Set.
Aug 19 00:13:12.896595 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 19 00:13:12.896820 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 19 00:13:12.901557 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 19 00:13:12.901772 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 19 00:13:12.905827 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 19 00:13:12.906027 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 19 00:13:12.910694 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 19 00:13:12.910909 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 19 00:13:12.920283 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 19 00:13:12.925194 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 19 00:13:12.925394 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 19 00:13:12.926676 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 19 00:13:12.971074 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Aug 19 00:13:13.073027 systemd-resolved[1636]: Positive Trust Anchors:
Aug 19 00:13:13.073042 systemd-resolved[1636]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 00:13:13.073062 systemd-resolved[1636]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 00:13:13.127105 augenrules[1672]: No rules
Aug 19 00:13:13.128263 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 19 00:13:13.128455 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 19 00:13:13.145970 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 00:13:13.154027 systemd-resolved[1636]: Using system hostname 'ci-4426.0.0-a-440c7464d3'.
Aug 19 00:13:13.155403 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 19 00:13:13.163368 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 00:13:13.172137 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 19 00:13:13.181069 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:13:13.236276 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Aug 19 00:13:13.331250 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#203 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Aug 19 00:13:13.367157 kernel: mousedev: PS/2 mouse device common for all mice
Aug 19 00:13:13.367254 kernel: hv_vmbus: registering driver hv_balloon
Aug 19 00:13:13.367275 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Aug 19 00:13:13.371474 kernel: hv_balloon: Memory hot add disabled on ARM64
Aug 19 00:13:13.376520 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
Aug 19 00:13:13.413274 kernel: hv_vmbus: registering driver hyperv_fb
Aug 19 00:13:13.421054 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Aug 19 00:13:13.421106 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Aug 19 00:13:13.425756 kernel: Console: switching to colour dummy device 80x25
Aug 19 00:13:13.422699 systemd-networkd[1695]: lo: Link UP
Aug 19 00:13:13.423921 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:13:13.426255 systemd-networkd[1695]: lo: Gained carrier
Aug 19 00:13:13.428704 systemd-networkd[1695]: Enumeration completed
Aug 19 00:13:13.429220 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:13:13.429302 systemd-networkd[1695]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 00:13:13.435743 kernel: Console: switching to colour frame buffer device 128x48
Aug 19 00:13:13.437172 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 19 00:13:13.444663 systemd[1]: Reached target network.target - Network.
Aug 19 00:13:13.450570 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Aug 19 00:13:13.456801 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Aug 19 00:13:13.470132 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:13:13.470342 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:13:13.476066 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 19 00:13:13.480592 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:13:13.487951 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:13:13.488114 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:13:13.496511 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:13:13.511254 kernel: mlx5_core 28f3:00:02.0 enP10483s1: Link up
Aug 19 00:13:13.538322 kernel: hv_netvsc 000d3ac4-8e1e-000d-3ac4-8e1e000d3ac4 eth0: Data path switched to VF: enP10483s1
Aug 19 00:13:13.539667 systemd-networkd[1695]: enP10483s1: Link UP
Aug 19 00:13:13.539836 systemd-networkd[1695]: eth0: Link UP
Aug 19 00:13:13.539839 systemd-networkd[1695]: eth0: Gained carrier
Aug 19 00:13:13.539861 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:13:13.545669 systemd-networkd[1695]: enP10483s1: Gained carrier
Aug 19 00:13:13.549818 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Aug 19 00:13:13.555358 systemd-networkd[1695]: eth0: DHCPv4 address 10.200.20.41/24, gateway 10.200.20.1 acquired from 168.63.129.16
Aug 19 00:13:13.572439 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Aug 19 00:13:13.583561 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 19 00:13:13.631247 kernel: MACsec IEEE 802.1AE
Aug 19 00:13:13.683008 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Aug 19 00:13:15.077446 systemd-networkd[1695]: eth0: Gained IPv6LL
Aug 19 00:13:15.080292 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Aug 19 00:13:15.085052 systemd[1]: Reached target network-online.target - Network is Online.
Aug 19 00:13:17.076160 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:13:18.827557 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Aug 19 00:13:18.832597 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Aug 19 00:13:31.994223 ldconfig[1433]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 19 00:13:32.009068 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 19 00:13:32.015874 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 19 00:13:32.043480 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Aug 19 00:13:32.047922 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 19 00:13:32.051800 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Aug 19 00:13:32.056734 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Aug 19 00:13:32.061651 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Aug 19 00:13:32.065708 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Aug 19 00:13:32.070207 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Aug 19 00:13:32.075212 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Aug 19 00:13:32.075251 systemd[1]: Reached target paths.target - Path Units.
Aug 19 00:13:32.078737 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 00:13:32.111219 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Aug 19 00:13:32.116331 systemd[1]: Starting docker.socket - Docker Socket for the API...
Aug 19 00:13:32.121263 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Aug 19 00:13:32.125978 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Aug 19 00:13:32.130498 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Aug 19 00:13:32.141612 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Aug 19 00:13:32.161117 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Aug 19 00:13:32.165897 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Aug 19 00:13:32.169753 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 00:13:32.173064 systemd[1]: Reached target basic.target - Basic System.
Aug 19 00:13:32.176347 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Aug 19 00:13:32.176368 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Aug 19 00:13:32.194952 systemd[1]: Starting chronyd.service - NTP client/server...
Aug 19 00:13:32.209317 systemd[1]: Starting containerd.service - containerd container runtime...
Aug 19 00:13:32.214151 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Aug 19 00:13:32.219827 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Aug 19 00:13:32.229701 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 19 00:13:32.234992 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Aug 19 00:13:32.241380 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Aug 19 00:13:32.245332 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Aug 19 00:13:32.247084 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Aug 19 00:13:32.254098 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Aug 19 00:13:32.256323 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 19 00:13:32.261842 jq[1834]: false
Aug 19 00:13:32.262257 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Aug 19 00:13:32.268095 KVP[1836]: KVP starting; pid is:1836
Aug 19 00:13:32.268346 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Aug 19 00:13:32.270780 chronyd[1826]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Aug 19 00:13:32.272862 KVP[1836]: KVP LIC Version: 3.1
Aug 19 00:13:32.272992 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Aug 19 00:13:32.274244 kernel: hv_utils: KVP IC version 4.0
Aug 19 00:13:32.282306 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Aug 19 00:13:32.289370 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Aug 19 00:13:32.299026 systemd[1]: Starting systemd-logind.service - User Login Management...
Aug 19 00:13:32.304598 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Aug 19 00:13:32.304925 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Aug 19 00:13:32.305358 systemd[1]: Starting update-engine.service - Update Engine...
Aug 19 00:13:32.311851 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Aug 19 00:13:32.320078 extend-filesystems[1835]: Found /dev/sda6
Aug 19 00:13:32.319458 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 19 00:13:32.323890 chronyd[1826]: Timezone right/UTC failed leap second check, ignoring
Aug 19 00:13:32.327952 systemd[1]: Started chronyd.service - NTP client/server.
Aug 19 00:13:32.334267 jq[1851]: true
Aug 19 00:13:32.324017 chronyd[1826]: Loaded seccomp filter (level 2)
Aug 19 00:13:32.332739 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Aug 19 00:13:32.337645 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Aug 19 00:13:32.340302 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Aug 19 00:13:32.340676 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Aug 19 00:13:32.361582 systemd[1]: motdgen.service: Deactivated successfully.
Aug 19 00:13:32.364484 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Aug 19 00:13:32.366275 extend-filesystems[1835]: Found /dev/sda9
Aug 19 00:13:32.378479 extend-filesystems[1835]: Checking size of /dev/sda9
Aug 19 00:13:32.382189 update_engine[1850]: I20250819 00:13:32.375937 1850 main.cc:92] Flatcar Update Engine starting
Aug 19 00:13:32.382617 (ntainerd)[1864]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Aug 19 00:13:32.384870 jq[1863]: true
Aug 19 00:13:32.406672 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Aug 19 00:13:32.410334 extend-filesystems[1835]: Old size kept for /dev/sda9
Aug 19 00:13:32.413669 systemd[1]: extend-filesystems.service: Deactivated successfully.
Aug 19 00:13:32.414276 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Aug 19 00:13:32.425907 systemd-logind[1846]: New seat seat0.
Aug 19 00:13:32.426587 systemd-logind[1846]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Aug 19 00:13:32.426896 systemd[1]: Started systemd-logind.service - User Login Management.
Aug 19 00:13:32.448039 tar[1857]: linux-arm64/LICENSE
Aug 19 00:13:32.448039 tar[1857]: linux-arm64/helm
Aug 19 00:13:32.544124 bash[1896]: Updated "/home/core/.ssh/authorized_keys"
Aug 19 00:13:32.545164 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Aug 19 00:13:32.555062 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Aug 19 00:13:32.704300 sshd_keygen[1875]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Aug 19 00:13:32.737519 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Aug 19 00:13:32.745824 systemd[1]: Starting issuegen.service - Generate /run/issue...
Aug 19 00:13:32.752335 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Aug 19 00:13:32.759734 dbus-daemon[1829]: [system] SELinux support is enabled
Aug 19 00:13:32.760184 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Aug 19 00:13:32.762591 update_engine[1850]: I20250819 00:13:32.762471 1850 update_check_scheduler.cc:74] Next update check in 11m33s
Aug 19 00:13:32.767745 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Aug 19 00:13:32.767765 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Aug 19 00:13:32.769525 dbus-daemon[1829]: [system] Successfully activated service 'org.freedesktop.systemd1'
Aug 19 00:13:32.774451 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Aug 19 00:13:32.774469 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Aug 19 00:13:32.781531 systemd[1]: Started update-engine.service - Update Engine.
Aug 19 00:13:32.792940 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Aug 19 00:13:32.800201 systemd[1]: issuegen.service: Deactivated successfully.
Aug 19 00:13:32.805777 systemd[1]: Finished issuegen.service - Generate /run/issue.
Aug 19 00:13:32.811934 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Aug 19 00:13:32.824351 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Aug 19 00:13:32.850150 tar[1857]: linux-arm64/README.md
Aug 19 00:13:32.851746 coreos-metadata[1828]: Aug 19 00:13:32.851 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Aug 19 00:13:32.857438 coreos-metadata[1828]: Aug 19 00:13:32.857 INFO Fetch successful
Aug 19 00:13:32.857611 coreos-metadata[1828]: Aug 19 00:13:32.857 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Aug 19 00:13:32.862531 coreos-metadata[1828]: Aug 19 00:13:32.862 INFO Fetch successful
Aug 19 00:13:32.862531 coreos-metadata[1828]: Aug 19 00:13:32.862 INFO Fetching http://168.63.129.16/machine/a057d9a7-a247-4de0-9ebb-ea5ea1e4a284/bbdaa16b%2D3502%2D4883%2Da6b8%2Dfc4897c46d53.%5Fci%2D4426.0.0%2Da%2D440c7464d3?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Aug 19 00:13:32.864432 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Aug 19 00:13:32.864743 coreos-metadata[1828]: Aug 19 00:13:32.864 INFO Fetch successful
Aug 19 00:13:32.864856 coreos-metadata[1828]: Aug 19 00:13:32.864 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Aug 19 00:13:32.873564 systemd[1]: Started getty@tty1.service - Getty on tty1.
Aug 19 00:13:32.878994 coreos-metadata[1828]: Aug 19 00:13:32.878 INFO Fetch successful
Aug 19 00:13:32.879958 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Aug 19 00:13:32.886659 systemd[1]: Reached target getty.target - Login Prompts.
Aug 19 00:13:32.893689 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Aug 19 00:13:32.923037 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Aug 19 00:13:32.927715 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Aug 19 00:13:32.983437 locksmithd[1982]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Aug 19 00:13:33.162026 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 19 00:13:33.343623 containerd[1864]: time="2025-08-19T00:13:33Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Aug 19 00:13:33.344424 containerd[1864]: time="2025-08-19T00:13:33.344393408Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Aug 19 00:13:33.349532 containerd[1864]: time="2025-08-19T00:13:33.349501896Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.384µs"
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349591120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349612688Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349745056Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349756000Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349771496Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349815328Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349822728Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349975056Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349984856Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349992088Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.349998120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350273 containerd[1864]: time="2025-08-19T00:13:33.350046000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350462 containerd[1864]: time="2025-08-19T00:13:33.350191472Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350462 containerd[1864]: time="2025-08-19T00:13:33.350210696Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 19 00:13:33.350462 containerd[1864]: time="2025-08-19T00:13:33.350216832Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Aug 19 00:13:33.350462 containerd[1864]: time="2025-08-19T00:13:33.350267664Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Aug 19 00:13:33.350462 containerd[1864]: time="2025-08-19T00:13:33.350410776Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Aug 19 00:13:33.350524 containerd[1864]: time="2025-08-19T00:13:33.350466488Z" level=info msg="metadata content store policy set" policy=shared
Aug 19 00:13:33.365469 containerd[1864]: time="2025-08-19T00:13:33.365442784Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Aug 19 00:13:33.365524 containerd[1864]: time="2025-08-19T00:13:33.365483392Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Aug 19 00:13:33.365524 containerd[1864]: time="2025-08-19T00:13:33.365495248Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Aug 19 00:13:33.365524 containerd[1864]: time="2025-08-19T00:13:33.365503280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Aug 19 00:13:33.365524 containerd[1864]: time="2025-08-19T00:13:33.365511320Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Aug 19 00:13:33.365524 containerd[1864]: time="2025-08-19T00:13:33.365519560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365530568Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365538384Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365547304Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365553880Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365560112Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Aug 19 00:13:33.365588 containerd[1864]: time="2025-08-19T00:13:33.365570272Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Aug 19 00:13:33.365693 containerd[1864]: time="2025-08-19T00:13:33.365664560Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Aug 19 00:13:33.365693 containerd[1864]: time="2025-08-19T00:13:33.365682920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Aug 19 00:13:33.365693 containerd[1864]: time="2025-08-19T00:13:33.365693568Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Aug 19 00:13:33.365734 containerd[1864]: time="2025-08-19T00:13:33.365701288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Aug 19 00:13:33.365734 containerd[1864]: time="2025-08-19T00:13:33.365707976Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Aug 19 00:13:33.365734 containerd[1864]: time="2025-08-19T00:13:33.365714680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Aug 19 00:13:33.365734 containerd[1864]: time="2025-08-19T00:13:33.365721616Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Aug 19 00:13:33.365734 containerd[1864]: time="2025-08-19T00:13:33.365728072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365735944Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365742424Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365749472Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365794488Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365803760Z" level=info msg="Start snapshots syncer"
Aug 19 00:13:33.365843 containerd[1864]: time="2025-08-19T00:13:33.365825800Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Aug 19 00:13:33.365987 containerd[1864]: time="2025-08-19T00:13:33.365960312Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Aug 19 00:13:33.366083 containerd[1864]: time="2025-08-19T00:13:33.365994336Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Aug 19 00:13:33.366083 containerd[1864]: time="2025-08-19T00:13:33.366046536Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Aug 19 00:13:33.366142 containerd[1864]: time="2025-08-19T00:13:33.366135096Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Aug 19 00:13:33.366156 containerd[1864]: time="2025-08-19T00:13:33.366149824Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Aug 19 00:13:33.366169 containerd[1864]: time="2025-08-19T00:13:33.366156760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Aug 19 00:13:33.366169 containerd[1864]: time="2025-08-19T00:13:33.366165352Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Aug 19 00:13:33.366192 containerd[1864]: time="2025-08-19T00:13:33.366172648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Aug 19 00:13:33.366192 containerd[1864]: time="2025-08-19T00:13:33.366179856Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Aug 19 00:13:33.366192 containerd[1864]: time="2025-08-19T00:13:33.366186808Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Aug 19 00:13:33.366237 containerd[1864]: time="2025-08-19T00:13:33.366210568Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Aug 19 00:13:33.366237 containerd[1864]: time="2025-08-19T00:13:33.366218072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Aug 19 00:13:33.366285 containerd[1864]: time="2025-08-19T00:13:33.366224280Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Aug 19 00:13:33.366285 containerd[1864]: time="2025-08-19T00:13:33.366266336Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Aug 19 00:13:33.366285 containerd[1864]: time="2025-08-19T00:13:33.366275056Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Aug 19 00:13:33.366285 containerd[1864]: time="2025-08-19T00:13:33.366280128Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Aug 19 00:13:33.366285 containerd[1864]: time="2025-08-19T00:13:33.366285608Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366290584Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366296888Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366303696Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366314136Z" level=info msg="runtime interface created"
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366317176Z" level=info msg="created NRI interface"
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366322496Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366330040Z" level=info msg="Connect containerd service"
Aug 19 00:13:33.366394 containerd[1864]: time="2025-08-19T00:13:33.366346936Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Aug 19 00:13:33.366937 containerd[1864]: time="2025-08-19T00:13:33.366912064Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Aug 19 00:13:33.646678 (kubelet)[2013]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 19 00:13:33.974612 kubelet[2013]: E0819 00:13:33.974506 2013 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 19 00:13:33.976397 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 19 00:13:33.976502 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 19 00:13:33.976774 systemd[1]: kubelet.service: Consumed 537ms CPU time, 258M memory peak.
Aug 19 00:13:34.004187 containerd[1864]: time="2025-08-19T00:13:34.004135912Z" level=info msg="Start subscribing containerd event"
Aug 19 00:13:34.004274 containerd[1864]: time="2025-08-19T00:13:34.004192136Z" level=info msg="Start recovering state"
Aug 19 00:13:34.004307 containerd[1864]: time="2025-08-19T00:13:34.004294160Z" level=info msg="Start event monitor"
Aug 19 00:13:34.004330 containerd[1864]: time="2025-08-19T00:13:34.004320168Z" level=info msg="Start cni network conf syncer for default"
Aug 19 00:13:34.004344 containerd[1864]: time="2025-08-19T00:13:34.004329984Z" level=info msg="Start streaming server"
Aug 19 00:13:34.004370 containerd[1864]: time="2025-08-19T00:13:34.004360992Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Aug 19 00:13:34.004370 containerd[1864]: time="2025-08-19T00:13:34.004368656Z" level=info msg="runtime interface starting up..."
Aug 19 00:13:34.004393 containerd[1864]: time="2025-08-19T00:13:34.004373040Z" level=info msg="starting plugins..."
Aug 19 00:13:34.004393 containerd[1864]: time="2025-08-19T00:13:34.004389960Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Aug 19 00:13:34.004550 containerd[1864]: time="2025-08-19T00:13:34.004525664Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Aug 19 00:13:34.004649 containerd[1864]: time="2025-08-19T00:13:34.004637616Z" level=info msg=serving... address=/run/containerd/containerd.sock
Aug 19 00:13:34.004735 containerd[1864]: time="2025-08-19T00:13:34.004724800Z" level=info msg="containerd successfully booted in 0.661694s"
Aug 19 00:13:34.004833 systemd[1]: Started containerd.service - containerd container runtime.
Aug 19 00:13:34.010053 systemd[1]: Reached target multi-user.target - Multi-User System.
Aug 19 00:13:34.015497 systemd[1]: Startup finished in 1.564s (kernel) + 14.641s (initrd) + 32.906s (userspace) = 49.113s.
Aug 19 00:13:34.719735 login[1993]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Aug 19 00:13:34.733680 login[1992]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:34.738621 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Aug 19 00:13:34.739361 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Aug 19 00:13:34.744163 systemd-logind[1846]: New session 2 of user core.
Aug 19 00:13:34.768606 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Aug 19 00:13:34.770325 systemd[1]: Starting user@500.service - User Manager for UID 500...
Aug 19 00:13:34.792222 (systemd)[2040]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Aug 19 00:13:34.793809 systemd-logind[1846]: New session c1 of user core.
Aug 19 00:13:35.134480 systemd[2040]: Queued start job for default target default.target.
Aug 19 00:13:35.140897 systemd[2040]: Created slice app.slice - User Application Slice.
Aug 19 00:13:35.140919 systemd[2040]: Reached target paths.target - Paths.
Aug 19 00:13:35.140948 systemd[2040]: Reached target timers.target - Timers.
Aug 19 00:13:35.141866 systemd[2040]: Starting dbus.socket - D-Bus User Message Bus Socket...
Aug 19 00:13:35.148442 systemd[2040]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Aug 19 00:13:35.148482 systemd[2040]: Reached target sockets.target - Sockets.
Aug 19 00:13:35.148508 systemd[2040]: Reached target basic.target - Basic System.
Aug 19 00:13:35.148528 systemd[2040]: Reached target default.target - Main User Target.
Aug 19 00:13:35.148547 systemd[2040]: Startup finished in 350ms.
Aug 19 00:13:35.148755 systemd[1]: Started user@500.service - User Manager for UID 500.
Aug 19 00:13:35.150605 systemd[1]: Started session-2.scope - Session 2 of User core.
Aug 19 00:13:35.501007 waagent[1985]: 2025-08-19T00:13:35.500880Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
Aug 19 00:13:35.504797 waagent[1985]: 2025-08-19T00:13:35.504764Z INFO Daemon Daemon OS: flatcar 4426.0.0
Aug 19 00:13:35.507768 waagent[1985]: 2025-08-19T00:13:35.507741Z INFO Daemon Daemon Python: 3.11.13
Aug 19 00:13:35.510632 waagent[1985]: 2025-08-19T00:13:35.510599Z INFO Daemon Daemon Run daemon
Aug 19 00:13:35.513543 waagent[1985]: 2025-08-19T00:13:35.513509Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4426.0.0'
Aug 19 00:13:35.519247 waagent[1985]: 2025-08-19T00:13:35.519218Z INFO Daemon Daemon Using waagent for provisioning
Aug 19 00:13:35.523029 waagent[1985]: 2025-08-19T00:13:35.522998Z INFO Daemon Daemon Activate resource disk
Aug 19 00:13:35.526173 waagent[1985]: 2025-08-19T00:13:35.526149Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Aug 19 00:13:35.533619 waagent[1985]: 2025-08-19T00:13:35.533585Z INFO Daemon Daemon Found device: None
Aug 19 00:13:35.536706 waagent[1985]: 2025-08-19T00:13:35.536679Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Aug 19 00:13:35.542240 waagent[1985]: 2025-08-19T00:13:35.542213Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Aug 19 00:13:35.550048 waagent[1985]: 2025-08-19T00:13:35.550013Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Aug 19 00:13:35.553890 waagent[1985]: 2025-08-19T00:13:35.553863Z INFO Daemon Daemon Running default provisioning handler
Aug 19 00:13:35.562202 waagent[1985]: 2025-08-19T00:13:35.562162Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Aug 19 00:13:35.571308 waagent[1985]: 2025-08-19T00:13:35.571275Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Aug 19 00:13:35.577760 waagent[1985]: 2025-08-19T00:13:35.577734Z INFO Daemon Daemon cloud-init is enabled: False
Aug 19 00:13:35.581065 waagent[1985]: 2025-08-19T00:13:35.581046Z INFO Daemon Daemon Copying ovf-env.xml
Aug 19 00:13:35.713970 waagent[1985]: 2025-08-19T00:13:35.712709Z INFO Daemon Daemon Successfully mounted dvd
Aug 19 00:13:35.720831 login[1993]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:35.724578 systemd-logind[1846]: New session 1 of user core.
Aug 19 00:13:35.735327 systemd[1]: Started session-1.scope - Session 1 of User core.
Aug 19 00:13:35.738664 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Aug 19 00:13:35.743331 waagent[1985]: 2025-08-19T00:13:35.742501Z INFO Daemon Daemon Detect protocol endpoint
Aug 19 00:13:35.745838 waagent[1985]: 2025-08-19T00:13:35.745792Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Aug 19 00:13:35.749855 waagent[1985]: 2025-08-19T00:13:35.749806Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Aug 19 00:13:35.754131 waagent[1985]: 2025-08-19T00:13:35.754101Z INFO Daemon Daemon Test for route to 168.63.129.16
Aug 19 00:13:35.758121 waagent[1985]: 2025-08-19T00:13:35.758088Z INFO Daemon Daemon Route to 168.63.129.16 exists
Aug 19 00:13:35.761796 waagent[1985]: 2025-08-19T00:13:35.761770Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Aug 19 00:13:35.808836 waagent[1985]: 2025-08-19T00:13:35.808790Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Aug 19 00:13:35.813373 waagent[1985]: 2025-08-19T00:13:35.813348Z INFO Daemon Daemon Wire protocol version:2012-11-30
Aug 19 00:13:35.817565 waagent[1985]: 2025-08-19T00:13:35.816998Z INFO Daemon Daemon Server preferred version:2015-04-05
Aug 19 00:13:35.897411 waagent[1985]: 2025-08-19T00:13:35.897334Z INFO Daemon Daemon Initializing goal state during protocol detection
Aug 19 00:13:35.902062 waagent[1985]: 2025-08-19T00:13:35.902028Z INFO Daemon Daemon Forcing an update of the goal state.
Aug 19 00:13:35.909290 waagent[1985]: 2025-08-19T00:13:35.909255Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Aug 19 00:13:35.962637 waagent[1985]: 2025-08-19T00:13:35.962606Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175
Aug 19 00:13:35.966537 waagent[1985]: 2025-08-19T00:13:35.966505Z INFO Daemon
Aug 19 00:13:35.968412 waagent[1985]: 2025-08-19T00:13:35.968386Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 8b64d583-c69c-4f3a-abb8-e537d2cfb471 eTag: 16674349773925400803 source: Fabric]
Aug 19 00:13:35.976064 waagent[1985]: 2025-08-19T00:13:35.976033Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Aug 19 00:13:35.981037 waagent[1985]: 2025-08-19T00:13:35.981010Z INFO Daemon
Aug 19 00:13:35.982973 waagent[1985]: 2025-08-19T00:13:35.982950Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Aug 19 00:13:35.990799 waagent[1985]: 2025-08-19T00:13:35.990775Z INFO Daemon Daemon Downloading artifacts profile blob
Aug 19 00:13:36.108763 waagent[1985]: 2025-08-19T00:13:36.108678Z INFO Daemon Downloaded certificate {'thumbprint': '3AA6D277423F183CFDD796E439C7FA2D21E85F7D', 'hasPrivateKey': True}
Aug 19 00:13:36.115336 waagent[1985]: 2025-08-19T00:13:36.115299Z INFO Daemon Fetch goal state completed
Aug 19 00:13:36.155817 waagent[1985]: 2025-08-19T00:13:36.155785Z INFO Daemon Daemon Starting provisioning
Aug 19 00:13:36.159284 waagent[1985]: 2025-08-19T00:13:36.159254Z INFO Daemon Daemon Handle ovf-env.xml.
Aug 19 00:13:36.162478 waagent[1985]: 2025-08-19T00:13:36.162456Z INFO Daemon Daemon Set hostname [ci-4426.0.0-a-440c7464d3]
Aug 19 00:13:36.182298 waagent[1985]: 2025-08-19T00:13:36.182260Z INFO Daemon Daemon Publish hostname [ci-4426.0.0-a-440c7464d3]
Aug 19 00:13:36.186523 waagent[1985]: 2025-08-19T00:13:36.186491Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Aug 19 00:13:36.190829 waagent[1985]: 2025-08-19T00:13:36.190800Z INFO Daemon Daemon Primary interface is [eth0]
Aug 19 00:13:36.199803 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:13:36.199809 systemd-networkd[1695]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 00:13:36.199832 systemd-networkd[1695]: eth0: DHCP lease lost
Aug 19 00:13:36.200707 waagent[1985]: 2025-08-19T00:13:36.200661Z INFO Daemon Daemon Create user account if not exists
Aug 19 00:13:36.204374 waagent[1985]: 2025-08-19T00:13:36.204340Z INFO Daemon Daemon User core already exists, skip useradd
Aug 19 00:13:36.208089 waagent[1985]: 2025-08-19T00:13:36.208063Z INFO Daemon Daemon Configure sudoer
Aug 19 00:13:36.214447 waagent[1985]: 2025-08-19T00:13:36.214406Z INFO Daemon Daemon Configure sshd
Aug 19 00:13:36.220760 waagent[1985]: 2025-08-19T00:13:36.220721Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Aug 19 00:13:36.228920 waagent[1985]: 2025-08-19T00:13:36.228890Z INFO Daemon Daemon Deploy ssh public key.
Aug 19 00:13:36.229263 systemd-networkd[1695]: eth0: DHCPv4 address 10.200.20.41/24, gateway 10.200.20.1 acquired from 168.63.129.16
Aug 19 00:13:37.399999 waagent[1985]: 2025-08-19T00:13:37.399927Z INFO Daemon Daemon Provisioning complete
Aug 19 00:13:37.412392 waagent[1985]: 2025-08-19T00:13:37.412351Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Aug 19 00:13:37.416408 waagent[1985]: 2025-08-19T00:13:37.416379Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Aug 19 00:13:37.423009 waagent[1985]: 2025-08-19T00:13:37.422985Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
Aug 19 00:13:37.521097 waagent[2092]: 2025-08-19T00:13:37.521035Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
Aug 19 00:13:37.522269 waagent[2092]: 2025-08-19T00:13:37.521543Z INFO ExtHandler ExtHandler OS: flatcar 4426.0.0
Aug 19 00:13:37.522269 waagent[2092]: 2025-08-19T00:13:37.521599Z INFO ExtHandler ExtHandler Python: 3.11.13
Aug 19 00:13:37.522269 waagent[2092]: 2025-08-19T00:13:37.521633Z INFO ExtHandler ExtHandler CPU Arch: aarch64
Aug 19 00:13:37.589251 waagent[2092]: 2025-08-19T00:13:37.589193Z INFO ExtHandler ExtHandler Distro: flatcar-4426.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0;
Aug 19 00:13:37.589381 waagent[2092]: 2025-08-19T00:13:37.589355Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Aug 19 00:13:37.589416 waagent[2092]: 2025-08-19T00:13:37.589401Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Aug 19 00:13:37.596625 waagent[2092]: 2025-08-19T00:13:37.596581Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Aug 19 00:13:37.601175 waagent[2092]: 2025-08-19T00:13:37.601143Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175
Aug 19 00:13:37.601534 waagent[2092]: 2025-08-19T00:13:37.601501Z INFO ExtHandler
Aug 19 00:13:37.601584 waagent[2092]: 2025-08-19T00:13:37.601567Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 6dcd675a-94f9-45c1-bd32-10947798d1e6 eTag: 16674349773925400803 source: Fabric]
Aug 19 00:13:37.601794 waagent[2092]: 2025-08-19T00:13:37.601769Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Aug 19 00:13:37.602177 waagent[2092]: 2025-08-19T00:13:37.602149Z INFO ExtHandler
Aug 19 00:13:37.602211 waagent[2092]: 2025-08-19T00:13:37.602197Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Aug 19 00:13:37.605228 waagent[2092]: 2025-08-19T00:13:37.605202Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Aug 19 00:13:37.732980 waagent[2092]: 2025-08-19T00:13:37.732867Z INFO ExtHandler Downloaded certificate {'thumbprint': '3AA6D277423F183CFDD796E439C7FA2D21E85F7D', 'hasPrivateKey': True}
Aug 19 00:13:37.733326 waagent[2092]: 2025-08-19T00:13:37.733292Z INFO ExtHandler Fetch goal state completed
Aug 19 00:13:37.744813 waagent[2092]: 2025-08-19T00:13:37.744771Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025)
Aug 19 00:13:37.751900 waagent[2092]: 2025-08-19T00:13:37.751855Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2092
Aug 19 00:13:37.751996 waagent[2092]: 2025-08-19T00:13:37.751973Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Aug 19 00:13:37.752241 waagent[2092]: 2025-08-19T00:13:37.752201Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
Aug 19 00:13:37.753352 waagent[2092]: 2025-08-19T00:13:37.753320Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk']
Aug 19 00:13:37.753662 waagent[2092]: 2025-08-19T00:13:37.753633Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
Aug 19 00:13:37.753767 waagent[2092]: 2025-08-19T00:13:37.753746Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
Aug 19 00:13:37.754176 waagent[2092]: 2025-08-19T00:13:37.754147Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Aug 19 00:13:38.147514 waagent[2092]: 2025-08-19T00:13:38.147474Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Aug 19 00:13:38.147692 waagent[2092]: 2025-08-19T00:13:38.147665Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Aug 19 00:13:38.152253 waagent[2092]: 2025-08-19T00:13:38.151968Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Aug 19 00:13:38.156362 systemd[1]: Reload requested from client PID 2107 ('systemctl') (unit waagent.service)...
Aug 19 00:13:38.156550 systemd[1]: Reloading...
Aug 19 00:13:38.226254 zram_generator::config[2155]: No configuration found.
Aug 19 00:13:38.361992 systemd[1]: Reloading finished in 205 ms.
Aug 19 00:13:38.376243 waagent[2092]: 2025-08-19T00:13:38.375347Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Aug 19 00:13:38.376243 waagent[2092]: 2025-08-19T00:13:38.375463Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Aug 19 00:13:38.797460 waagent[2092]: 2025-08-19T00:13:38.796703Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Aug 19 00:13:38.797460 waagent[2092]: 2025-08-19T00:13:38.797000Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
Aug 19 00:13:38.797741 waagent[2092]: 2025-08-19T00:13:38.797660Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Aug 19 00:13:38.797741 waagent[2092]: 2025-08-19T00:13:38.797726Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Aug 19 00:13:38.797901 waagent[2092]: 2025-08-19T00:13:38.797872Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Aug 19 00:13:38.798008 waagent[2092]: 2025-08-19T00:13:38.797965Z INFO ExtHandler ExtHandler Starting env monitor service.
Aug 19 00:13:38.798128 waagent[2092]: 2025-08-19T00:13:38.798098Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Aug 19 00:13:38.798128 waagent[2092]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Aug 19 00:13:38.798128 waagent[2092]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Aug 19 00:13:38.798128 waagent[2092]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Aug 19 00:13:38.798128 waagent[2092]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Aug 19 00:13:38.798128 waagent[2092]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Aug 19 00:13:38.798128 waagent[2092]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Aug 19 00:13:38.798588 waagent[2092]: 2025-08-19T00:13:38.798558Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Aug 19 00:13:38.798718 waagent[2092]: 2025-08-19T00:13:38.798697Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Aug 19 00:13:38.798981 waagent[2092]: 2025-08-19T00:13:38.798945Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Aug 19 00:13:38.799094 waagent[2092]: 2025-08-19T00:13:38.799064Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Aug 19 00:13:38.799172 waagent[2092]: 2025-08-19T00:13:38.799140Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Aug 19 00:13:38.799465 waagent[2092]: 2025-08-19T00:13:38.799434Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Aug 19 00:13:38.799604 waagent[2092]: 2025-08-19T00:13:38.799573Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Aug 19 00:13:38.799665 waagent[2092]: 2025-08-19T00:13:38.799641Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Aug 19 00:13:38.799844 waagent[2092]: 2025-08-19T00:13:38.799806Z INFO EnvHandler ExtHandler Configure routes
Aug 19 00:13:38.800389 waagent[2092]: 2025-08-19T00:13:38.800367Z INFO EnvHandler ExtHandler Gateway:None
Aug 19 00:13:38.800793 waagent[2092]: 2025-08-19T00:13:38.800765Z INFO EnvHandler ExtHandler Routes:None
Aug 19 00:13:38.806226 waagent[2092]: 2025-08-19T00:13:38.806195Z INFO ExtHandler ExtHandler
Aug 19 00:13:38.806421 waagent[2092]: 2025-08-19T00:13:38.806392Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 6f439051-b143-47d5-b49b-22a4a0565d41 correlation d10b02c9-dd40-4120-b912-9338bee66754 created: 2025-08-19T00:11:56.599167Z]
Aug 19 00:13:38.806762 waagent[2092]: 2025-08-19T00:13:38.806731Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Aug 19 00:13:38.807247 waagent[2092]: 2025-08-19T00:13:38.807202Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Aug 19 00:13:38.838684 waagent[2092]: 2025-08-19T00:13:38.838651Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
Aug 19 00:13:38.838684 waagent[2092]: Try `iptables -h' or 'iptables --help' for more information.)
Aug 19 00:13:38.839066 waagent[2092]: 2025-08-19T00:13:38.839037Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 2123F664-3948-4795-BBE3-45BD6F2D21B7;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Aug 19 00:13:38.906319 waagent[2092]: 2025-08-19T00:13:38.906264Z INFO MonitorHandler ExtHandler Network interfaces:
Aug 19 00:13:38.906319 waagent[2092]: Executing ['ip', '-a', '-o', 'link']:
Aug 19 00:13:38.906319 waagent[2092]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Aug 19 00:13:38.906319 waagent[2092]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c4:8e:1e brd ff:ff:ff:ff:ff:ff
Aug 19 00:13:38.906319 waagent[2092]: 3: enP10483s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c4:8e:1e brd ff:ff:ff:ff:ff:ff\ altname enP10483p0s2
Aug 19 00:13:38.906319 waagent[2092]: Executing ['ip', '-4', '-a', '-o', 'address']:
Aug 19 00:13:38.906319 waagent[2092]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Aug 19 00:13:38.906319 waagent[2092]: 2: eth0 inet 10.200.20.41/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Aug 19 00:13:38.906319 waagent[2092]: Executing ['ip', '-6', '-a', '-o', 'address']:
Aug 19 00:13:38.906319 waagent[2092]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Aug 19 00:13:38.906319 waagent[2092]: 2: eth0 inet6 fe80::20d:3aff:fec4:8e1e/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Aug 19 00:13:38.976128 waagent[2092]: 2025-08-19T00:13:38.976082Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Aug 19 00:13:38.976128 waagent[2092]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Aug 19 00:13:38.976128 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.976128 waagent[2092]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Aug 19 00:13:38.976128 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.976128 waagent[2092]: Chain OUTPUT (policy ACCEPT 6 packets, 884 bytes)
Aug 19 00:13:38.976128 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.976128 waagent[2092]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Aug 19 00:13:38.976128 waagent[2092]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Aug 19 00:13:38.976128 waagent[2092]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Aug 19 00:13:38.978356 waagent[2092]: 2025-08-19T00:13:38.978315Z INFO EnvHandler ExtHandler Current Firewall rules:
Aug 19 00:13:38.978356 waagent[2092]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Aug 19 00:13:38.978356 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.978356 waagent[2092]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Aug 19 00:13:38.978356 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.978356 waagent[2092]: Chain OUTPUT (policy ACCEPT 6 packets, 884 bytes)
Aug 19 00:13:38.978356 waagent[2092]: pkts bytes target prot opt in out source destination
Aug 19 00:13:38.978356 waagent[2092]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Aug 19 00:13:38.978356 waagent[2092]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Aug 19 00:13:38.978356 waagent[2092]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Aug 19 00:13:38.978528 waagent[2092]: 2025-08-19T00:13:38.978505Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Aug 19 00:13:43.520660 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Aug 19 00:13:43.522070 systemd[1]: Started sshd@0-10.200.20.41:22-10.200.16.10:59968.service - OpenSSH per-connection server daemon (10.200.16.10:59968).
Aug 19 00:13:44.002597 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Aug 19 00:13:44.004475 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 19 00:13:44.132010 sshd[2234]: Accepted publickey for core from 10.200.16.10 port 59968 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:44.133074 sshd-session[2234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:44.138815 systemd-logind[1846]: New session 3 of user core.
Aug 19 00:13:44.147833 systemd[1]: Started session-3.scope - Session 3 of User core.
Aug 19 00:13:44.153343 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 19 00:13:44.159431 (kubelet)[2245]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 19 00:13:44.227118 kubelet[2245]: E0819 00:13:44.227072 2245 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 19 00:13:44.229766 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 19 00:13:44.229956 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 19 00:13:44.231323 systemd[1]: kubelet.service: Consumed 108ms CPU time, 106.5M memory peak.
Aug 19 00:13:44.555249 systemd[1]: Started sshd@1-10.200.20.41:22-10.200.16.10:59974.service - OpenSSH per-connection server daemon (10.200.16.10:59974).
Aug 19 00:13:45.037780 sshd[2255]: Accepted publickey for core from 10.200.16.10 port 59974 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:45.038806 sshd-session[2255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:45.042224 systemd-logind[1846]: New session 4 of user core.
Aug 19 00:13:45.046335 systemd[1]: Started session-4.scope - Session 4 of User core.
Aug 19 00:13:45.386854 sshd[2258]: Connection closed by 10.200.16.10 port 59974
Aug 19 00:13:45.387408 sshd-session[2255]: pam_unix(sshd:session): session closed for user core
Aug 19 00:13:45.390200 systemd[1]: sshd@1-10.200.20.41:22-10.200.16.10:59974.service: Deactivated successfully.
Aug 19 00:13:45.391544 systemd[1]: session-4.scope: Deactivated successfully.
Aug 19 00:13:45.392085 systemd-logind[1846]: Session 4 logged out. Waiting for processes to exit.
Aug 19 00:13:45.393581 systemd-logind[1846]: Removed session 4.
Aug 19 00:13:45.474995 systemd[1]: Started sshd@2-10.200.20.41:22-10.200.16.10:59976.service - OpenSSH per-connection server daemon (10.200.16.10:59976).
Aug 19 00:13:45.959533 sshd[2264]: Accepted publickey for core from 10.200.16.10 port 59976 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:45.960569 sshd-session[2264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:45.964266 systemd-logind[1846]: New session 5 of user core.
Aug 19 00:13:45.970331 systemd[1]: Started session-5.scope - Session 5 of User core.
Aug 19 00:13:46.303300 sshd[2267]: Connection closed by 10.200.16.10 port 59976
Aug 19 00:13:46.303744 sshd-session[2264]: pam_unix(sshd:session): session closed for user core
Aug 19 00:13:46.306784 systemd-logind[1846]: Session 5 logged out. Waiting for processes to exit.
Aug 19 00:13:46.307041 systemd[1]: sshd@2-10.200.20.41:22-10.200.16.10:59976.service: Deactivated successfully.
Aug 19 00:13:46.310366 systemd[1]: session-5.scope: Deactivated successfully.
Aug 19 00:13:46.311727 systemd-logind[1846]: Removed session 5.
Aug 19 00:13:46.387317 systemd[1]: Started sshd@3-10.200.20.41:22-10.200.16.10:59988.service - OpenSSH per-connection server daemon (10.200.16.10:59988).
Aug 19 00:13:46.845472 sshd[2273]: Accepted publickey for core from 10.200.16.10 port 59988 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:46.846489 sshd-session[2273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:46.850164 systemd-logind[1846]: New session 6 of user core.
Aug 19 00:13:46.856339 systemd[1]: Started session-6.scope - Session 6 of User core.
Aug 19 00:13:47.178737 sshd[2276]: Connection closed by 10.200.16.10 port 59988
Aug 19 00:13:47.179318 sshd-session[2273]: pam_unix(sshd:session): session closed for user core
Aug 19 00:13:47.181873 systemd[1]: sshd@3-10.200.20.41:22-10.200.16.10:59988.service: Deactivated successfully.
Aug 19 00:13:47.183099 systemd[1]: session-6.scope: Deactivated successfully.
Aug 19 00:13:47.183963 systemd-logind[1846]: Session 6 logged out. Waiting for processes to exit.
Aug 19 00:13:47.184891 systemd-logind[1846]: Removed session 6.
Aug 19 00:13:47.264262 systemd[1]: Started sshd@4-10.200.20.41:22-10.200.16.10:59990.service - OpenSSH per-connection server daemon (10.200.16.10:59990).
Aug 19 00:13:47.739338 sshd[2282]: Accepted publickey for core from 10.200.16.10 port 59990 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:47.742517 sshd-session[2282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:47.745845 systemd-logind[1846]: New session 7 of user core.
Aug 19 00:13:47.756487 systemd[1]: Started session-7.scope - Session 7 of User core.
Aug 19 00:13:48.255459 sudo[2286]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Aug 19 00:13:48.255677 sudo[2286]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Aug 19 00:13:48.284534 sudo[2286]: pam_unix(sudo:session): session closed for user root
Aug 19 00:13:48.360470 sshd[2285]: Connection closed by 10.200.16.10 port 59990
Aug 19 00:13:48.359707 sshd-session[2282]: pam_unix(sshd:session): session closed for user core
Aug 19 00:13:48.362588 systemd-logind[1846]: Session 7 logged out. Waiting for processes to exit.
Aug 19 00:13:48.363238 systemd[1]: sshd@4-10.200.20.41:22-10.200.16.10:59990.service: Deactivated successfully.
Aug 19 00:13:48.364411 systemd[1]: session-7.scope: Deactivated successfully.
Aug 19 00:13:48.367041 systemd-logind[1846]: Removed session 7.
Aug 19 00:13:48.449397 systemd[1]: Started sshd@5-10.200.20.41:22-10.200.16.10:60004.service - OpenSSH per-connection server daemon (10.200.16.10:60004).
Aug 19 00:13:48.940285 sshd[2292]: Accepted publickey for core from 10.200.16.10 port 60004 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA
Aug 19 00:13:48.941306 sshd-session[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:13:48.944860 systemd-logind[1846]: New session 8 of user core.
Aug 19 00:13:48.956337 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 19 00:13:49.213294 sudo[2297]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 00:13:49.213500 sudo[2297]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:13:49.220118 sudo[2297]: pam_unix(sudo:session): session closed for user root Aug 19 00:13:49.223322 sudo[2296]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 00:13:49.223508 sudo[2296]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:13:49.230359 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 00:13:49.257483 augenrules[2319]: No rules Aug 19 00:13:49.258444 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 00:13:49.258617 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:13:49.259532 sudo[2296]: pam_unix(sudo:session): session closed for user root Aug 19 00:13:49.340035 sshd[2295]: Connection closed by 10.200.16.10 port 60004 Aug 19 00:13:49.340458 sshd-session[2292]: pam_unix(sshd:session): session closed for user core Aug 19 00:13:49.343880 systemd[1]: sshd@5-10.200.20.41:22-10.200.16.10:60004.service: Deactivated successfully. Aug 19 00:13:49.345158 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 00:13:49.347749 systemd-logind[1846]: Session 8 logged out. Waiting for processes to exit. Aug 19 00:13:49.348895 systemd-logind[1846]: Removed session 8. Aug 19 00:13:49.436512 systemd[1]: Started sshd@6-10.200.20.41:22-10.200.16.10:60008.service - OpenSSH per-connection server daemon (10.200.16.10:60008). 
Aug 19 00:13:49.913824 sshd[2328]: Accepted publickey for core from 10.200.16.10 port 60008 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:13:49.914863 sshd-session[2328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:13:49.918208 systemd-logind[1846]: New session 9 of user core. Aug 19 00:13:49.924351 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 19 00:13:50.180551 sudo[2332]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 00:13:50.180760 sudo[2332]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:13:51.665219 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 19 00:13:51.676452 (dockerd)[2350]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 00:13:52.669258 dockerd[2350]: time="2025-08-19T00:13:52.668583032Z" level=info msg="Starting up" Aug 19 00:13:52.670399 dockerd[2350]: time="2025-08-19T00:13:52.670372520Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 00:13:52.677660 dockerd[2350]: time="2025-08-19T00:13:52.677625520Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 00:13:53.562505 systemd[1]: var-lib-docker-metacopy\x2dcheck3357493624-merged.mount: Deactivated successfully. Aug 19 00:13:53.662632 dockerd[2350]: time="2025-08-19T00:13:53.662591520Z" level=info msg="Loading containers: start." Aug 19 00:13:53.755265 kernel: Initializing XFRM netlink socket Aug 19 00:13:54.252520 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 19 00:13:54.253630 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Aug 19 00:13:54.286658 systemd-networkd[1695]: docker0: Link UP Aug 19 00:13:56.113988 chronyd[1826]: Selected source PHC0 Aug 19 00:13:57.619651 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:13:57.622146 (kubelet)[2529]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:13:57.652172 kubelet[2529]: E0819 00:13:57.652118 2529 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:13:57.654179 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:13:57.654391 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:13:57.654835 systemd[1]: kubelet.service: Consumed 103ms CPU time, 105.6M memory peak. Aug 19 00:13:57.671543 dockerd[2350]: time="2025-08-19T00:13:57.671450134Z" level=info msg="Loading containers: done." 
Aug 19 00:13:58.059558 dockerd[2350]: time="2025-08-19T00:13:58.059455360Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 00:13:58.059558 dockerd[2350]: time="2025-08-19T00:13:58.059543728Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 00:13:58.059729 dockerd[2350]: time="2025-08-19T00:13:58.059646840Z" level=info msg="Initializing buildkit" Aug 19 00:13:58.269700 dockerd[2350]: time="2025-08-19T00:13:58.269658338Z" level=info msg="Completed buildkit initialization" Aug 19 00:13:58.274854 dockerd[2350]: time="2025-08-19T00:13:58.274816674Z" level=info msg="Daemon has completed initialization" Aug 19 00:13:58.274854 dockerd[2350]: time="2025-08-19T00:13:58.274889258Z" level=info msg="API listen on /run/docker.sock" Aug 19 00:13:58.274980 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 00:13:58.735396 containerd[1864]: time="2025-08-19T00:13:58.735352418Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Aug 19 00:14:00.131920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount832105383.mount: Deactivated successfully.
Aug 19 00:14:01.460239 kernel: hv_balloon: Max. dynamic memory size: 4096 MB Aug 19 00:14:05.892512 containerd[1864]: time="2025-08-19T00:14:05.892462899Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:05.911827 containerd[1864]: time="2025-08-19T00:14:05.911791210Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352613" Aug 19 00:14:05.915117 containerd[1864]: time="2025-08-19T00:14:05.915089511Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:05.963225 containerd[1864]: time="2025-08-19T00:14:05.963190142Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:05.964092 containerd[1864]: time="2025-08-19T00:14:05.963888612Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 7.228484857s" Aug 19 00:14:05.964092 containerd[1864]: time="2025-08-19T00:14:05.963915676Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\"" Aug 19 00:14:05.965271 containerd[1864]: time="2025-08-19T00:14:05.965248901Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Aug 19 00:14:07.752708 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Aug 19 00:14:07.755123 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:14:07.853507 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:14:07.861134 (kubelet)[2640]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:14:07.963485 kubelet[2640]: E0819 00:14:07.963432 2640 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:14:07.965979 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:14:07.966207 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:14:07.966780 systemd[1]: kubelet.service: Consumed 105ms CPU time, 104.8M memory peak. 
Aug 19 00:14:08.289915 containerd[1864]: time="2025-08-19T00:14:08.289864697Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:08.293124 containerd[1864]: time="2025-08-19T00:14:08.292966200Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536977" Aug 19 00:14:08.296102 containerd[1864]: time="2025-08-19T00:14:08.296080031Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:08.300160 containerd[1864]: time="2025-08-19T00:14:08.300120851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:08.300729 containerd[1864]: time="2025-08-19T00:14:08.300628098Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 2.33520252s" Aug 19 00:14:08.300729 containerd[1864]: time="2025-08-19T00:14:08.300653923Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\"" Aug 19 00:14:08.301579 containerd[1864]: time="2025-08-19T00:14:08.301554527Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\""
Aug 19 00:14:09.257926 containerd[1864]: time="2025-08-19T00:14:09.257879348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:09.260812 containerd[1864]: time="2025-08-19T00:14:09.260782381Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292014" Aug 19 00:14:09.264341 containerd[1864]: time="2025-08-19T00:14:09.264305713Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:09.268936 containerd[1864]: time="2025-08-19T00:14:09.268889549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:09.269577 containerd[1864]: time="2025-08-19T00:14:09.269425806Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 967.844767ms" Aug 19 00:14:09.269577 containerd[1864]: time="2025-08-19T00:14:09.269451295Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\"" Aug 19 00:14:09.270041 containerd[1864]: time="2025-08-19T00:14:09.270022280Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Aug 19 00:14:10.898633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount968364723.mount: Deactivated successfully.
Aug 19 00:14:11.169047 containerd[1864]: time="2025-08-19T00:14:11.168552827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:11.171393 containerd[1864]: time="2025-08-19T00:14:11.171371865Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199959" Aug 19 00:14:11.174804 containerd[1864]: time="2025-08-19T00:14:11.174781569Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:11.179132 containerd[1864]: time="2025-08-19T00:14:11.179109670Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:11.179936 containerd[1864]: time="2025-08-19T00:14:11.179835076Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.909732625s" Aug 19 00:14:11.179936 containerd[1864]: time="2025-08-19T00:14:11.179859652Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\"" Aug 19 00:14:11.180439 containerd[1864]: time="2025-08-19T00:14:11.180393485Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 19 00:14:11.789312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2954697147.mount: Deactivated successfully. 
Aug 19 00:14:17.925411 update_engine[1850]: I20250819 00:14:17.925343 1850 update_attempter.cc:509] Updating boot flags... Aug 19 00:14:18.002558 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Aug 19 00:14:18.003983 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:14:22.297054 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:14:22.299427 (kubelet)[2827]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:14:22.322103 kubelet[2827]: E0819 00:14:22.322057 2827 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:14:22.324089 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:14:22.324280 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:14:22.324685 systemd[1]: kubelet.service: Consumed 100ms CPU time, 104.2M memory peak. 
Aug 19 00:14:23.301063 containerd[1864]: time="2025-08-19T00:14:23.300999690Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:23.303707 containerd[1864]: time="2025-08-19T00:14:23.303682924Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117" Aug 19 00:14:23.366686 containerd[1864]: time="2025-08-19T00:14:23.366650029Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:23.412238 containerd[1864]: time="2025-08-19T00:14:23.412195990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:23.412895 containerd[1864]: time="2025-08-19T00:14:23.412759223Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 12.232343458s" Aug 19 00:14:23.412895 containerd[1864]: time="2025-08-19T00:14:23.412787304Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Aug 19 00:14:23.413240 containerd[1864]: time="2025-08-19T00:14:23.413206877Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 00:14:24.370935 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount458812254.mount: Deactivated successfully. 
Aug 19 00:14:24.600371 containerd[1864]: time="2025-08-19T00:14:24.600316958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:14:24.603837 containerd[1864]: time="2025-08-19T00:14:24.603801146Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Aug 19 00:14:24.666160 containerd[1864]: time="2025-08-19T00:14:24.666038596Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:14:24.670275 containerd[1864]: time="2025-08-19T00:14:24.670237645Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:14:24.670670 containerd[1864]: time="2025-08-19T00:14:24.670554727Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 1.257324648s" Aug 19 00:14:24.670670 containerd[1864]: time="2025-08-19T00:14:24.670581311Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Aug 19 00:14:24.671139 containerd[1864]: time="2025-08-19T00:14:24.671112824Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Aug 19 00:14:26.115118 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4141078211.mount: Deactivated successfully. Aug 19 00:14:30.795196 containerd[1864]: time="2025-08-19T00:14:30.795143119Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:30.798239 containerd[1864]: time="2025-08-19T00:14:30.798123149Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465295" Aug 19 00:14:30.801589 containerd[1864]: time="2025-08-19T00:14:30.801563673Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:30.806204 containerd[1864]: time="2025-08-19T00:14:30.806158366Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:14:30.806947 containerd[1864]: time="2025-08-19T00:14:30.806785608Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 6.1356478s" Aug 19 00:14:30.806947 containerd[1864]: time="2025-08-19T00:14:30.806812337Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Aug 19 00:14:32.502834 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Aug 19 00:14:32.505209 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:14:32.807647 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 19 00:14:32.815608 (kubelet)[2930]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:14:32.842250 kubelet[2930]: E0819 00:14:32.842109 2930 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:14:32.845428 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:14:32.845697 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:14:32.846189 systemd[1]: kubelet.service: Consumed 100ms CPU time, 106.6M memory peak. Aug 19 00:14:33.040224 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:14:33.040348 systemd[1]: kubelet.service: Consumed 100ms CPU time, 106.6M memory peak. Aug 19 00:14:33.041942 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:14:33.059869 systemd[1]: Reload requested from client PID 2944 ('systemctl') (unit session-9.scope)... Aug 19 00:14:33.060224 systemd[1]: Reloading... Aug 19 00:14:33.143438 zram_generator::config[2996]: No configuration found. Aug 19 00:14:33.281697 systemd[1]: Reloading finished in 221 ms. Aug 19 00:14:34.127204 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 00:14:34.127298 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 00:14:34.127529 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:14:34.128917 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:14:39.976979 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 19 00:14:39.984448 (kubelet)[3054]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 00:14:40.009367 kubelet[3054]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:14:40.009367 kubelet[3054]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 00:14:40.009367 kubelet[3054]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:14:40.009591 kubelet[3054]: I0819 00:14:40.009363 3054 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 00:14:40.520061 kubelet[3054]: I0819 00:14:40.519965 3054 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 00:14:40.520061 kubelet[3054]: I0819 00:14:40.519994 3054 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 00:14:40.520390 kubelet[3054]: I0819 00:14:40.520373 3054 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 00:14:40.542677 kubelet[3054]: E0819 00:14:40.542642 3054 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.41:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Aug 19 00:14:40.542784 kubelet[3054]: I0819 00:14:40.542769 3054 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:14:40.548334 kubelet[3054]: I0819 00:14:40.548317 3054 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 00:14:40.550693 kubelet[3054]: I0819 00:14:40.550673 3054 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 00:14:40.551915 kubelet[3054]: I0819 00:14:40.551887 3054 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 19 00:14:40.552032 kubelet[3054]: I0819 00:14:40.551917 3054 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-a-440c7464d3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 19 00:14:40.552102 kubelet[3054]: I0819 00:14:40.552040 3054 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 00:14:40.552102 kubelet[3054]: I0819 00:14:40.552047 3054 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 00:14:40.552726 kubelet[3054]: I0819 00:14:40.552709 3054 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:14:40.555042 kubelet[3054]: I0819 00:14:40.555029 3054 kubelet.go:480] "Attempting to sync node with API server" Aug 19 00:14:40.555069 kubelet[3054]: I0819 00:14:40.555047 3054 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 00:14:40.555069 kubelet[3054]: I0819 00:14:40.555065 3054 kubelet.go:386] "Adding apiserver pod source" Aug 19 00:14:40.555104 kubelet[3054]: I0819 00:14:40.555077 3054 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 00:14:40.557419 kubelet[3054]: E0819 00:14:40.557196 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.41:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:14:40.557634 kubelet[3054]: E0819 00:14:40.557616 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.41:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-a-440c7464d3&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Aug 19 00:14:40.557768 kubelet[3054]: I0819 00:14:40.557756 3054 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 00:14:40.558170 kubelet[3054]: I0819 00:14:40.558155 3054 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 00:14:40.558292 kubelet[3054]: W0819 00:14:40.558281 3054 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Aug 19 00:14:40.560613 kubelet[3054]: I0819 00:14:40.560600 3054 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 00:14:40.560708 kubelet[3054]: I0819 00:14:40.560702 3054 server.go:1289] "Started kubelet" Aug 19 00:14:40.562856 kubelet[3054]: I0819 00:14:40.562726 3054 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 00:14:40.564686 kubelet[3054]: E0819 00:14:40.563866 3054 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.41:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.41:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.0.0-a-440c7464d3.185d02c652d29f01 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.0.0-a-440c7464d3,UID:ci-4426.0.0-a-440c7464d3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.0.0-a-440c7464d3,},FirstTimestamp:2025-08-19 00:14:40.560676609 +0000 UTC m=+0.573307440,LastTimestamp:2025-08-19 00:14:40.560676609 +0000 UTC m=+0.573307440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.0.0-a-440c7464d3,}" Aug 19 00:14:40.565249 kubelet[3054]: I0819 00:14:40.565212 3054 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Aug 19 00:14:40.567273 kubelet[3054]: I0819 00:14:40.566832 3054 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 00:14:40.567273 kubelet[3054]: E0819 00:14:40.567010 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:40.567273 kubelet[3054]: I0819 00:14:40.567078 3054 server.go:317] "Adding debug handlers to kubelet server" Aug 19 00:14:40.568586 kubelet[3054]: I0819 00:14:40.568224 3054 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 00:14:40.568586 kubelet[3054]: I0819 00:14:40.568302 3054 reconciler.go:26] "Reconciler: start to sync state" Aug 19 00:14:40.571037 kubelet[3054]: I0819 00:14:40.570980 3054 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 00:14:40.571203 kubelet[3054]: I0819 00:14:40.571185 3054 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 00:14:40.572408 kubelet[3054]: I0819 00:14:40.572389 3054 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 00:14:40.573688 kubelet[3054]: I0819 00:14:40.573663 3054 factory.go:223] Registration of the systemd container factory successfully Aug 19 00:14:40.573750 kubelet[3054]: I0819 00:14:40.573738 3054 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 00:14:40.574035 kubelet[3054]: E0819 00:14:40.574006 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.41:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Aug 19 00:14:40.574085 kubelet[3054]: E0819 00:14:40.574063 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="200ms" Aug 19 00:14:40.575537 kubelet[3054]: I0819 00:14:40.575513 3054 factory.go:223] Registration of the containerd container factory successfully Aug 19 00:14:40.592509 kubelet[3054]: E0819 00:14:40.592463 3054 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 00:14:40.596048 kubelet[3054]: I0819 00:14:40.596035 3054 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 00:14:40.596289 kubelet[3054]: I0819 00:14:40.596124 3054 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 00:14:40.596289 kubelet[3054]: I0819 00:14:40.596141 3054 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:14:40.667199 kubelet[3054]: E0819 00:14:40.667178 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:40.767586 kubelet[3054]: E0819 00:14:40.767516 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:40.775211 kubelet[3054]: E0819 00:14:40.775048 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="400ms" Aug 19 00:14:40.868362 kubelet[3054]: E0819 00:14:40.868271 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:40.968681 kubelet[3054]: E0819 00:14:40.968645 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:41.068874 kubelet[3054]: E0819 00:14:41.068791 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:41.169186 kubelet[3054]: E0819 00:14:41.169161 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:41.175685 kubelet[3054]: E0819 00:14:41.175661 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="800ms" Aug 19 00:14:41.202721 kubelet[3054]: I0819 00:14:41.202472 3054 policy_none.go:49] "None policy: Start" Aug 19 00:14:41.202721 kubelet[3054]: I0819 00:14:41.202496 3054 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 00:14:41.202721 kubelet[3054]: I0819 00:14:41.202506 3054 state_mem.go:35] "Initializing new in-memory state store" Aug 19 00:14:41.214143 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 00:14:41.224307 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 00:14:41.227360 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Aug 19 00:14:41.236154 kubelet[3054]: E0819 00:14:41.235746 3054 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 00:14:41.236154 kubelet[3054]: I0819 00:14:41.235894 3054 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 00:14:41.236154 kubelet[3054]: I0819 00:14:41.235904 3054 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 00:14:41.236154 kubelet[3054]: I0819 00:14:41.236089 3054 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 00:14:41.237899 kubelet[3054]: E0819 00:14:41.237884 3054 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 19 00:14:41.238023 kubelet[3054]: E0819 00:14:41.238013 3054 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.0.0-a-440c7464d3\" not found" Aug 19 00:14:41.288947 kubelet[3054]: I0819 00:14:41.288858 3054 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 00:14:41.289789 kubelet[3054]: I0819 00:14:41.289762 3054 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 19 00:14:41.289789 kubelet[3054]: I0819 00:14:41.289789 3054 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 00:14:41.289945 kubelet[3054]: I0819 00:14:41.289814 3054 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Aug 19 00:14:41.289945 kubelet[3054]: I0819 00:14:41.289818 3054 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 00:14:41.289945 kubelet[3054]: E0819 00:14:41.289850 3054 kubelet.go:2460] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Aug 19 00:14:41.291099 kubelet[3054]: E0819 00:14:41.291075 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.41:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:14:41.337957 kubelet[3054]: I0819 00:14:41.337636 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.338790 kubelet[3054]: E0819 00:14:41.338683 3054 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.41:6443/api/v1/nodes\": dial tcp 10.200.20.41:6443: connect: connection refused" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.473255 kubelet[3054]: I0819 00:14:41.473174 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.473255 kubelet[3054]: I0819 00:14:41.473254 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.473391 kubelet[3054]: I0819 00:14:41.473267 3054 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.540704 kubelet[3054]: I0819 00:14:41.540681 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.708638 kubelet[3054]: E0819 00:14:41.540988 3054 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.41:6443/api/v1/nodes\": dial tcp 10.200.20.41:6443: connect: connection refused" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:41.708638 kubelet[3054]: E0819 00:14:41.545388 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.41:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:14:41.708638 kubelet[3054]: E0819 00:14:41.560126 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.41:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-a-440c7464d3&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 00:14:41.943144 kubelet[3054]: I0819 00:14:41.943120 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.056335 kubelet[3054]: E0819 00:14:41.943437 3054 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.41:6443/api/v1/nodes\": dial tcp 10.200.20.41:6443: connect: 
connection refused" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.056335 kubelet[3054]: E0819 00:14:41.976881 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="1.6s" Aug 19 00:14:42.062946 systemd[1]: Created slice kubepods-burstable-podefa0ef5413f8e2c6d966d8c05a1dd925.slice - libcontainer container kubepods-burstable-podefa0ef5413f8e2c6d966d8c05a1dd925.slice. Aug 19 00:14:42.072740 kubelet[3054]: E0819 00:14:42.072715 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.073553 containerd[1864]: time="2025-08-19T00:14:42.073443963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-a-440c7464d3,Uid:efa0ef5413f8e2c6d966d8c05a1dd925,Namespace:kube-system,Attempt:0,}" Aug 19 00:14:42.076600 kubelet[3054]: I0819 00:14:42.076583 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.076600 kubelet[3054]: I0819 00:14:42.076602 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.076698 kubelet[3054]: I0819 
00:14:42.076615 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.076698 kubelet[3054]: I0819 00:14:42.076625 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.076698 kubelet[3054]: I0819 00:14:42.076642 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.105981 kubelet[3054]: E0819 00:14:42.105957 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.41:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:14:42.428679 kubelet[3054]: E0819 00:14:42.428560 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.41:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:14:42.598919 kubelet[3054]: E0819 00:14:42.598876 3054 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.41:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 19 00:14:42.744980 kubelet[3054]: I0819 00:14:42.744894 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:42.745376 kubelet[3054]: E0819 00:14:42.745297 3054 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.41:6443/api/v1/nodes\": dial tcp 10.200.20.41:6443: connect: connection refused" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:43.347249 kubelet[3054]: E0819 00:14:43.347203 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.41:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:14:43.577501 kubelet[3054]: E0819 00:14:43.577449 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="3.2s" Aug 19 00:14:45.455539 kubelet[3054]: E0819 00:14:44.065337 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.41:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-a-440c7464d3&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection 
refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 00:14:45.455539 kubelet[3054]: I0819 00:14:44.347262 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:45.455539 kubelet[3054]: E0819 00:14:44.347552 3054 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.41:6443/api/v1/nodes\": dial tcp 10.200.20.41:6443: connect: connection refused" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:45.455539 kubelet[3054]: E0819 00:14:44.684916 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.41:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:14:45.508138 systemd[1]: Created slice kubepods-burstable-pod28649fa82ff9de2ea9240397aa36865a.slice - libcontainer container kubepods-burstable-pod28649fa82ff9de2ea9240397aa36865a.slice. Aug 19 00:14:45.510301 kubelet[3054]: E0819 00:14:45.510176 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:45.510963 containerd[1864]: time="2025-08-19T00:14:45.510858985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-a-440c7464d3,Uid:28649fa82ff9de2ea9240397aa36865a,Namespace:kube-system,Attempt:0,}" Aug 19 00:14:45.526669 systemd[1]: Created slice kubepods-burstable-pod9b6c17d03eb7e5aa70c75363bceb7650.slice - libcontainer container kubepods-burstable-pod9b6c17d03eb7e5aa70c75363bceb7650.slice. 
Aug 19 00:14:45.528636 kubelet[3054]: E0819 00:14:45.528611 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:45.548059 kubelet[3054]: E0819 00:14:45.548032 3054 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.41:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:14:45.595105 kubelet[3054]: I0819 00:14:45.595075 3054 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9b6c17d03eb7e5aa70c75363bceb7650-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-a-440c7464d3\" (UID: \"9b6c17d03eb7e5aa70c75363bceb7650\") " pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3" Aug 19 00:14:45.771092 kubelet[3054]: E0819 00:14:45.770996 3054 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.41:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.41:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.0.0-a-440c7464d3.185d02c652d29f01 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.0.0-a-440c7464d3,UID:ci-4426.0.0-a-440c7464d3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.0.0-a-440c7464d3,},FirstTimestamp:2025-08-19 00:14:40.560676609 +0000 UTC m=+0.573307440,LastTimestamp:2025-08-19 00:14:40.560676609 +0000 UTC m=+0.573307440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.0.0-a-440c7464d3,}" Aug 19 00:14:45.830406 containerd[1864]: time="2025-08-19T00:14:45.830291593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-a-440c7464d3,Uid:9b6c17d03eb7e5aa70c75363bceb7650,Namespace:kube-system,Attempt:0,}" Aug 19 00:14:45.921136 containerd[1864]: time="2025-08-19T00:14:45.921106477Z" level=info msg="connecting to shim 11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2" address="unix:///run/containerd/s/e85ac6b2bf7102670dbc0d32dca78db4b31790adadbac83842f81e73317eab1d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:14:45.938360 systemd[1]: Started cri-containerd-11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2.scope - libcontainer container 11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2. Aug 19 00:14:46.062403 containerd[1864]: time="2025-08-19T00:14:46.062037946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-a-440c7464d3,Uid:efa0ef5413f8e2c6d966d8c05a1dd925,Namespace:kube-system,Attempt:0,} returns sandbox id \"11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2\"" Aug 19 00:14:46.108497 containerd[1864]: time="2025-08-19T00:14:46.108441599Z" level=info msg="CreateContainer within sandbox \"11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 00:14:46.162424 containerd[1864]: time="2025-08-19T00:14:46.162351725Z" level=info msg="connecting to shim 379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8" address="unix:///run/containerd/s/3294f8d5ae2bad4502da2f3a1afa0a009753eb76321419debe1446ff2d71f8b5" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:14:46.184354 systemd[1]: Started cri-containerd-379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8.scope - libcontainer container 
379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8. Aug 19 00:14:46.311209 containerd[1864]: time="2025-08-19T00:14:46.311152656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-a-440c7464d3,Uid:28649fa82ff9de2ea9240397aa36865a,Namespace:kube-system,Attempt:0,} returns sandbox id \"379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8\"" Aug 19 00:14:46.452022 containerd[1864]: time="2025-08-19T00:14:46.451915648Z" level=info msg="CreateContainer within sandbox \"379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 00:14:46.640039 kubelet[3054]: E0819 00:14:46.639996 3054 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.41:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.41:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 19 00:14:46.778010 kubelet[3054]: E0819 00:14:46.777967 3054 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.41:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-440c7464d3?timeout=10s\": dial tcp 10.200.20.41:6443: connect: connection refused" interval="6.4s" Aug 19 00:14:46.822847 containerd[1864]: time="2025-08-19T00:14:46.822804173Z" level=info msg="Container aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:14:46.960248 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1583700611.mount: Deactivated successfully. 
Aug 19 00:14:46.960886 containerd[1864]: time="2025-08-19T00:14:46.960851524Z" level=info msg="Container bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:14:46.968552 containerd[1864]: time="2025-08-19T00:14:46.968514355Z" level=info msg="connecting to shim 0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7" address="unix:///run/containerd/s/5444a6958665a8c912749da1893f90a0d19446b2b8788173c772db6531d085cb" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:14:46.989352 systemd[1]: Started cri-containerd-0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7.scope - libcontainer container 0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7. Aug 19 00:14:47.117853 containerd[1864]: time="2025-08-19T00:14:47.117747912Z" level=info msg="CreateContainer within sandbox \"11f2805f909bfd48e2285ca0b32e4d194e8e28e5d5409f3268d1d371e0a859a2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c\"" Aug 19 00:14:47.118765 containerd[1864]: time="2025-08-19T00:14:47.118745227Z" level=info msg="StartContainer for \"aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c\"" Aug 19 00:14:47.119655 containerd[1864]: time="2025-08-19T00:14:47.119634043Z" level=info msg="connecting to shim aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c" address="unix:///run/containerd/s/e85ac6b2bf7102670dbc0d32dca78db4b31790adadbac83842f81e73317eab1d" protocol=ttrpc version=3 Aug 19 00:14:47.139339 systemd[1]: Started cri-containerd-aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c.scope - libcontainer container aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c. 
Aug 19 00:14:47.213850 containerd[1864]: time="2025-08-19T00:14:47.213816665Z" level=info msg="StartContainer for \"aef2203c2d4633ba8b039afc8ae848908958498154df4abf51b720b1b61ee72c\" returns successfully" Aug 19 00:14:47.215993 containerd[1864]: time="2025-08-19T00:14:47.215929490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-a-440c7464d3,Uid:9b6c17d03eb7e5aa70c75363bceb7650,Namespace:kube-system,Attempt:0,} returns sandbox id \"0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7\"" Aug 19 00:14:47.260510 containerd[1864]: time="2025-08-19T00:14:47.260411267Z" level=info msg="CreateContainer within sandbox \"0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 00:14:47.306949 kubelet[3054]: E0819 00:14:47.306924 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:47.353686 containerd[1864]: time="2025-08-19T00:14:47.353658289Z" level=info msg="CreateContainer within sandbox \"379a0550e778bcff8723d9d0decfaa1b9462607bf3c979f84f1ae2564dbb39f8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354\"" Aug 19 00:14:47.354152 containerd[1864]: time="2025-08-19T00:14:47.354121349Z" level=info msg="StartContainer for \"bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354\"" Aug 19 00:14:47.354921 containerd[1864]: time="2025-08-19T00:14:47.354867986Z" level=info msg="connecting to shim bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354" address="unix:///run/containerd/s/3294f8d5ae2bad4502da2f3a1afa0a009753eb76321419debe1446ff2d71f8b5" protocol=ttrpc version=3 Aug 19 00:14:47.370338 systemd[1]: Started 
cri-containerd-bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354.scope - libcontainer container bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354. Aug 19 00:14:47.420592 containerd[1864]: time="2025-08-19T00:14:47.420369042Z" level=info msg="StartContainer for \"bfcba63f36d0ca321d7f7a68840e2dce5f586089fa0f4d452fa1e1b833966354\" returns successfully" Aug 19 00:14:47.519604 containerd[1864]: time="2025-08-19T00:14:47.519546520Z" level=info msg="Container 4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:14:47.549454 kubelet[3054]: I0819 00:14:47.549410 3054 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3" Aug 19 00:14:47.653073 containerd[1864]: time="2025-08-19T00:14:47.652960602Z" level=info msg="CreateContainer within sandbox \"0370136abe79649c54e482b6667fbab7e075271d98ef7795e0e535f56763c0c7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264\"" Aug 19 00:14:47.653814 containerd[1864]: time="2025-08-19T00:14:47.653785792Z" level=info msg="StartContainer for \"4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264\"" Aug 19 00:14:47.654825 containerd[1864]: time="2025-08-19T00:14:47.654780547Z" level=info msg="connecting to shim 4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264" address="unix:///run/containerd/s/5444a6958665a8c912749da1893f90a0d19446b2b8788173c772db6531d085cb" protocol=ttrpc version=3 Aug 19 00:14:47.679581 systemd[1]: Started cri-containerd-4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264.scope - libcontainer container 4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264. 
Aug 19 00:14:47.765617 containerd[1864]: time="2025-08-19T00:14:47.763156745Z" level=info msg="StartContainer for \"4fb7f0745ac0b3fb8c9d2834ae9ec304a8ed8e21357470025aa7265faf0c2264\" returns successfully"
Aug 19 00:14:48.314300 kubelet[3054]: E0819 00:14:48.314272 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:48.316040 kubelet[3054]: E0819 00:14:48.315941 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:48.316432 kubelet[3054]: E0819 00:14:48.316415 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.109663 kubelet[3054]: I0819 00:14:49.109628 3054 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.109663 kubelet[3054]: E0819 00:14:49.109662 3054 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4426.0.0-a-440c7464d3\": node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.144255 kubelet[3054]: E0819 00:14:49.144218 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.244491 kubelet[3054]: E0819 00:14:49.244457 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.316432 kubelet[3054]: E0819 00:14:49.316223 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.317285 kubelet[3054]: E0819 00:14:49.316372 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.317285 kubelet[3054]: E0819 00:14:49.316809 3054 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-a-440c7464d3\" not found" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.345333 kubelet[3054]: E0819 00:14:49.345313 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.446038 kubelet[3054]: E0819 00:14:49.445782 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.546376 kubelet[3054]: E0819 00:14:49.546333 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.647090 kubelet[3054]: E0819 00:14:49.647055 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.747562 kubelet[3054]: E0819 00:14:49.747468 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.848037 kubelet[3054]: E0819 00:14:49.848012 3054 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:49.969249 kubelet[3054]: I0819 00:14:49.969070 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.972494 kubelet[3054]: E0819 00:14:49.972466 3054 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.972985 kubelet[3054]: I0819 00:14:49.972581 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.974058 kubelet[3054]: E0819 00:14:49.973986 3054 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.975261 kubelet[3054]: I0819 00:14:49.974168 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:49.977643 kubelet[3054]: E0819 00:14:49.977617 3054 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.0.0-a-440c7464d3\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:50.415068 kubelet[3054]: I0819 00:14:50.316117 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:50.415068 kubelet[3054]: I0819 00:14:50.322502 3054 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:50.415068 kubelet[3054]: I0819 00:14:50.411435 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:50.417576 kubelet[3054]: I0819 00:14:50.417544 3054 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:50.560758 kubelet[3054]: I0819 00:14:50.560714 3054 apiserver.go:52] "Watching apiserver"
Aug 19 00:14:50.569252 kubelet[3054]: I0819 00:14:50.569224 3054 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 19 00:14:51.308594 kubelet[3054]: I0819 00:14:51.308472 3054 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3" podStartSLOduration=1.3084470320000001 podStartE2EDuration="1.308447032s" podCreationTimestamp="2025-08-19 00:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:14:51.308424832 +0000 UTC m=+11.321055695" watchObservedRunningTime="2025-08-19 00:14:51.308447032 +0000 UTC m=+11.321077863"
Aug 19 00:14:51.409746 systemd[1]: Reload requested from client PID 3340 ('systemctl') (unit session-9.scope)...
Aug 19 00:14:51.409758 systemd[1]: Reloading...
Aug 19 00:14:51.453424 kubelet[3054]: I0819 00:14:51.453172 3054 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:51.460133 kubelet[3054]: I0819 00:14:51.459907 3054 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:51.460953 kubelet[3054]: I0819 00:14:51.460915 3054 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3" podStartSLOduration=1.460902968 podStartE2EDuration="1.460902968s" podCreationTimestamp="2025-08-19 00:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:14:51.31663951 +0000 UTC m=+11.329270341" watchObservedRunningTime="2025-08-19 00:14:51.460902968 +0000 UTC m=+11.473533799"
Aug 19 00:14:51.482327 zram_generator::config[3387]: No configuration found.
Aug 19 00:14:51.637197 systemd[1]: Reloading finished in 227 ms.
Aug 19 00:14:51.653615 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 19 00:14:51.668791 systemd[1]: kubelet.service: Deactivated successfully.
Aug 19 00:14:51.669028 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 19 00:14:51.669084 systemd[1]: kubelet.service: Consumed 823ms CPU time, 125.9M memory peak.
Aug 19 00:14:51.670790 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 19 00:14:51.781267 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 19 00:14:51.785295 (kubelet)[3451]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 19 00:14:51.860615 kubelet[3451]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 19 00:14:51.860615 kubelet[3451]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Aug 19 00:14:51.860615 kubelet[3451]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 19 00:14:51.860893 kubelet[3451]: I0819 00:14:51.860641 3451 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 19 00:14:51.867637 kubelet[3451]: I0819 00:14:51.867275 3451 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Aug 19 00:14:51.867637 kubelet[3451]: I0819 00:14:51.867313 3451 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 19 00:14:51.867637 kubelet[3451]: I0819 00:14:51.867579 3451 server.go:956] "Client rotation is on, will bootstrap in background"
Aug 19 00:14:51.868945 kubelet[3451]: I0819 00:14:51.868921 3451 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Aug 19 00:14:51.870673 kubelet[3451]: I0819 00:14:51.870519 3451 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 19 00:14:51.874299 kubelet[3451]: I0819 00:14:51.873817 3451 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 19 00:14:51.877152 kubelet[3451]: I0819 00:14:51.877078 3451 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 19 00:14:51.877500 kubelet[3451]: I0819 00:14:51.877462 3451 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 19 00:14:51.877622 kubelet[3451]: I0819 00:14:51.877502 3451 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-a-440c7464d3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 19 00:14:51.877699 kubelet[3451]: I0819 00:14:51.877624 3451 topology_manager.go:138] "Creating topology manager with none policy"
Aug 19 00:14:51.877699 kubelet[3451]: I0819 00:14:51.877631 3451 container_manager_linux.go:303] "Creating device plugin manager"
Aug 19 00:14:51.877699 kubelet[3451]: I0819 00:14:51.877663 3451 state_mem.go:36] "Initialized new in-memory state store"
Aug 19 00:14:51.877854 kubelet[3451]: I0819 00:14:51.877774 3451 kubelet.go:480] "Attempting to sync node with API server"
Aug 19 00:14:51.877854 kubelet[3451]: I0819 00:14:51.877785 3451 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 19 00:14:51.877854 kubelet[3451]: I0819 00:14:51.877822 3451 kubelet.go:386] "Adding apiserver pod source"
Aug 19 00:14:51.877854 kubelet[3451]: I0819 00:14:51.877832 3451 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 19 00:14:51.880479 kubelet[3451]: I0819 00:14:51.880463 3451 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Aug 19 00:14:51.882909 kubelet[3451]: I0819 00:14:51.882415 3451 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Aug 19 00:14:51.884017 kubelet[3451]: I0819 00:14:51.884000 3451 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Aug 19 00:14:51.884110 kubelet[3451]: I0819 00:14:51.884102 3451 server.go:1289] "Started kubelet"
Aug 19 00:14:51.887417 kubelet[3451]: I0819 00:14:51.886791 3451 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Aug 19 00:14:51.889388 kubelet[3451]: I0819 00:14:51.888905 3451 server.go:317] "Adding debug handlers to kubelet server"
Aug 19 00:14:51.892760 kubelet[3451]: I0819 00:14:51.892733 3451 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 19 00:14:51.893109 kubelet[3451]: I0819 00:14:51.893092 3451 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 19 00:14:51.894502 kubelet[3451]: I0819 00:14:51.894383 3451 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 19 00:14:51.894567 kubelet[3451]: I0819 00:14:51.894552 3451 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 19 00:14:51.894613 kubelet[3451]: I0819 00:14:51.894596 3451 volume_manager.go:297] "Starting Kubelet Volume Manager"
Aug 19 00:14:51.894667 kubelet[3451]: I0819 00:14:51.894655 3451 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Aug 19 00:14:51.894738 kubelet[3451]: I0819 00:14:51.894728 3451 reconciler.go:26] "Reconciler: start to sync state"
Aug 19 00:14:51.895476 kubelet[3451]: E0819 00:14:51.895114 3451 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-440c7464d3\" not found"
Aug 19 00:14:51.897430 kubelet[3451]: I0819 00:14:51.897396 3451 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 19 00:14:51.900114 kubelet[3451]: I0819 00:14:51.900085 3451 factory.go:223] Registration of the containerd container factory successfully
Aug 19 00:14:51.900114 kubelet[3451]: I0819 00:14:51.900100 3451 factory.go:223] Registration of the systemd container factory successfully
Aug 19 00:14:51.903454 kubelet[3451]: I0819 00:14:51.903422 3451 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Aug 19 00:14:51.904316 kubelet[3451]: I0819 00:14:51.904295 3451 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Aug 19 00:14:51.904465 kubelet[3451]: I0819 00:14:51.904398 3451 status_manager.go:230] "Starting to sync pod status with apiserver"
Aug 19 00:14:51.904465 kubelet[3451]: I0819 00:14:51.904418 3451 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Aug 19 00:14:51.904465 kubelet[3451]: I0819 00:14:51.904425 3451 kubelet.go:2436] "Starting kubelet main sync loop"
Aug 19 00:14:51.905095 kubelet[3451]: E0819 00:14:51.904459 3451 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.947980 3451 cpu_manager.go:221] "Starting CPU manager" policy="none"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.947995 3451 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948016 3451 state_mem.go:36] "Initialized new in-memory state store"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948106 3451 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948114 3451 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948126 3451 policy_none.go:49] "None policy: Start"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948133 3451 memory_manager.go:186] "Starting memorymanager" policy="None"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948140 3451 state_mem.go:35] "Initializing new in-memory state store"
Aug 19 00:14:51.948252 kubelet[3451]: I0819 00:14:51.948194 3451 state_mem.go:75] "Updated machine memory state"
Aug 19 00:14:51.951998 kubelet[3451]: E0819 00:14:51.951766 3451 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Aug 19 00:14:51.953105 kubelet[3451]: I0819 00:14:51.953084 3451 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 19 00:14:51.953129 kubelet[3451]: I0819 00:14:51.953109 3451 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 19 00:14:51.953707 kubelet[3451]: I0819 00:14:51.953689 3451 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 19 00:14:51.956559 kubelet[3451]: E0819 00:14:51.956033 3451 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Aug 19 00:14:52.005738 kubelet[3451]: I0819 00:14:52.005704 3451 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.006378 kubelet[3451]: I0819 00:14:52.006012 3451 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.006574 kubelet[3451]: I0819 00:14:52.006549 3451 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.015606 kubelet[3451]: I0819 00:14:52.015169 3451 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:52.015665 kubelet[3451]: E0819 00:14:52.015630 3451 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.0.0-a-440c7464d3\" already exists" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.015798 kubelet[3451]: I0819 00:14:52.015780 3451 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:52.015988 kubelet[3451]: E0819 00:14:52.015944 3451 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" already exists" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.016189 kubelet[3451]: I0819 00:14:52.016171 3451 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:52.016250 kubelet[3451]: E0819 00:14:52.016206 3451 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" already exists" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.054885 kubelet[3451]: I0819 00:14:52.054863 3451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.063179 kubelet[3451]: I0819 00:14:52.063114 3451 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.063275 kubelet[3451]: I0819 00:14:52.063218 3451 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.095600 kubelet[3451]: I0819 00:14:52.095554 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.196678 kubelet[3451]: I0819 00:14:52.196592 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197304 kubelet[3451]: I0819 00:14:52.197183 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197304 kubelet[3451]: I0819 00:14:52.197202 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197304 kubelet[3451]: I0819 00:14:52.197214 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197485 kubelet[3451]: I0819 00:14:52.197376 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197485 kubelet[3451]: I0819 00:14:52.197404 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9b6c17d03eb7e5aa70c75363bceb7650-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-a-440c7464d3\" (UID: \"9b6c17d03eb7e5aa70c75363bceb7650\") " pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197485 kubelet[3451]: I0819 00:14:52.197448 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/efa0ef5413f8e2c6d966d8c05a1dd925-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" (UID: \"efa0ef5413f8e2c6d966d8c05a1dd925\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.197485 kubelet[3451]: I0819 00:14:52.197459 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/28649fa82ff9de2ea9240397aa36865a-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-440c7464d3\" (UID: \"28649fa82ff9de2ea9240397aa36865a\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.878797 kubelet[3451]: I0819 00:14:52.878756 3451 apiserver.go:52] "Watching apiserver"
Aug 19 00:14:52.895736 kubelet[3451]: I0819 00:14:52.895705 3451 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 19 00:14:52.935817 kubelet[3451]: I0819 00:14:52.935742 3451 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.937341 kubelet[3451]: I0819 00:14:52.935970 3451 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.948383 kubelet[3451]: I0819 00:14:52.948364 3451 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:52.948740 kubelet[3451]: E0819 00:14:52.948627 3451 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.0.0-a-440c7464d3\" already exists" pod="kube-system/kube-scheduler-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.949107 kubelet[3451]: I0819 00:14:52.949075 3451 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 19 00:14:52.949166 kubelet[3451]: E0819 00:14:52.949112 3451 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-a-440c7464d3\" already exists" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3"
Aug 19 00:14:52.979730 kubelet[3451]: I0819 00:14:52.979645 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.0.0-a-440c7464d3" podStartSLOduration=1.979636642 podStartE2EDuration="1.979636642s" podCreationTimestamp="2025-08-19 00:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:14:52.979515655 +0000 UTC m=+1.191228000" watchObservedRunningTime="2025-08-19 00:14:52.979636642 +0000 UTC m=+1.191348987"
Aug 19 00:14:58.104823 kubelet[3451]: I0819 00:14:58.104790 3451 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Aug 19 00:14:58.105201 containerd[1864]: time="2025-08-19T00:14:58.105066564Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Aug 19 00:14:58.105433 kubelet[3451]: I0819 00:14:58.105253 3451 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Aug 19 00:14:58.966427 systemd[1]: Created slice kubepods-besteffort-pod2b2468c9_3648_4591_a65d_f73667836de0.slice - libcontainer container kubepods-besteffort-pod2b2468c9_3648_4591_a65d_f73667836de0.slice.
Aug 19 00:14:59.040852 kubelet[3451]: I0819 00:14:59.040830 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2b2468c9-3648-4591-a65d-f73667836de0-kube-proxy\") pod \"kube-proxy-7lgwp\" (UID: \"2b2468c9-3648-4591-a65d-f73667836de0\") " pod="kube-system/kube-proxy-7lgwp"
Aug 19 00:14:59.041029 kubelet[3451]: I0819 00:14:59.041008 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlxd\" (UniqueName: \"kubernetes.io/projected/2b2468c9-3648-4591-a65d-f73667836de0-kube-api-access-zhlxd\") pod \"kube-proxy-7lgwp\" (UID: \"2b2468c9-3648-4591-a65d-f73667836de0\") " pod="kube-system/kube-proxy-7lgwp"
Aug 19 00:14:59.041029 kubelet[3451]: I0819 00:14:59.041061 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2b2468c9-3648-4591-a65d-f73667836de0-xtables-lock\") pod \"kube-proxy-7lgwp\" (UID: \"2b2468c9-3648-4591-a65d-f73667836de0\") " pod="kube-system/kube-proxy-7lgwp"
Aug 19 00:14:59.041029 kubelet[3451]: I0819 00:14:59.041076 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b2468c9-3648-4591-a65d-f73667836de0-lib-modules\") pod \"kube-proxy-7lgwp\" (UID: \"2b2468c9-3648-4591-a65d-f73667836de0\") " pod="kube-system/kube-proxy-7lgwp"
Aug 19 00:14:59.235697 systemd[1]: Created slice kubepods-besteffort-pod58bba165_b5ec_4c29_9fe5_023ab0397c1a.slice - libcontainer container kubepods-besteffort-pod58bba165_b5ec_4c29_9fe5_023ab0397c1a.slice.
Aug 19 00:14:59.242650 kubelet[3451]: I0819 00:14:59.242625 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/58bba165-b5ec-4c29-9fe5-023ab0397c1a-var-lib-calico\") pod \"tigera-operator-747864d56d-4b89l\" (UID: \"58bba165-b5ec-4c29-9fe5-023ab0397c1a\") " pod="tigera-operator/tigera-operator-747864d56d-4b89l"
Aug 19 00:14:59.242917 kubelet[3451]: I0819 00:14:59.242901 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq8g\" (UniqueName: \"kubernetes.io/projected/58bba165-b5ec-4c29-9fe5-023ab0397c1a-kube-api-access-gvq8g\") pod \"tigera-operator-747864d56d-4b89l\" (UID: \"58bba165-b5ec-4c29-9fe5-023ab0397c1a\") " pod="tigera-operator/tigera-operator-747864d56d-4b89l"
Aug 19 00:14:59.279127 containerd[1864]: time="2025-08-19T00:14:59.279102038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7lgwp,Uid:2b2468c9-3648-4591-a65d-f73667836de0,Namespace:kube-system,Attempt:0,}"
Aug 19 00:14:59.320397 containerd[1864]: time="2025-08-19T00:14:59.320363544Z" level=info msg="connecting to shim 49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7" address="unix:///run/containerd/s/cdc726e35772e41be6231b70b2766a174de2e780fab9aa4ce25087a306c8ea33" namespace=k8s.io protocol=ttrpc version=3
Aug 19 00:14:59.338356 systemd[1]: Started cri-containerd-49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7.scope - libcontainer container 49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7.
Aug 19 00:14:59.365656 containerd[1864]: time="2025-08-19T00:14:59.365627322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7lgwp,Uid:2b2468c9-3648-4591-a65d-f73667836de0,Namespace:kube-system,Attempt:0,} returns sandbox id \"49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7\""
Aug 19 00:14:59.373985 containerd[1864]: time="2025-08-19T00:14:59.373954509Z" level=info msg="CreateContainer within sandbox \"49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Aug 19 00:14:59.396411 containerd[1864]: time="2025-08-19T00:14:59.396109032Z" level=info msg="Container 0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:14:59.412901 containerd[1864]: time="2025-08-19T00:14:59.412810599Z" level=info msg="CreateContainer within sandbox \"49791fcc39850ddec9ef288e27cdf086d3717af972f99ef72c2118ccd128d7c7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72\""
Aug 19 00:14:59.413426 containerd[1864]: time="2025-08-19T00:14:59.413383680Z" level=info msg="StartContainer for \"0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72\""
Aug 19 00:14:59.414554 containerd[1864]: time="2025-08-19T00:14:59.414505266Z" level=info msg="connecting to shim 0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72" address="unix:///run/containerd/s/cdc726e35772e41be6231b70b2766a174de2e780fab9aa4ce25087a306c8ea33" protocol=ttrpc version=3
Aug 19 00:14:59.428349 systemd[1]: Started cri-containerd-0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72.scope - libcontainer container 0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72.
Aug 19 00:14:59.462266 containerd[1864]: time="2025-08-19T00:14:59.462214126Z" level=info msg="StartContainer for \"0d60955e15bbf1e921e9114d51f631bba1a0ce35ad09396c12022c878e8bac72\" returns successfully"
Aug 19 00:14:59.540383 containerd[1864]: time="2025-08-19T00:14:59.540119327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-4b89l,Uid:58bba165-b5ec-4c29-9fe5-023ab0397c1a,Namespace:tigera-operator,Attempt:0,}"
Aug 19 00:14:59.584809 containerd[1864]: time="2025-08-19T00:14:59.584743926Z" level=info msg="connecting to shim e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a" address="unix:///run/containerd/s/52e0447a3b42ea4554eaddbf842f1815a912369dc6a57e39c9dd09cb364fef31" namespace=k8s.io protocol=ttrpc version=3
Aug 19 00:14:59.608361 systemd[1]: Started cri-containerd-e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a.scope - libcontainer container e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a.
Aug 19 00:14:59.633472 containerd[1864]: time="2025-08-19T00:14:59.633438863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-4b89l,Uid:58bba165-b5ec-4c29-9fe5-023ab0397c1a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a\""
Aug 19 00:14:59.634896 containerd[1864]: time="2025-08-19T00:14:59.634861042Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Aug 19 00:15:02.118030 kubelet[3451]: I0819 00:15:02.117977 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7lgwp" podStartSLOduration=4.117964635 podStartE2EDuration="4.117964635s" podCreationTimestamp="2025-08-19 00:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:14:59.959142155 +0000 UTC m=+8.170854500" watchObservedRunningTime="2025-08-19 00:15:02.117964635 +0000 UTC m=+10.329676980"
Aug 19 00:15:02.624520 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount323790057.mount: Deactivated successfully.
Aug 19 00:15:03.573376 containerd[1864]: time="2025-08-19T00:15:03.573326041Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:03.576253 containerd[1864]: time="2025-08-19T00:15:03.576208103Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610"
Aug 19 00:15:03.579823 containerd[1864]: time="2025-08-19T00:15:03.579779074Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:03.583837 containerd[1864]: time="2025-08-19T00:15:03.583798067Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:03.584290 containerd[1864]: time="2025-08-19T00:15:03.584108628Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 3.949198896s"
Aug 19 00:15:03.584290 containerd[1864]: time="2025-08-19T00:15:03.584135373Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\""
Aug 19 00:15:03.592915 containerd[1864]: time="2025-08-19T00:15:03.592877179Z" level=info msg="CreateContainer within sandbox \"e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Aug 19 00:15:03.617653 containerd[1864]: time="2025-08-19T00:15:03.617484549Z" level=info msg="Container 9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:15:03.664452 containerd[1864]: time="2025-08-19T00:15:03.664419461Z" level=info msg="CreateContainer within sandbox \"e76ec1b7d65587e5d86497c3e89151fb932c9eb4cdc3e3e24095918a4e74391a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7\""
Aug 19 00:15:03.665016 containerd[1864]: time="2025-08-19T00:15:03.664986102Z" level=info msg="StartContainer for \"9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7\""
Aug 19 00:15:03.665836 containerd[1864]: time="2025-08-19T00:15:03.665789846Z" level=info msg="connecting to shim 9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7" address="unix:///run/containerd/s/52e0447a3b42ea4554eaddbf842f1815a912369dc6a57e39c9dd09cb364fef31" protocol=ttrpc version=3
Aug 19 00:15:03.682352 systemd[1]: Started cri-containerd-9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7.scope - libcontainer container 9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7.
Aug 19 00:15:03.708916 containerd[1864]: time="2025-08-19T00:15:03.708880178Z" level=info msg="StartContainer for \"9f60f0d242c2a21350c903fea709aff15de36c7c7edd99efc3314d282b1be0b7\" returns successfully"
Aug 19 00:15:08.841951 sudo[2332]: pam_unix(sudo:session): session closed for user root
Aug 19 00:15:08.912340 sshd[2331]: Connection closed by 10.200.16.10 port 60008
Aug 19 00:15:08.912637 sshd-session[2328]: pam_unix(sshd:session): session closed for user core
Aug 19 00:15:08.917348 systemd[1]: session-9.scope: Deactivated successfully.
Aug 19 00:15:08.919275 systemd[1]: session-9.scope: Consumed 3.210s CPU time, 221.1M memory peak.
Aug 19 00:15:08.921700 systemd[1]: sshd@6-10.200.20.41:22-10.200.16.10:60008.service: Deactivated successfully. Aug 19 00:15:08.923906 systemd-logind[1846]: Session 9 logged out. Waiting for processes to exit. Aug 19 00:15:08.925180 systemd-logind[1846]: Removed session 9. Aug 19 00:15:12.336206 kubelet[3451]: I0819 00:15:12.336151 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-4b89l" podStartSLOduration=9.385702076 podStartE2EDuration="13.33613321s" podCreationTimestamp="2025-08-19 00:14:59 +0000 UTC" firstStartedPulling="2025-08-19 00:14:59.634435805 +0000 UTC m=+7.846148150" lastFinishedPulling="2025-08-19 00:15:03.584866939 +0000 UTC m=+11.796579284" observedRunningTime="2025-08-19 00:15:03.967487238 +0000 UTC m=+12.179199591" watchObservedRunningTime="2025-08-19 00:15:12.33613321 +0000 UTC m=+20.547845563" Aug 19 00:15:12.349801 systemd[1]: Created slice kubepods-besteffort-podf9771acd_192a_4514_bf88_e3278b09c57f.slice - libcontainer container kubepods-besteffort-podf9771acd_192a_4514_bf88_e3278b09c57f.slice. 
Aug 19 00:15:12.424803 kubelet[3451]: I0819 00:15:12.424759 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kn5r\" (UniqueName: \"kubernetes.io/projected/f9771acd-192a-4514-bf88-e3278b09c57f-kube-api-access-8kn5r\") pod \"calico-typha-6db5cf5bd6-r7fgg\" (UID: \"f9771acd-192a-4514-bf88-e3278b09c57f\") " pod="calico-system/calico-typha-6db5cf5bd6-r7fgg" Aug 19 00:15:12.424803 kubelet[3451]: I0819 00:15:12.424800 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f9771acd-192a-4514-bf88-e3278b09c57f-typha-certs\") pod \"calico-typha-6db5cf5bd6-r7fgg\" (UID: \"f9771acd-192a-4514-bf88-e3278b09c57f\") " pod="calico-system/calico-typha-6db5cf5bd6-r7fgg" Aug 19 00:15:12.424803 kubelet[3451]: I0819 00:15:12.424814 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9771acd-192a-4514-bf88-e3278b09c57f-tigera-ca-bundle\") pod \"calico-typha-6db5cf5bd6-r7fgg\" (UID: \"f9771acd-192a-4514-bf88-e3278b09c57f\") " pod="calico-system/calico-typha-6db5cf5bd6-r7fgg" Aug 19 00:15:12.455709 systemd[1]: Created slice kubepods-besteffort-pod50143929_c239_498b_8a4e_d351083d2e0c.slice - libcontainer container kubepods-besteffort-pod50143929_c239_498b_8a4e_d351083d2e0c.slice. 
Aug 19 00:15:12.525322 kubelet[3451]: I0819 00:15:12.525275 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-cni-log-dir\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525322 kubelet[3451]: I0819 00:15:12.525323 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-xtables-lock\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525467 kubelet[3451]: I0819 00:15:12.525340 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-cni-bin-dir\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525467 kubelet[3451]: I0819 00:15:12.525349 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-flexvol-driver-host\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525467 kubelet[3451]: I0819 00:15:12.525359 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-var-run-calico\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525467 kubelet[3451]: I0819 00:15:12.525370 3451 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/50143929-c239-498b-8a4e-d351083d2e0c-node-certs\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525467 kubelet[3451]: I0819 00:15:12.525379 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-policysync\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525550 kubelet[3451]: I0819 00:15:12.525390 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50143929-c239-498b-8a4e-d351083d2e0c-tigera-ca-bundle\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525550 kubelet[3451]: I0819 00:15:12.525403 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-var-lib-calico\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525550 kubelet[3451]: I0819 00:15:12.525413 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-lib-modules\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525550 kubelet[3451]: I0819 00:15:12.525423 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-grtmz\" (UniqueName: \"kubernetes.io/projected/50143929-c239-498b-8a4e-d351083d2e0c-kube-api-access-grtmz\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.525550 kubelet[3451]: I0819 00:15:12.525433 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/50143929-c239-498b-8a4e-d351083d2e0c-cni-net-dir\") pod \"calico-node-kq8gb\" (UID: \"50143929-c239-498b-8a4e-d351083d2e0c\") " pod="calico-system/calico-node-kq8gb" Aug 19 00:15:12.599326 kubelet[3451]: E0819 00:15:12.598943 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:12.626867 kubelet[3451]: I0819 00:15:12.626476 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82bb62c5-1662-42fe-899a-c8b29c1bb13e-socket-dir\") pod \"csi-node-driver-c697k\" (UID: \"82bb62c5-1662-42fe-899a-c8b29c1bb13e\") " pod="calico-system/csi-node-driver-c697k" Aug 19 00:15:12.627200 kubelet[3451]: I0819 00:15:12.627141 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/82bb62c5-1662-42fe-899a-c8b29c1bb13e-varrun\") pod \"csi-node-driver-c697k\" (UID: \"82bb62c5-1662-42fe-899a-c8b29c1bb13e\") " pod="calico-system/csi-node-driver-c697k" Aug 19 00:15:12.627492 kubelet[3451]: I0819 00:15:12.627469 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpq9h\" (UniqueName: 
\"kubernetes.io/projected/82bb62c5-1662-42fe-899a-c8b29c1bb13e-kube-api-access-lpq9h\") pod \"csi-node-driver-c697k\" (UID: \"82bb62c5-1662-42fe-899a-c8b29c1bb13e\") " pod="calico-system/csi-node-driver-c697k" Aug 19 00:15:12.627882 kubelet[3451]: E0819 00:15:12.627837 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.627882 kubelet[3451]: W0819 00:15:12.627852 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.628032 kubelet[3451]: E0819 00:15:12.627966 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.628269 kubelet[3451]: E0819 00:15:12.628237 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.628269 kubelet[3451]: W0819 00:15:12.628248 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.628269 kubelet[3451]: E0819 00:15:12.628258 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Aug 19 00:15:12.635387 kubelet[3451]: I0819 00:15:12.635334 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82bb62c5-1662-42fe-899a-c8b29c1bb13e-kubelet-dir\") pod \"csi-node-driver-c697k\" (UID: \"82bb62c5-1662-42fe-899a-c8b29c1bb13e\") " pod="calico-system/csi-node-driver-c697k"
Aug 19 00:15:12.640344 kubelet[3451]: I0819 00:15:12.640326 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82bb62c5-1662-42fe-899a-c8b29c1bb13e-registration-dir\") pod \"csi-node-driver-c697k\" (UID: \"82bb62c5-1662-42fe-899a-c8b29c1bb13e\") " pod="calico-system/csi-node-driver-c697k"
Aug 19 00:15:12.646663 kubelet[3451]: E0819 00:15:12.646581 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.646663 kubelet[3451]: W0819 00:15:12.646596 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.646663 kubelet[3451]: E0819 00:15:12.646606 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 19 00:15:12.647299 kubelet[3451]: E0819 00:15:12.647213 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.647716 kubelet[3451]: W0819 00:15:12.647422 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.647716 kubelet[3451]: E0819 00:15:12.647464 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.648660 kubelet[3451]: E0819 00:15:12.648635 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.648889 kubelet[3451]: W0819 00:15:12.648825 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.648889 kubelet[3451]: E0819 00:15:12.648845 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.649435 kubelet[3451]: E0819 00:15:12.649349 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.649435 kubelet[3451]: W0819 00:15:12.649363 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.649435 kubelet[3451]: E0819 00:15:12.649376 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.650402 kubelet[3451]: E0819 00:15:12.650380 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.650532 kubelet[3451]: W0819 00:15:12.650462 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.650532 kubelet[3451]: E0819 00:15:12.650477 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.650749 kubelet[3451]: E0819 00:15:12.650737 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.650864 kubelet[3451]: W0819 00:15:12.650806 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.650864 kubelet[3451]: E0819 00:15:12.650821 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.651043 kubelet[3451]: E0819 00:15:12.651032 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.651233 kubelet[3451]: W0819 00:15:12.651099 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.651233 kubelet[3451]: E0819 00:15:12.651113 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.651525 kubelet[3451]: E0819 00:15:12.651438 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.651525 kubelet[3451]: W0819 00:15:12.651449 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.651525 kubelet[3451]: E0819 00:15:12.651458 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.651890 kubelet[3451]: E0819 00:15:12.651749 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.651890 kubelet[3451]: W0819 00:15:12.651760 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.651890 kubelet[3451]: E0819 00:15:12.651769 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.652319 kubelet[3451]: E0819 00:15:12.652180 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.652319 kubelet[3451]: W0819 00:15:12.652201 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.652319 kubelet[3451]: E0819 00:15:12.652213 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.652749 kubelet[3451]: E0819 00:15:12.652671 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.652749 kubelet[3451]: W0819 00:15:12.652683 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.652749 kubelet[3451]: E0819 00:15:12.652693 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.653064 kubelet[3451]: E0819 00:15:12.653027 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.653064 kubelet[3451]: W0819 00:15:12.653041 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.653187 kubelet[3451]: E0819 00:15:12.653153 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.656001 containerd[1864]: time="2025-08-19T00:15:12.655896551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6db5cf5bd6-r7fgg,Uid:f9771acd-192a-4514-bf88-e3278b09c57f,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:12.662166 kubelet[3451]: E0819 00:15:12.662145 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.662166 kubelet[3451]: W0819 00:15:12.662161 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.662280 kubelet[3451]: E0819 00:15:12.662173 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.706993 containerd[1864]: time="2025-08-19T00:15:12.706943748Z" level=info msg="connecting to shim 7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0" address="unix:///run/containerd/s/629da28644244dcf40c33186f9c5b0ed6593886b45df5ea4369259ccdf50e61b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:15:12.729377 systemd[1]: Started cri-containerd-7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0.scope - libcontainer container 7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0. Aug 19 00:15:12.746813 kubelet[3451]: E0819 00:15:12.746668 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.746813 kubelet[3451]: W0819 00:15:12.746688 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.746813 kubelet[3451]: E0819 00:15:12.746706 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.747344 kubelet[3451]: E0819 00:15:12.747115 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.747702 kubelet[3451]: W0819 00:15:12.747207 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.747702 kubelet[3451]: E0819 00:15:12.747687 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.748206 kubelet[3451]: E0819 00:15:12.748108 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.748206 kubelet[3451]: W0819 00:15:12.748150 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.748206 kubelet[3451]: E0819 00:15:12.748163 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.748869 kubelet[3451]: E0819 00:15:12.748831 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.748869 kubelet[3451]: W0819 00:15:12.748845 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.749188 kubelet[3451]: E0819 00:15:12.749003 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.749844 kubelet[3451]: E0819 00:15:12.749721 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.749844 kubelet[3451]: W0819 00:15:12.749735 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.749844 kubelet[3451]: E0819 00:15:12.749746 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.751531 kubelet[3451]: E0819 00:15:12.751386 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.751531 kubelet[3451]: W0819 00:15:12.751405 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.751531 kubelet[3451]: E0819 00:15:12.751418 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.751899 kubelet[3451]: E0819 00:15:12.751855 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.752146 kubelet[3451]: W0819 00:15:12.752128 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.752323 kubelet[3451]: E0819 00:15:12.752210 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.753001 kubelet[3451]: E0819 00:15:12.752688 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.753001 kubelet[3451]: W0819 00:15:12.752702 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.753001 kubelet[3451]: E0819 00:15:12.752713 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.753563 kubelet[3451]: E0819 00:15:12.753463 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.753741 kubelet[3451]: W0819 00:15:12.753638 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.753826 kubelet[3451]: E0819 00:15:12.753804 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.754389 kubelet[3451]: E0819 00:15:12.754261 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.754389 kubelet[3451]: W0819 00:15:12.754359 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.754389 kubelet[3451]: E0819 00:15:12.754375 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.755471 kubelet[3451]: E0819 00:15:12.755270 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.755471 kubelet[3451]: W0819 00:15:12.755285 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.755471 kubelet[3451]: E0819 00:15:12.755297 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.756642 kubelet[3451]: E0819 00:15:12.756536 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.756913 kubelet[3451]: W0819 00:15:12.756718 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.756913 kubelet[3451]: E0819 00:15:12.756737 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.757130 kubelet[3451]: E0819 00:15:12.757115 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.757547 kubelet[3451]: W0819 00:15:12.757452 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.757547 kubelet[3451]: E0819 00:15:12.757485 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.758140 kubelet[3451]: E0819 00:15:12.758017 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.758140 kubelet[3451]: W0819 00:15:12.758033 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.758140 kubelet[3451]: E0819 00:15:12.758044 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.758877 kubelet[3451]: E0819 00:15:12.758731 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.758877 kubelet[3451]: W0819 00:15:12.758744 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.758877 kubelet[3451]: E0819 00:15:12.758754 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.760327 kubelet[3451]: E0819 00:15:12.760154 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.760327 kubelet[3451]: W0819 00:15:12.760201 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.760327 kubelet[3451]: E0819 00:15:12.760217 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.760744 containerd[1864]: time="2025-08-19T00:15:12.760561438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kq8gb,Uid:50143929-c239-498b-8a4e-d351083d2e0c,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:12.760949 kubelet[3451]: E0819 00:15:12.760935 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.761587 kubelet[3451]: W0819 00:15:12.761212 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.761587 kubelet[3451]: E0819 00:15:12.761252 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.762339 kubelet[3451]: E0819 00:15:12.762304 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.762606 kubelet[3451]: W0819 00:15:12.762543 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.763039 kubelet[3451]: E0819 00:15:12.762777 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.764025 kubelet[3451]: E0819 00:15:12.763502 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.764237 kubelet[3451]: W0819 00:15:12.764160 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.764237 kubelet[3451]: E0819 00:15:12.764193 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.765382 kubelet[3451]: E0819 00:15:12.765362 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.766206 kubelet[3451]: W0819 00:15:12.766069 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.766206 kubelet[3451]: E0819 00:15:12.766089 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.767863 kubelet[3451]: E0819 00:15:12.767843 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.768483 kubelet[3451]: W0819 00:15:12.768321 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.768483 kubelet[3451]: E0819 00:15:12.768366 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.768803 kubelet[3451]: E0819 00:15:12.768708 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.768803 kubelet[3451]: W0819 00:15:12.768725 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.768803 kubelet[3451]: E0819 00:15:12.768736 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.769548 kubelet[3451]: E0819 00:15:12.769505 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.769548 kubelet[3451]: W0819 00:15:12.769525 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.769548 kubelet[3451]: E0819 00:15:12.769536 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.770046 kubelet[3451]: E0819 00:15:12.769941 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.770046 kubelet[3451]: W0819 00:15:12.769973 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.770046 kubelet[3451]: E0819 00:15:12.769986 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.772119 kubelet[3451]: E0819 00:15:12.771242 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.772119 kubelet[3451]: W0819 00:15:12.772115 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.772298 kubelet[3451]: E0819 00:15:12.772130 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:12.772362 kubelet[3451]: E0819 00:15:12.772349 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:12.772362 kubelet[3451]: W0819 00:15:12.772360 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:12.772414 kubelet[3451]: E0819 00:15:12.772370 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:12.794837 containerd[1864]: time="2025-08-19T00:15:12.794804118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6db5cf5bd6-r7fgg,Uid:f9771acd-192a-4514-bf88-e3278b09c57f,Namespace:calico-system,Attempt:0,} returns sandbox id \"7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0\"" Aug 19 00:15:12.796108 containerd[1864]: time="2025-08-19T00:15:12.796067252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 00:15:12.815686 containerd[1864]: time="2025-08-19T00:15:12.815648165Z" level=info msg="connecting to shim 871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7" address="unix:///run/containerd/s/95157f1e749fbcbcb6e85cc3da5bd496bbdc9960063eb8607612fbd66eedc350" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:15:12.835427 systemd[1]: Started cri-containerd-871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7.scope - libcontainer container 871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7. Aug 19 00:15:12.862839 containerd[1864]: time="2025-08-19T00:15:12.862741956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kq8gb,Uid:50143929-c239-498b-8a4e-d351083d2e0c,Namespace:calico-system,Attempt:0,} returns sandbox id \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\"" Aug 19 00:15:13.905248 kubelet[3451]: E0819 00:15:13.905171 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:13.960220 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1628333412.mount: Deactivated successfully. 
Aug 19 00:15:14.911084 containerd[1864]: time="2025-08-19T00:15:14.911038840Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:14.914086 containerd[1864]: time="2025-08-19T00:15:14.913981536Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207"
Aug 19 00:15:14.917422 containerd[1864]: time="2025-08-19T00:15:14.917397102Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:14.921676 containerd[1864]: time="2025-08-19T00:15:14.921636093Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:14.921913 containerd[1864]: time="2025-08-19T00:15:14.921890533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.125798392s"
Aug 19 00:15:14.921951 containerd[1864]: time="2025-08-19T00:15:14.921915205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\""
Aug 19 00:15:14.923155 containerd[1864]: time="2025-08-19T00:15:14.923134162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\""
Aug 19 00:15:14.940125 containerd[1864]: time="2025-08-19T00:15:14.940079500Z" level=info msg="CreateContainer within sandbox \"7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Aug 19 00:15:14.965724 containerd[1864]: time="2025-08-19T00:15:14.965672473Z" level=info msg="Container b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:15:14.987292 containerd[1864]: time="2025-08-19T00:15:14.987267118Z" level=info msg="CreateContainer within sandbox \"7132b5582ea8e02e6b96d72f3a0dc826953ba2ee5527fff4554f8052d76b57a0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b\""
Aug 19 00:15:14.987786 containerd[1864]: time="2025-08-19T00:15:14.987755565Z" level=info msg="StartContainer for \"b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b\""
Aug 19 00:15:14.989100 containerd[1864]: time="2025-08-19T00:15:14.989076108Z" level=info msg="connecting to shim b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b" address="unix:///run/containerd/s/629da28644244dcf40c33186f9c5b0ed6593886b45df5ea4369259ccdf50e61b" protocol=ttrpc version=3
Aug 19 00:15:15.009357 systemd[1]: Started cri-containerd-b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b.scope - libcontainer container b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b.
Aug 19 00:15:15.043552 containerd[1864]: time="2025-08-19T00:15:15.043484670Z" level=info msg="StartContainer for \"b8fe88a6a30d6c7341675f71123a02449e0cd1000b3c2e6917a5f2574a95647b\" returns successfully" Aug 19 00:15:15.907124 kubelet[3451]: E0819 00:15:15.907083 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:16.029408 kubelet[3451]: E0819 00:15:16.029335 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.029408 kubelet[3451]: W0819 00:15:16.029356 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.029408 kubelet[3451]: E0819 00:15:16.029373 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.029854 kubelet[3451]: E0819 00:15:16.029748 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.029854 kubelet[3451]: W0819 00:15:16.029762 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.029854 kubelet[3451]: E0819 00:15:16.029793 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.030109 kubelet[3451]: E0819 00:15:16.030098 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.030259 kubelet[3451]: W0819 00:15:16.030224 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.030397 kubelet[3451]: E0819 00:15:16.030305 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.030595 kubelet[3451]: E0819 00:15:16.030552 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.030595 kubelet[3451]: W0819 00:15:16.030572 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.030750 kubelet[3451]: E0819 00:15:16.030676 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.030947 kubelet[3451]: E0819 00:15:16.030936 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.031087 kubelet[3451]: W0819 00:15:16.030990 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.031087 kubelet[3451]: E0819 00:15:16.031002 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.031304 kubelet[3451]: E0819 00:15:16.031261 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.031304 kubelet[3451]: W0819 00:15:16.031275 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.031304 kubelet[3451]: E0819 00:15:16.031284 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.031567 kubelet[3451]: E0819 00:15:16.031557 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.031702 kubelet[3451]: W0819 00:15:16.031601 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.031702 kubelet[3451]: E0819 00:15:16.031613 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.031862 kubelet[3451]: E0819 00:15:16.031852 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.031948 kubelet[3451]: W0819 00:15:16.031922 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.032093 kubelet[3451]: E0819 00:15:16.031934 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.032289 kubelet[3451]: E0819 00:15:16.032262 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.032289 kubelet[3451]: W0819 00:15:16.032273 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.032459 kubelet[3451]: E0819 00:15:16.032372 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.032617 kubelet[3451]: E0819 00:15:16.032555 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.032617 kubelet[3451]: W0819 00:15:16.032565 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.032617 kubelet[3451]: E0819 00:15:16.032574 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.032809 kubelet[3451]: E0819 00:15:16.032799 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.032942 kubelet[3451]: W0819 00:15:16.032859 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.032942 kubelet[3451]: E0819 00:15:16.032871 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.033165 kubelet[3451]: E0819 00:15:16.033113 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.033165 kubelet[3451]: W0819 00:15:16.033124 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.033165 kubelet[3451]: E0819 00:15:16.033133 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.033429 kubelet[3451]: E0819 00:15:16.033386 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.033429 kubelet[3451]: W0819 00:15:16.033397 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.033429 kubelet[3451]: E0819 00:15:16.033406 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.033759 kubelet[3451]: E0819 00:15:16.033672 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.033759 kubelet[3451]: W0819 00:15:16.033682 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.033759 kubelet[3451]: E0819 00:15:16.033694 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.033906 kubelet[3451]: E0819 00:15:16.033896 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.033980 kubelet[3451]: W0819 00:15:16.033970 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.034109 kubelet[3451]: E0819 00:15:16.034014 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.072868 kubelet[3451]: E0819 00:15:16.072849 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.073042 kubelet[3451]: W0819 00:15:16.072929 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.073042 kubelet[3451]: E0819 00:15:16.072945 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.073761 kubelet[3451]: E0819 00:15:16.073631 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.073761 kubelet[3451]: W0819 00:15:16.073644 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.073761 kubelet[3451]: E0819 00:15:16.073654 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.074001 kubelet[3451]: E0819 00:15:16.073946 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.074217 kubelet[3451]: W0819 00:15:16.074157 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.074217 kubelet[3451]: E0819 00:15:16.074174 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.074518 kubelet[3451]: E0819 00:15:16.074489 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.074518 kubelet[3451]: W0819 00:15:16.074502 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.074518 kubelet[3451]: E0819 00:15:16.074513 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.074934 kubelet[3451]: E0819 00:15:16.074879 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.074934 kubelet[3451]: W0819 00:15:16.074892 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.074934 kubelet[3451]: E0819 00:15:16.074902 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.075171 kubelet[3451]: E0819 00:15:16.075151 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.075171 kubelet[3451]: W0819 00:15:16.075164 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.075261 kubelet[3451]: E0819 00:15:16.075174 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.075512 kubelet[3451]: E0819 00:15:16.075496 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.075512 kubelet[3451]: W0819 00:15:16.075510 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.075668 kubelet[3451]: E0819 00:15:16.075520 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.076092 kubelet[3451]: E0819 00:15:16.076065 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.076092 kubelet[3451]: W0819 00:15:16.076083 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.076092 kubelet[3451]: E0819 00:15:16.076095 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.076433 kubelet[3451]: E0819 00:15:16.076417 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.076433 kubelet[3451]: W0819 00:15:16.076430 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.076536 kubelet[3451]: E0819 00:15:16.076440 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.076763 kubelet[3451]: E0819 00:15:16.076745 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.076763 kubelet[3451]: W0819 00:15:16.076759 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.076853 kubelet[3451]: E0819 00:15:16.076770 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.077058 kubelet[3451]: E0819 00:15:16.077044 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.077058 kubelet[3451]: W0819 00:15:16.077056 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.077131 kubelet[3451]: E0819 00:15:16.077066 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.077258 kubelet[3451]: E0819 00:15:16.077243 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.077258 kubelet[3451]: W0819 00:15:16.077254 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.077309 kubelet[3451]: E0819 00:15:16.077262 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.077426 kubelet[3451]: E0819 00:15:16.077414 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.077426 kubelet[3451]: W0819 00:15:16.077424 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.077470 kubelet[3451]: E0819 00:15:16.077432 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.077756 kubelet[3451]: E0819 00:15:16.077700 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.077756 kubelet[3451]: W0819 00:15:16.077712 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.077756 kubelet[3451]: E0819 00:15:16.077721 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.078036 kubelet[3451]: E0819 00:15:16.077961 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.078036 kubelet[3451]: W0819 00:15:16.077972 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.078036 kubelet[3451]: E0819 00:15:16.077982 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:15:16.078268 kubelet[3451]: E0819 00:15:16.078251 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.078485 kubelet[3451]: W0819 00:15:16.078328 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.078485 kubelet[3451]: E0819 00:15:16.078342 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:15:16.078594 kubelet[3451]: E0819 00:15:16.078579 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:15:16.078617 kubelet[3451]: W0819 00:15:16.078598 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:15:16.078617 kubelet[3451]: E0819 00:15:16.078609 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Aug 19 00:15:16.079094 kubelet[3451]: E0819 00:15:16.079052 3451 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 19 00:15:16.079094 kubelet[3451]: W0819 00:15:16.079064 3451 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 19 00:15:16.079094 kubelet[3451]: E0819 00:15:16.079074 3451 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 19 00:15:16.101433 containerd[1864]: time="2025-08-19T00:15:16.101330332Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:16.104103 containerd[1864]: time="2025-08-19T00:15:16.104071390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981"
Aug 19 00:15:16.107852 containerd[1864]: time="2025-08-19T00:15:16.107813654Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:16.112300 containerd[1864]: time="2025-08-19T00:15:16.112264395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:16.112739 containerd[1864]: time="2025-08-19T00:15:16.112614421Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.189456115s"
Aug 19 00:15:16.112739 containerd[1864]: time="2025-08-19T00:15:16.112635126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\""
Aug 19 00:15:16.119934 containerd[1864]: time="2025-08-19T00:15:16.119904879Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Aug 19 00:15:16.143447 containerd[1864]: time="2025-08-19T00:15:16.143285890Z" level=info msg="Container 961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:15:16.161790 containerd[1864]: time="2025-08-19T00:15:16.161675999Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\""
Aug 19 00:15:16.164416 containerd[1864]: time="2025-08-19T00:15:16.163990716Z" level=info msg="StartContainer for \"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\""
Aug 19 00:15:16.165496 containerd[1864]: time="2025-08-19T00:15:16.165462216Z" level=info msg="connecting to shim 961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f" address="unix:///run/containerd/s/95157f1e749fbcbcb6e85cc3da5bd496bbdc9960063eb8607612fbd66eedc350" protocol=ttrpc version=3
Aug 19 00:15:16.184366 systemd[1]: Started cri-containerd-961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f.scope - libcontainer container 961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f.
Aug 19 00:15:16.214677 containerd[1864]: time="2025-08-19T00:15:16.214645782Z" level=info msg="StartContainer for \"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\" returns successfully"
Aug 19 00:15:16.219754 systemd[1]: cri-containerd-961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f.scope: Deactivated successfully.
Aug 19 00:15:16.223724 containerd[1864]: time="2025-08-19T00:15:16.223545807Z" level=info msg="received exit event container_id:\"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\" id:\"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\" pid:4114 exited_at:{seconds:1755562516 nanos:223076505}"
Aug 19 00:15:16.223724 containerd[1864]: time="2025-08-19T00:15:16.223655011Z" level=info msg="TaskExit event in podsandbox handler container_id:\"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\" id:\"961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f\" pid:4114 exited_at:{seconds:1755562516 nanos:223076505}"
Aug 19 00:15:16.239778 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-961c8d0b3991c131005e371ed3cbbf66d0ea2331eaa5d3170250176c15fb648f-rootfs.mount: Deactivated successfully.
Aug 19 00:15:16.987580 kubelet[3451]: I0819 00:15:16.987545 3451 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 19 00:15:17.003506 kubelet[3451]: I0819 00:15:17.003423 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6db5cf5bd6-r7fgg" podStartSLOduration=2.876292553 podStartE2EDuration="5.003306254s" podCreationTimestamp="2025-08-19 00:15:12 +0000 UTC" firstStartedPulling="2025-08-19 00:15:12.795659855 +0000 UTC m=+21.007372200" lastFinishedPulling="2025-08-19 00:15:14.922673556 +0000 UTC m=+23.134385901" observedRunningTime="2025-08-19 00:15:15.998005636 +0000 UTC m=+24.209717997" watchObservedRunningTime="2025-08-19 00:15:17.003306254 +0000 UTC m=+25.215018607"
Aug 19 00:15:17.911403 kubelet[3451]: E0819 00:15:17.911345 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:19.905613 kubelet[3451]: E0819 00:15:19.905476 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:21.905675 kubelet[3451]: E0819 00:15:21.905603 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:23.000047 containerd[1864]: time="2025-08-19T00:15:22.999997250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\""
Aug 19 00:15:23.905341 kubelet[3451]: E0819 00:15:23.904740 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:25.905954 kubelet[3451]: E0819 00:15:25.905005 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:27.036268 containerd[1864]: time="2025-08-19T00:15:27.035957755Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:27.039181 containerd[1864]: time="2025-08-19T00:15:27.039145771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320"
Aug 19 00:15:27.043006 containerd[1864]: time="2025-08-19T00:15:27.042966694Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:27.047066 containerd[1864]: time="2025-08-19T00:15:27.047006839Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:27.047773 containerd[1864]: time="2025-08-19T00:15:27.047744861Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 4.047701186s"
Aug 19 00:15:27.047773 containerd[1864]: time="2025-08-19T00:15:27.047770238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\""
Aug 19 00:15:27.055256 containerd[1864]: time="2025-08-19T00:15:27.055218637Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Aug 19 00:15:27.085257 containerd[1864]: time="2025-08-19T00:15:27.084551581Z" level=info msg="Container 8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:15:27.085549 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2080260346.mount: Deactivated successfully.
Aug 19 00:15:27.110453 containerd[1864]: time="2025-08-19T00:15:27.110421237Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\""
Aug 19 00:15:27.111085 containerd[1864]: time="2025-08-19T00:15:27.111069856Z" level=info msg="StartContainer for \"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\""
Aug 19 00:15:27.112696 containerd[1864]: time="2025-08-19T00:15:27.112674024Z" level=info msg="connecting to shim 8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c" address="unix:///run/containerd/s/95157f1e749fbcbcb6e85cc3da5bd496bbdc9960063eb8607612fbd66eedc350" protocol=ttrpc version=3
Aug 19 00:15:27.129344 systemd[1]: Started cri-containerd-8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c.scope - libcontainer container 8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c.
Aug 19 00:15:27.156459 containerd[1864]: time="2025-08-19T00:15:27.156421944Z" level=info msg="StartContainer for \"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\" returns successfully" Aug 19 00:15:27.905392 kubelet[3451]: E0819 00:15:27.905291 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:29.905689 kubelet[3451]: E0819 00:15:29.905633 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:31.905897 kubelet[3451]: E0819 00:15:31.905469 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:33.905905 kubelet[3451]: E0819 00:15:33.905586 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:35.905781 kubelet[3451]: E0819 00:15:35.905487 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
cni plugin not initialized" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:37.775099 containerd[1864]: time="2025-08-19T00:15:37.775051634Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 00:15:37.777364 systemd[1]: cri-containerd-8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c.scope: Deactivated successfully. Aug 19 00:15:37.777601 systemd[1]: cri-containerd-8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c.scope: Consumed 303ms CPU time, 184.6M memory peak, 165.8M written to disk. Aug 19 00:15:37.778714 containerd[1864]: time="2025-08-19T00:15:37.778689695Z" level=info msg="received exit event container_id:\"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\" id:\"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\" pid:4176 exited_at:{seconds:1755562537 nanos:778531978}" Aug 19 00:15:37.779094 containerd[1864]: time="2025-08-19T00:15:37.779069370Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\" id:\"8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c\" pid:4176 exited_at:{seconds:1755562537 nanos:778531978}" Aug 19 00:15:37.788964 kubelet[3451]: I0819 00:15:37.788863 3451 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 19 00:15:37.796269 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8756b7126c5010a5df22d5c80238ccdcf0c42e8d1daa531e2d33043bbc65e69c-rootfs.mount: Deactivated successfully. 
Aug 19 00:15:39.735277 systemd[1]: Created slice kubepods-burstable-podd2d325d7_1d07_4009_8a42_24c5de39eba0.slice - libcontainer container kubepods-burstable-podd2d325d7_1d07_4009_8a42_24c5de39eba0.slice. Aug 19 00:15:39.745968 systemd[1]: Created slice kubepods-besteffort-podfaf2c0f1_76a3_47d7_9d8d_b6aa58e5852b.slice - libcontainer container kubepods-besteffort-podfaf2c0f1_76a3_47d7_9d8d_b6aa58e5852b.slice. Aug 19 00:15:39.755365 systemd[1]: Created slice kubepods-besteffort-pod82bb62c5_1662_42fe_899a_c8b29c1bb13e.slice - libcontainer container kubepods-besteffort-pod82bb62c5_1662_42fe_899a_c8b29c1bb13e.slice. Aug 19 00:15:39.761992 systemd[1]: Created slice kubepods-burstable-pod891bd3f0_8d1a_448c_a33f_43deb1ae5105.slice - libcontainer container kubepods-burstable-pod891bd3f0_8d1a_448c_a33f_43deb1ae5105.slice. Aug 19 00:15:39.766521 containerd[1864]: time="2025-08-19T00:15:39.766309212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:39.767136 systemd[1]: Created slice kubepods-besteffort-podbb71d638_eb84_4117_8b41_c44efc0e6f03.slice - libcontainer container kubepods-besteffort-podbb71d638_eb84_4117_8b41_c44efc0e6f03.slice. Aug 19 00:15:39.776063 systemd[1]: Created slice kubepods-besteffort-pod3ee73aa8_152c_47e9_930f_01221caf59a8.slice - libcontainer container kubepods-besteffort-pod3ee73aa8_152c_47e9_930f_01221caf59a8.slice. Aug 19 00:15:39.790828 systemd[1]: Created slice kubepods-besteffort-pod06fcbf06_b9a6_4a75_8c26_7cb8c6c491d8.slice - libcontainer container kubepods-besteffort-pod06fcbf06_b9a6_4a75_8c26_7cb8c6c491d8.slice. Aug 19 00:15:39.808788 systemd[1]: Created slice kubepods-besteffort-pod48f4c9e3_c928_4ddd_bc8b_c6722f68bb65.slice - libcontainer container kubepods-besteffort-pod48f4c9e3_c928_4ddd_bc8b_c6722f68bb65.slice. 
Aug 19 00:15:39.814295 kubelet[3451]: I0819 00:15:39.813067 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-ca-bundle\") pod \"whisker-666dc99856-h22tx\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " pod="calico-system/whisker-666dc99856-h22tx" Aug 19 00:15:39.814295 kubelet[3451]: I0819 00:15:39.813306 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-backend-key-pair\") pod \"whisker-666dc99856-h22tx\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " pod="calico-system/whisker-666dc99856-h22tx" Aug 19 00:15:39.814649 systemd[1]: Created slice kubepods-besteffort-pode9c36368_8b14_48eb_8a1e_75af084a418b.slice - libcontainer container kubepods-besteffort-pode9c36368_8b14_48eb_8a1e_75af084a418b.slice. 
Aug 19 00:15:39.818335 kubelet[3451]: I0819 00:15:39.816947 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c36368-8b14-48eb-8a1e-75af084a418b-config\") pod \"goldmane-768f4c5c69-tl6wx\" (UID: \"e9c36368-8b14-48eb-8a1e-75af084a418b\") " pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:39.818335 kubelet[3451]: I0819 00:15:39.817003 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnd2\" (UniqueName: \"kubernetes.io/projected/e9c36368-8b14-48eb-8a1e-75af084a418b-kube-api-access-svnd2\") pod \"goldmane-768f4c5c69-tl6wx\" (UID: \"e9c36368-8b14-48eb-8a1e-75af084a418b\") " pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:39.818335 kubelet[3451]: I0819 00:15:39.817020 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2d325d7-1d07-4009-8a42-24c5de39eba0-config-volume\") pod \"coredns-674b8bbfcf-dvgdj\" (UID: \"d2d325d7-1d07-4009-8a42-24c5de39eba0\") " pod="kube-system/coredns-674b8bbfcf-dvgdj" Aug 19 00:15:39.818335 kubelet[3451]: I0819 00:15:39.817033 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs22q\" (UniqueName: \"kubernetes.io/projected/bb71d638-eb84-4117-8b41-c44efc0e6f03-kube-api-access-bs22q\") pod \"calico-kube-controllers-66676b5955-qbll4\" (UID: \"bb71d638-eb84-4117-8b41-c44efc0e6f03\") " pod="calico-system/calico-kube-controllers-66676b5955-qbll4" Aug 19 00:15:39.818335 kubelet[3451]: I0819 00:15:39.817043 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9c36368-8b14-48eb-8a1e-75af084a418b-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-tl6wx\" (UID: 
\"e9c36368-8b14-48eb-8a1e-75af084a418b\") " pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:39.818475 kubelet[3451]: I0819 00:15:39.817083 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thp69\" (UniqueName: \"kubernetes.io/projected/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-kube-api-access-thp69\") pod \"whisker-666dc99856-h22tx\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " pod="calico-system/whisker-666dc99856-h22tx" Aug 19 00:15:39.818475 kubelet[3451]: I0819 00:15:39.817099 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw58f\" (UniqueName: \"kubernetes.io/projected/d2d325d7-1d07-4009-8a42-24c5de39eba0-kube-api-access-tw58f\") pod \"coredns-674b8bbfcf-dvgdj\" (UID: \"d2d325d7-1d07-4009-8a42-24c5de39eba0\") " pod="kube-system/coredns-674b8bbfcf-dvgdj" Aug 19 00:15:39.818475 kubelet[3451]: I0819 00:15:39.817109 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891bd3f0-8d1a-448c-a33f-43deb1ae5105-config-volume\") pod \"coredns-674b8bbfcf-769sj\" (UID: \"891bd3f0-8d1a-448c-a33f-43deb1ae5105\") " pod="kube-system/coredns-674b8bbfcf-769sj" Aug 19 00:15:39.818475 kubelet[3451]: I0819 00:15:39.817119 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb52v\" (UniqueName: \"kubernetes.io/projected/891bd3f0-8d1a-448c-a33f-43deb1ae5105-kube-api-access-nb52v\") pod \"coredns-674b8bbfcf-769sj\" (UID: \"891bd3f0-8d1a-448c-a33f-43deb1ae5105\") " pod="kube-system/coredns-674b8bbfcf-769sj" Aug 19 00:15:39.818475 kubelet[3451]: I0819 00:15:39.817149 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb71d638-eb84-4117-8b41-c44efc0e6f03-tigera-ca-bundle\") pod \"calico-kube-controllers-66676b5955-qbll4\" (UID: \"bb71d638-eb84-4117-8b41-c44efc0e6f03\") " pod="calico-system/calico-kube-controllers-66676b5955-qbll4" Aug 19 00:15:39.818552 kubelet[3451]: I0819 00:15:39.817159 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e9c36368-8b14-48eb-8a1e-75af084a418b-goldmane-key-pair\") pod \"goldmane-768f4c5c69-tl6wx\" (UID: \"e9c36368-8b14-48eb-8a1e-75af084a418b\") " pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:39.852143 containerd[1864]: time="2025-08-19T00:15:39.851999986Z" level=error msg="Failed to destroy network for sandbox \"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:39.853598 systemd[1]: run-netns-cni\x2d3f0e0a36\x2d8407\x2dab6f\x2d806f\x2df82a0eb833c2.mount: Deactivated successfully. 
Aug 19 00:15:39.859858 containerd[1864]: time="2025-08-19T00:15:39.859797508Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:39.860576 kubelet[3451]: E0819 00:15:39.860369 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:39.860714 kubelet[3451]: E0819 00:15:39.860698 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c697k" Aug 19 00:15:39.860782 kubelet[3451]: E0819 00:15:39.860767 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c697k" 
Aug 19 00:15:39.860890 kubelet[3451]: E0819 00:15:39.860864 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-c697k_calico-system(82bb62c5-1662-42fe-899a-c8b29c1bb13e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-c697k_calico-system(82bb62c5-1662-42fe-899a-c8b29c1bb13e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d83781cdd0174cfb8750d603199dbce1fd5774725bfce2e89a363cdb61bad18f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e" Aug 19 00:15:39.918008 kubelet[3451]: I0819 00:15:39.917977 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-calico-apiserver-certs\") pod \"calico-apiserver-6c8858db87-l566d\" (UID: \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\") " pod="calico-apiserver/calico-apiserver-6c8858db87-l566d" Aug 19 00:15:39.918132 kubelet[3451]: I0819 00:15:39.918022 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3ee73aa8-152c-47e9-930f-01221caf59a8-calico-apiserver-certs\") pod \"calico-apiserver-5d78cb7dc4-xtwmv\" (UID: \"3ee73aa8-152c-47e9-930f-01221caf59a8\") " pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv" Aug 19 00:15:39.918132 kubelet[3451]: I0819 00:15:39.918036 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqj8\" (UniqueName: \"kubernetes.io/projected/3ee73aa8-152c-47e9-930f-01221caf59a8-kube-api-access-fgqj8\") pod \"calico-apiserver-5d78cb7dc4-xtwmv\" (UID: 
\"3ee73aa8-152c-47e9-930f-01221caf59a8\") " pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv" Aug 19 00:15:39.918132 kubelet[3451]: I0819 00:15:39.918053 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-calico-apiserver-certs\") pod \"calico-apiserver-6c8858db87-7lbsf\" (UID: \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\") " pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf" Aug 19 00:15:39.918132 kubelet[3451]: I0819 00:15:39.918126 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqx59\" (UniqueName: \"kubernetes.io/projected/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-kube-api-access-bqx59\") pod \"calico-apiserver-6c8858db87-l566d\" (UID: \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\") " pod="calico-apiserver/calico-apiserver-6c8858db87-l566d" Aug 19 00:15:39.918257 kubelet[3451]: I0819 00:15:39.918136 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzv99\" (UniqueName: \"kubernetes.io/projected/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-kube-api-access-qzv99\") pod \"calico-apiserver-6c8858db87-7lbsf\" (UID: \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\") " pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf" Aug 19 00:15:40.035325 containerd[1864]: time="2025-08-19T00:15:40.034867358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 00:15:40.042175 containerd[1864]: time="2025-08-19T00:15:40.041620120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,}" Aug 19 00:15:40.056628 containerd[1864]: time="2025-08-19T00:15:40.056451380Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-666dc99856-h22tx,Uid:faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:40.068450 containerd[1864]: time="2025-08-19T00:15:40.068405354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,}" Aug 19 00:15:40.075467 containerd[1864]: time="2025-08-19T00:15:40.075413604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66676b5955-qbll4,Uid:bb71d638-eb84-4117-8b41-c44efc0e6f03,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:40.085391 containerd[1864]: time="2025-08-19T00:15:40.085340325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:15:40.096423 containerd[1864]: time="2025-08-19T00:15:40.096319606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:15:40.122237 containerd[1864]: time="2025-08-19T00:15:40.121757264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:15:40.129292 containerd[1864]: time="2025-08-19T00:15:40.129185182Z" level=error msg="Failed to destroy network for sandbox \"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.136037 containerd[1864]: time="2025-08-19T00:15:40.136008666Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:40.145244 containerd[1864]: time="2025-08-19T00:15:40.145144252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.145784 kubelet[3451]: E0819 00:15:40.145682 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.146078 kubelet[3451]: E0819 00:15:40.145978 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dvgdj" Aug 19 00:15:40.146078 kubelet[3451]: E0819 00:15:40.146009 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dvgdj" Aug 19 00:15:40.147928 kubelet[3451]: E0819 00:15:40.147800 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dvgdj_kube-system(d2d325d7-1d07-4009-8a42-24c5de39eba0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dvgdj_kube-system(d2d325d7-1d07-4009-8a42-24c5de39eba0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d832a1ddf9fe1e8b4c892a1deb1536f4b3603b06f27ce8b2d9a20432becd8f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dvgdj" podUID="d2d325d7-1d07-4009-8a42-24c5de39eba0" Aug 19 00:15:40.195032 containerd[1864]: time="2025-08-19T00:15:40.194985880Z" level=error msg="Failed to destroy network for sandbox \"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.233887 containerd[1864]: time="2025-08-19T00:15:40.233833020Z" level=error msg="Failed to destroy network for sandbox \"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.241736 containerd[1864]: time="2025-08-19T00:15:40.241704703Z" level=error msg="Failed to destroy network for sandbox \"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.248387 containerd[1864]: time="2025-08-19T00:15:40.248357535Z" level=error msg="Failed to destroy network for sandbox \"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.250648 containerd[1864]: time="2025-08-19T00:15:40.250606802Z" level=error msg="Failed to destroy network for sandbox \"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.258405 containerd[1864]: time="2025-08-19T00:15:40.258371027Z" level=error msg="Failed to destroy network for sandbox \"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.267683 containerd[1864]: time="2025-08-19T00:15:40.267641224Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-666dc99856-h22tx,Uid:faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.268001 kubelet[3451]: E0819 00:15:40.267966 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:40.268850 kubelet[3451]: E0819 00:15:40.268823 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-666dc99856-h22tx" Aug 19 00:15:40.268850 kubelet[3451]: E0819 00:15:40.268851 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-666dc99856-h22tx" Aug 19 00:15:40.268992 kubelet[3451]: E0819 00:15:40.268896 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-666dc99856-h22tx_calico-system(faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-666dc99856-h22tx_calico-system(faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"86b476d4ce10b8f372e9c1f5721f51483ba3258765b79cdeb9dca5312db66964\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/whisker-666dc99856-h22tx" podUID="faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b"
Aug 19 00:15:40.270406 containerd[1864]: time="2025-08-19T00:15:40.270380418Z" level=error msg="Failed to destroy network for sandbox \"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.270814 containerd[1864]: time="2025-08-19T00:15:40.270779646Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.270963 kubelet[3451]: E0819 00:15:40.270927 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.270963 kubelet[3451]: E0819 00:15:40.270956 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-769sj"
Aug 19 00:15:40.271012 kubelet[3451]: E0819 00:15:40.270970 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-769sj"
Aug 19 00:15:40.271012 kubelet[3451]: E0819 00:15:40.270997 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-769sj_kube-system(891bd3f0-8d1a-448c-a33f-43deb1ae5105)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-769sj_kube-system(891bd3f0-8d1a-448c-a33f-43deb1ae5105)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae5de5b8b1b37b94318709fbdb5ae43e5b33fa287dff9194479118a525051f35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-769sj" podUID="891bd3f0-8d1a-448c-a33f-43deb1ae5105"
Aug 19 00:15:40.316237 containerd[1864]: time="2025-08-19T00:15:40.315976656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66676b5955-qbll4,Uid:bb71d638-eb84-4117-8b41-c44efc0e6f03,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.316339 kubelet[3451]: E0819 00:15:40.316164 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.316339 kubelet[3451]: E0819 00:15:40.316208 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66676b5955-qbll4"
Aug 19 00:15:40.316339 kubelet[3451]: E0819 00:15:40.316220 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66676b5955-qbll4"
Aug 19 00:15:40.316608 kubelet[3451]: E0819 00:15:40.316267 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66676b5955-qbll4_calico-system(bb71d638-eb84-4117-8b41-c44efc0e6f03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66676b5955-qbll4_calico-system(bb71d638-eb84-4117-8b41-c44efc0e6f03)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f537193a7a9f3edc7eb67865615bc5262e61694d7b64be3b114ee321aa42cead\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66676b5955-qbll4" podUID="bb71d638-eb84-4117-8b41-c44efc0e6f03"
Aug 19 00:15:40.364345 containerd[1864]: time="2025-08-19T00:15:40.364294215Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.364685 kubelet[3451]: E0819 00:15:40.364645 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.364742 kubelet[3451]: E0819 00:15:40.364697 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf"
Aug 19 00:15:40.364742 kubelet[3451]: E0819 00:15:40.364711 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf"
Aug 19 00:15:40.364843 kubelet[3451]: E0819 00:15:40.364763 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c8858db87-7lbsf_calico-apiserver(06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c8858db87-7lbsf_calico-apiserver(06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58afc1d9a76af0794c6b4605914e0ffdb4694c49fe97758c1ae5911810cdb57e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf" podUID="06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8"
Aug 19 00:15:40.371458 containerd[1864]: time="2025-08-19T00:15:40.371348906Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.371547 kubelet[3451]: E0819 00:15:40.371501 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.371547 kubelet[3451]: E0819 00:15:40.371533 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d"
Aug 19 00:15:40.371589 kubelet[3451]: E0819 00:15:40.371544 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d"
Aug 19 00:15:40.371589 kubelet[3451]: E0819 00:15:40.371575 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c8858db87-l566d_calico-apiserver(48f4c9e3-c928-4ddd-bc8b-c6722f68bb65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c8858db87-l566d_calico-apiserver(48f4c9e3-c928-4ddd-bc8b-c6722f68bb65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb721d84382602cb5332135cda786816e2d53d8127c05f6459fb20ede10febdf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d" podUID="48f4c9e3-c928-4ddd-bc8b-c6722f68bb65"
Aug 19 00:15:40.411337 containerd[1864]: time="2025-08-19T00:15:40.411213412Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.411455 kubelet[3451]: E0819 00:15:40.411434 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.411500 kubelet[3451]: E0819 00:15:40.411472 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv"
Aug 19 00:15:40.411500 kubelet[3451]: E0819 00:15:40.411484 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv"
Aug 19 00:15:40.411543 kubelet[3451]: E0819 00:15:40.411516 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d78cb7dc4-xtwmv_calico-apiserver(3ee73aa8-152c-47e9-930f-01221caf59a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d78cb7dc4-xtwmv_calico-apiserver(3ee73aa8-152c-47e9-930f-01221caf59a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b740e825497f8b52053681cec6feb7cd91a26d05e5e95f7f7902c18e06e22c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv" podUID="3ee73aa8-152c-47e9-930f-01221caf59a8"
Aug 19 00:15:40.457920 containerd[1864]: time="2025-08-19T00:15:40.457735102Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.458047 kubelet[3451]: E0819 00:15:40.457942 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:40.458047 kubelet[3451]: E0819 00:15:40.457987 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tl6wx"
Aug 19 00:15:40.458047 kubelet[3451]: E0819 00:15:40.458001 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tl6wx"
Aug 19 00:15:40.458113 kubelet[3451]: E0819 00:15:40.458033 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-tl6wx_calico-system(e9c36368-8b14-48eb-8a1e-75af084a418b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-tl6wx_calico-system(e9c36368-8b14-48eb-8a1e-75af084a418b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4404ec8fcd4012b81f61688fb5b1362a4c903207b99313f163cb17e4d8d3ac9a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-tl6wx" podUID="e9c36368-8b14-48eb-8a1e-75af084a418b"
Aug 19 00:15:49.314275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2449777072.mount: Deactivated successfully.
Aug 19 00:15:50.905802 containerd[1864]: time="2025-08-19T00:15:50.905701600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,}"
Aug 19 00:15:52.906573 containerd[1864]: time="2025-08-19T00:15:52.906014227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,}"
Aug 19 00:15:52.906573 containerd[1864]: time="2025-08-19T00:15:52.906170775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,}"
Aug 19 00:15:52.906573 containerd[1864]: time="2025-08-19T00:15:52.906283371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,}"
Aug 19 00:15:54.756546 containerd[1864]: time="2025-08-19T00:15:52.906731560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,}"
Aug 19 00:15:54.756546 containerd[1864]: time="2025-08-19T00:15:52.906847676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,}"
Aug 19 00:15:54.756546 containerd[1864]: time="2025-08-19T00:15:52.907040681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-666dc99856-h22tx,Uid:faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b,Namespace:calico-system,Attempt:0,}"
Aug 19 00:15:54.756546 containerd[1864]: time="2025-08-19T00:15:53.905751200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,}"
Aug 19 00:15:54.936771 containerd[1864]: time="2025-08-19T00:15:54.936640708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:54.943063 containerd[1864]: time="2025-08-19T00:15:54.943028723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909"
Aug 19 00:15:54.948523 containerd[1864]: time="2025-08-19T00:15:54.948496285Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:54.949501 containerd[1864]: time="2025-08-19T00:15:54.949469994Z" level=error msg="Failed to destroy network for sandbox \"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.957805 containerd[1864]: time="2025-08-19T00:15:54.957569740Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.961701 kubelet[3451]: E0819 00:15:54.961664 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.962804 containerd[1864]: time="2025-08-19T00:15:54.962555520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 19 00:15:54.964448 containerd[1864]: time="2025-08-19T00:15:54.964422912Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 14.929130237s"
Aug 19 00:15:54.964661 containerd[1864]: time="2025-08-19T00:15:54.964622549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\""
Aug 19 00:15:54.969609 kubelet[3451]: E0819 00:15:54.969459 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c697k"
Aug 19 00:15:54.974485 kubelet[3451]: E0819 00:15:54.973509 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c697k"
Aug 19 00:15:54.974485 kubelet[3451]: E0819 00:15:54.973569 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-c697k_calico-system(82bb62c5-1662-42fe-899a-c8b29c1bb13e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-c697k_calico-system(82bb62c5-1662-42fe-899a-c8b29c1bb13e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"702e4cb7bab23f90bb917b7d52dd6029ca6ce611708ace9b34d7dadc5a2cbf33\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-c697k" podUID="82bb62c5-1662-42fe-899a-c8b29c1bb13e"
Aug 19 00:15:54.980267 containerd[1864]: time="2025-08-19T00:15:54.980244111Z" level=error msg="Failed to destroy network for sandbox \"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.983715 containerd[1864]: time="2025-08-19T00:15:54.983695149Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Aug 19 00:15:54.984596 containerd[1864]: time="2025-08-19T00:15:54.984570575Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.984960 kubelet[3451]: E0819 00:15:54.984938 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:54.985055 kubelet[3451]: E0819 00:15:54.985041 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d"
Aug 19 00:15:54.985118 kubelet[3451]: E0819 00:15:54.985103 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d"
Aug 19 00:15:54.985201 kubelet[3451]: E0819 00:15:54.985185 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c8858db87-l566d_calico-apiserver(48f4c9e3-c928-4ddd-bc8b-c6722f68bb65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c8858db87-l566d_calico-apiserver(48f4c9e3-c928-4ddd-bc8b-c6722f68bb65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c42147912a5bab670d6c282ef20ba631708be46faee2297376d1b4d42c8effc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d" podUID="48f4c9e3-c928-4ddd-bc8b-c6722f68bb65"
Aug 19 00:15:55.015374 containerd[1864]: time="2025-08-19T00:15:55.015302162Z" level=error msg="Failed to destroy network for sandbox \"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.017172 containerd[1864]: time="2025-08-19T00:15:55.017143633Z" level=error msg="Failed to destroy network for sandbox \"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.019650 containerd[1864]: time="2025-08-19T00:15:55.019541601Z" level=info msg="Container fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438: CDI devices from CRI Config.CDIDevices: []"
Aug 19 00:15:55.020791 containerd[1864]: time="2025-08-19T00:15:55.020754469Z" level=error msg="Failed to destroy network for sandbox \"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.023489 containerd[1864]: time="2025-08-19T00:15:55.023461077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.024122 kubelet[3451]: E0819 00:15:55.024041 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.024122 kubelet[3451]: E0819 00:15:55.024086 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf"
Aug 19 00:15:55.024122 kubelet[3451]: E0819 00:15:55.024099 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf"
Aug 19 00:15:55.024527 kubelet[3451]: E0819 00:15:55.024252 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c8858db87-7lbsf_calico-apiserver(06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c8858db87-7lbsf_calico-apiserver(06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dbfd59a171d3b6b1d288d8764f9cea82ff27a4747c264ceea69fe85682187273\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf" podUID="06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8"
Aug 19 00:15:55.028833 containerd[1864]: time="2025-08-19T00:15:55.028784972Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.029158 kubelet[3451]: E0819 00:15:55.029109 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.029365 kubelet[3451]: E0819 00:15:55.029222 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv"
Aug 19 00:15:55.029365 kubelet[3451]: E0819 00:15:55.029259 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv"
Aug 19 00:15:55.029516 kubelet[3451]: E0819 00:15:55.029465 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d78cb7dc4-xtwmv_calico-apiserver(3ee73aa8-152c-47e9-930f-01221caf59a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d78cb7dc4-xtwmv_calico-apiserver(3ee73aa8-152c-47e9-930f-01221caf59a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dfaf5d9923171c81368ad92d8360c27290085646ef4b54056748be210ea7a3cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv" podUID="3ee73aa8-152c-47e9-930f-01221caf59a8"
Aug 19 00:15:55.032925 containerd[1864]: time="2025-08-19T00:15:55.032867285Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.033211 kubelet[3451]: E0819 00:15:55.033172 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.033211 kubelet[3451]: E0819 00:15:55.033203 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dvgdj"
Aug 19 00:15:55.033579 kubelet[3451]: E0819 00:15:55.033214 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dvgdj"
Aug 19 00:15:55.033579 kubelet[3451]: E0819 00:15:55.033256 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dvgdj_kube-system(d2d325d7-1d07-4009-8a42-24c5de39eba0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dvgdj_kube-system(d2d325d7-1d07-4009-8a42-24c5de39eba0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b189c6b24089785c5bdb9f93cac2b156a88d2dffa3fe29943258e08866a1c245\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dvgdj" podUID="d2d325d7-1d07-4009-8a42-24c5de39eba0"
Aug 19 00:15:55.035740 containerd[1864]: time="2025-08-19T00:15:55.035703666Z" level=error msg="Failed to destroy network for sandbox \"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.040648 containerd[1864]: time="2025-08-19T00:15:55.040529345Z" level=error msg="Failed to destroy network for sandbox \"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.049539 containerd[1864]: time="2025-08-19T00:15:55.049501581Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-666dc99856-h22tx,Uid:faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.050098 kubelet[3451]: E0819 00:15:55.049714 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 19 00:15:55.050098 kubelet[3451]: E0819 00:15:55.049766 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-666dc99856-h22tx"
Aug 19 00:15:55.050098 kubelet[3451]: E0819 00:15:55.049780 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-666dc99856-h22tx"
Aug 19 00:15:55.050170 kubelet[3451]: E0819 00:15:55.049810 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-666dc99856-h22tx_calico-system(faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-666dc99856-h22tx_calico-system(faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee4dc03a1a14a24810a086a41ad7812b1af49b7a8371c40bad84d3994189682c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted
/var/lib/calico/\"" pod="calico-system/whisker-666dc99856-h22tx" podUID="faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b" Aug 19 00:15:55.051720 containerd[1864]: time="2025-08-19T00:15:55.051693582Z" level=error msg="Failed to destroy network for sandbox \"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:55.055039 containerd[1864]: time="2025-08-19T00:15:55.054998840Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:55.055907 kubelet[3451]: E0819 00:15:55.055702 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:55.055907 kubelet[3451]: E0819 00:15:55.055736 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:55.055907 kubelet[3451]: E0819 00:15:55.055749 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tl6wx" Aug 19 00:15:55.056013 kubelet[3451]: E0819 00:15:55.055771 3451 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-tl6wx_calico-system(e9c36368-8b14-48eb-8a1e-75af084a418b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-tl6wx_calico-system(e9c36368-8b14-48eb-8a1e-75af084a418b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d2814bbb04611a76868a66c8e157b7da9273591f851d0bd0b0f8fc1fa7395f6f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-tl6wx" podUID="e9c36368-8b14-48eb-8a1e-75af084a418b" Aug 19 00:15:55.057938 containerd[1864]: time="2025-08-19T00:15:55.057878798Z" level=info msg="CreateContainer within sandbox \"871734d3bc2de74bb0f46a4a63590437d2941f4629ffdc8372f2add6aa8d69a7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\"" Aug 19 00:15:55.058253 containerd[1864]: time="2025-08-19T00:15:55.058217272Z" level=info msg="StartContainer for \"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\"" Aug 19 00:15:55.058564 containerd[1864]: time="2025-08-19T00:15:55.058357892Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:55.058956 kubelet[3451]: E0819 00:15:55.058873 3451 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:15:55.058956 kubelet[3451]: E0819 00:15:55.058908 3451 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-769sj" Aug 19 00:15:55.058956 kubelet[3451]: E0819 00:15:55.058924 3451 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-769sj" Aug 19 00:15:55.059057 kubelet[3451]: E0819 00:15:55.058951 3451 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-769sj_kube-system(891bd3f0-8d1a-448c-a33f-43deb1ae5105)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-769sj_kube-system(891bd3f0-8d1a-448c-a33f-43deb1ae5105)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d9df2ee7235979cade2a6f801879273a592251a6b9621c6e09d3a98e4823599e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-769sj" podUID="891bd3f0-8d1a-448c-a33f-43deb1ae5105" Aug 19 00:15:55.059636 containerd[1864]: time="2025-08-19T00:15:55.059582025Z" level=info msg="connecting to shim fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438" address="unix:///run/containerd/s/95157f1e749fbcbcb6e85cc3da5bd496bbdc9960063eb8607612fbd66eedc350" protocol=ttrpc version=3 Aug 19 00:15:55.077448 systemd[1]: Started cri-containerd-fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438.scope - libcontainer container fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438. Aug 19 00:15:55.109591 containerd[1864]: time="2025-08-19T00:15:55.109562561Z" level=info msg="StartContainer for \"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" returns successfully" Aug 19 00:15:55.562739 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 00:15:55.562853 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Aug 19 00:15:55.811159 kubelet[3451]: I0819 00:15:55.811125 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-ca-bundle\") pod \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " Aug 19 00:15:55.812251 kubelet[3451]: I0819 00:15:55.811337 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-backend-key-pair\") pod \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " Aug 19 00:15:55.812251 kubelet[3451]: I0819 00:15:55.811372 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thp69\" (UniqueName: \"kubernetes.io/projected/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-kube-api-access-thp69\") pod \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\" (UID: \"faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b\") " Aug 19 00:15:55.812408 kubelet[3451]: I0819 00:15:55.812381 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b" (UID: "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 19 00:15:55.814577 kubelet[3451]: I0819 00:15:55.814495 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b" (UID: "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 00:15:55.815440 kubelet[3451]: I0819 00:15:55.815407 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-kube-api-access-thp69" (OuterVolumeSpecName: "kube-api-access-thp69") pod "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b" (UID: "faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b"). InnerVolumeSpecName "kube-api-access-thp69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 00:15:55.824417 systemd[1]: run-netns-cni\x2dba282cb0\x2d2a83\x2dfe16\x2d6996\x2d578c4ecd7d89.mount: Deactivated successfully. Aug 19 00:15:55.824499 systemd[1]: run-netns-cni\x2db4c0e591\x2d4f98\x2d81a2\x2d752e\x2d9a1e5b2676d6.mount: Deactivated successfully. Aug 19 00:15:55.824535 systemd[1]: run-netns-cni\x2d2c2063af\x2ddc3a\x2da699\x2dc8ee\x2dd7ed9a178ac5.mount: Deactivated successfully. Aug 19 00:15:55.824567 systemd[1]: run-netns-cni\x2d20892dc2\x2d1470\x2d025c\x2d371d\x2d5296486e4e70.mount: Deactivated successfully. Aug 19 00:15:55.824595 systemd[1]: run-netns-cni\x2dd0e8cc98\x2db231\x2d9248\x2dea07\x2dec907d52583f.mount: Deactivated successfully. Aug 19 00:15:55.824625 systemd[1]: var-lib-kubelet-pods-faf2c0f1\x2d76a3\x2d47d7\x2d9d8d\x2db6aa58e5852b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dthp69.mount: Deactivated successfully. Aug 19 00:15:55.824663 systemd[1]: var-lib-kubelet-pods-faf2c0f1\x2d76a3\x2d47d7\x2d9d8d\x2db6aa58e5852b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Aug 19 00:15:55.906145 containerd[1864]: time="2025-08-19T00:15:55.906052498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66676b5955-qbll4,Uid:bb71d638-eb84-4117-8b41-c44efc0e6f03,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:55.911503 systemd[1]: Removed slice kubepods-besteffort-podfaf2c0f1_76a3_47d7_9d8d_b6aa58e5852b.slice - libcontainer container kubepods-besteffort-podfaf2c0f1_76a3_47d7_9d8d_b6aa58e5852b.slice. Aug 19 00:15:55.912664 kubelet[3451]: I0819 00:15:55.912462 3451 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thp69\" (UniqueName: \"kubernetes.io/projected/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-kube-api-access-thp69\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:15:55.912664 kubelet[3451]: I0819 00:15:55.912474 3451 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-ca-bundle\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:15:55.912664 kubelet[3451]: I0819 00:15:55.912481 3451 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b-whisker-backend-key-pair\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:15:56.022919 systemd-networkd[1695]: calic373a9b9a7b: Link UP Aug 19 00:15:56.023094 systemd-networkd[1695]: calic373a9b9a7b: Gained carrier Aug 19 00:15:56.035804 containerd[1864]: 2025-08-19 00:15:55.934 [INFO][4745] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 00:15:56.035804 containerd[1864]: 2025-08-19 00:15:55.956 [INFO][4745] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0 calico-kube-controllers-66676b5955- calico-system 
bb71d638-eb84-4117-8b41-c44efc0e6f03 862 0 2025-08-19 00:15:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:66676b5955 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 calico-kube-controllers-66676b5955-qbll4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic373a9b9a7b [] [] }} ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-" Aug 19 00:15:56.035804 containerd[1864]: 2025-08-19 00:15:55.956 [INFO][4745] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.035804 containerd[1864]: 2025-08-19 00:15:55.973 [INFO][4756] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" HandleID="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.973 [INFO][4756] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" HandleID="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3950), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"calico-kube-controllers-66676b5955-qbll4", "timestamp":"2025-08-19 00:15:55.973581117 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.973 [INFO][4756] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.973 [INFO][4756] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.973 [INFO][4756] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.979 [INFO][4756] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.982 [INFO][4756] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.985 [INFO][4756] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.986 [INFO][4756] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.035949 containerd[1864]: 2025-08-19 00:15:55.988 [INFO][4756] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:55.988 [INFO][4756] 
ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:55.989 [INFO][4756] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2 Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:55.992 [INFO][4756] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:56.000 [INFO][4756] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.65/26] block=192.168.21.64/26 handle="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:56.000 [INFO][4756] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.65/26] handle="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:56.000 [INFO][4756] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:15:56.036357 containerd[1864]: 2025-08-19 00:15:56.000 [INFO][4756] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.65/26] IPv6=[] ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" HandleID="k8s-pod-network.d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.036452 containerd[1864]: 2025-08-19 00:15:56.002 [INFO][4745] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0", GenerateName:"calico-kube-controllers-66676b5955-", Namespace:"calico-system", SelfLink:"", UID:"bb71d638-eb84-4117-8b41-c44efc0e6f03", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66676b5955", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"calico-kube-controllers-66676b5955-qbll4", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic373a9b9a7b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:15:56.036493 containerd[1864]: 2025-08-19 00:15:56.002 [INFO][4745] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.65/32] ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.036493 containerd[1864]: 2025-08-19 00:15:56.002 [INFO][4745] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic373a9b9a7b ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.036493 containerd[1864]: 2025-08-19 00:15:56.023 [INFO][4745] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.036535 containerd[1864]: 2025-08-19 00:15:56.023 [INFO][4745] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" Pod="calico-kube-controllers-66676b5955-qbll4" 
WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0", GenerateName:"calico-kube-controllers-66676b5955-", Namespace:"calico-system", SelfLink:"", UID:"bb71d638-eb84-4117-8b41-c44efc0e6f03", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66676b5955", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2", Pod:"calico-kube-controllers-66676b5955-qbll4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic373a9b9a7b", MAC:"3e:3e:9c:ac:a5:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:15:56.036568 containerd[1864]: 2025-08-19 00:15:56.033 [INFO][4745] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" Namespace="calico-system" 
Pod="calico-kube-controllers-66676b5955-qbll4" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--kube--controllers--66676b5955--qbll4-eth0" Aug 19 00:15:56.082301 containerd[1864]: time="2025-08-19T00:15:56.082099876Z" level=info msg="connecting to shim d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2" address="unix:///run/containerd/s/23753cfec5f93c16968895f4c028e24511e0d56293b064a04693b13c92cfbe2b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:15:56.109284 kubelet[3451]: I0819 00:15:56.107990 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kq8gb" podStartSLOduration=2.004566991 podStartE2EDuration="44.107976302s" podCreationTimestamp="2025-08-19 00:15:12 +0000 UTC" firstStartedPulling="2025-08-19 00:15:12.865157172 +0000 UTC m=+21.076869517" lastFinishedPulling="2025-08-19 00:15:54.968566483 +0000 UTC m=+63.180278828" observedRunningTime="2025-08-19 00:15:56.089402565 +0000 UTC m=+64.301114926" watchObservedRunningTime="2025-08-19 00:15:56.107976302 +0000 UTC m=+64.319688647" Aug 19 00:15:56.114453 systemd[1]: Started cri-containerd-d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2.scope - libcontainer container d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2. 
Aug 19 00:15:56.180586 containerd[1864]: time="2025-08-19T00:15:56.180044296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66676b5955-qbll4,Uid:bb71d638-eb84-4117-8b41-c44efc0e6f03,Namespace:calico-system,Attempt:0,} returns sandbox id \"d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2\"" Aug 19 00:15:56.186030 containerd[1864]: time="2025-08-19T00:15:56.185616006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 00:15:56.187456 systemd[1]: Created slice kubepods-besteffort-pod5af73bac_0aa5_4a23_ba64_29016995690e.slice - libcontainer container kubepods-besteffort-pod5af73bac_0aa5_4a23_ba64_29016995690e.slice. Aug 19 00:15:56.202663 containerd[1864]: time="2025-08-19T00:15:56.202629656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"18b602e1485030a59bd2299dbc887894222b48e9bc73306138d351b2fd47f27e\" pid:4804 exit_status:1 exited_at:{seconds:1755562556 nanos:201759574}" Aug 19 00:15:56.315195 kubelet[3451]: I0819 00:15:56.315162 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5af73bac-0aa5-4a23-ba64-29016995690e-whisker-backend-key-pair\") pod \"whisker-79955b9856-vwdq2\" (UID: \"5af73bac-0aa5-4a23-ba64-29016995690e\") " pod="calico-system/whisker-79955b9856-vwdq2" Aug 19 00:15:56.315291 kubelet[3451]: I0819 00:15:56.315225 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af73bac-0aa5-4a23-ba64-29016995690e-whisker-ca-bundle\") pod \"whisker-79955b9856-vwdq2\" (UID: \"5af73bac-0aa5-4a23-ba64-29016995690e\") " pod="calico-system/whisker-79955b9856-vwdq2" Aug 19 00:15:56.315291 kubelet[3451]: I0819 00:15:56.315287 3451 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f76kg\" (UniqueName: \"kubernetes.io/projected/5af73bac-0aa5-4a23-ba64-29016995690e-kube-api-access-f76kg\") pod \"whisker-79955b9856-vwdq2\" (UID: \"5af73bac-0aa5-4a23-ba64-29016995690e\") " pod="calico-system/whisker-79955b9856-vwdq2" Aug 19 00:15:56.491347 containerd[1864]: time="2025-08-19T00:15:56.491225676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79955b9856-vwdq2,Uid:5af73bac-0aa5-4a23-ba64-29016995690e,Namespace:calico-system,Attempt:0,}" Aug 19 00:15:56.573442 systemd-networkd[1695]: calia7040eded0d: Link UP Aug 19 00:15:56.574768 systemd-networkd[1695]: calia7040eded0d: Gained carrier Aug 19 00:15:56.588257 containerd[1864]: 2025-08-19 00:15:56.518 [INFO][4846] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 00:15:56.588257 containerd[1864]: 2025-08-19 00:15:56.525 [INFO][4846] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0 whisker-79955b9856- calico-system 5af73bac-0aa5-4a23-ba64-29016995690e 967 0 2025-08-19 00:15:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79955b9856 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 whisker-79955b9856-vwdq2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia7040eded0d [] [] }} ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-" Aug 19 00:15:56.588257 containerd[1864]: 2025-08-19 00:15:56.525 [INFO][4846] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588257 containerd[1864]: 2025-08-19 00:15:56.545 [INFO][4857] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" HandleID="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Workload="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.545 [INFO][4857] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" HandleID="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Workload="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"whisker-79955b9856-vwdq2", "timestamp":"2025-08-19 00:15:56.545029186 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.545 [INFO][4857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.545 [INFO][4857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.545 [INFO][4857] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.549 [INFO][4857] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.552 [INFO][4857] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.555 [INFO][4857] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.556 [INFO][4857] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588426 containerd[1864]: 2025-08-19 00:15:56.558 [INFO][4857] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.558 [INFO][4857] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.559 [INFO][4857] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724 Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.562 [INFO][4857] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.569 [INFO][4857] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.21.66/26] block=192.168.21.64/26 handle="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.569 [INFO][4857] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.66/26] handle="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.569 [INFO][4857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:15:56.588554 containerd[1864]: 2025-08-19 00:15:56.569 [INFO][4857] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.66/26] IPv6=[] ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" HandleID="k8s-pod-network.c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Workload="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588659 containerd[1864]: 2025-08-19 00:15:56.571 [INFO][4846] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0", GenerateName:"whisker-79955b9856-", Namespace:"calico-system", SelfLink:"", UID:"5af73bac-0aa5-4a23-ba64-29016995690e", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79955b9856", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"whisker-79955b9856-vwdq2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia7040eded0d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:15:56.588659 containerd[1864]: 2025-08-19 00:15:56.571 [INFO][4846] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.66/32] ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588705 containerd[1864]: 2025-08-19 00:15:56.571 [INFO][4846] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7040eded0d ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588705 containerd[1864]: 2025-08-19 00:15:56.574 [INFO][4846] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.588734 containerd[1864]: 2025-08-19 00:15:56.575 [INFO][4846] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0", GenerateName:"whisker-79955b9856-", Namespace:"calico-system", SelfLink:"", UID:"5af73bac-0aa5-4a23-ba64-29016995690e", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79955b9856", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724", Pod:"whisker-79955b9856-vwdq2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia7040eded0d", MAC:"ae:00:75:5f:48:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:15:56.588769 containerd[1864]: 2025-08-19 00:15:56.584 [INFO][4846] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" 
Namespace="calico-system" Pod="whisker-79955b9856-vwdq2" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-whisker--79955b9856--vwdq2-eth0" Aug 19 00:15:56.639424 containerd[1864]: time="2025-08-19T00:15:56.639377005Z" level=info msg="connecting to shim c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724" address="unix:///run/containerd/s/23a17e56ec8841122a14295f614293f8232be4a82b73e2852af4232b2334ec4c" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:15:56.658356 systemd[1]: Started cri-containerd-c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724.scope - libcontainer container c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724. Aug 19 00:15:56.686093 containerd[1864]: time="2025-08-19T00:15:56.686009130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79955b9856-vwdq2,Uid:5af73bac-0aa5-4a23-ba64-29016995690e,Namespace:calico-system,Attempt:0,} returns sandbox id \"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724\"" Aug 19 00:15:57.139393 containerd[1864]: time="2025-08-19T00:15:57.139356449Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"e53743453b9ba2208a18f3a83357615795264f9990b129bc4d59c92ed3b88d57\" pid:5041 exit_status:1 exited_at:{seconds:1755562557 nanos:139106122}" Aug 19 00:15:58.085379 systemd-networkd[1695]: calic373a9b9a7b: Gained IPv6LL Aug 19 00:15:58.208505 kubelet[3451]: I0819 00:15:58.208461 3451 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b" path="/var/lib/kubelet/pods/faf2c0f1-76a3-47d7-9d8d-b6aa58e5852b/volumes" Aug 19 00:15:58.277501 systemd-networkd[1695]: calia7040eded0d: Gained IPv6LL Aug 19 00:15:58.645744 systemd-networkd[1695]: vxlan.calico: Link UP Aug 19 00:15:58.645749 systemd-networkd[1695]: vxlan.calico: Gained carrier Aug 19 00:16:00.133399 systemd-networkd[1695]: vxlan.calico: Gained IPv6LL Aug 19 
00:16:03.668367 containerd[1864]: time="2025-08-19T00:16:03.668281564Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:03.674095 containerd[1864]: time="2025-08-19T00:16:03.674057610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Aug 19 00:16:03.677667 containerd[1864]: time="2025-08-19T00:16:03.677607757Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:03.681912 containerd[1864]: time="2025-08-19T00:16:03.681871813Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:03.682437 containerd[1864]: time="2025-08-19T00:16:03.682134037Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 7.495957415s" Aug 19 00:16:03.682437 containerd[1864]: time="2025-08-19T00:16:03.682160886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Aug 19 00:16:03.683245 containerd[1864]: time="2025-08-19T00:16:03.683172365Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 00:16:03.698586 containerd[1864]: time="2025-08-19T00:16:03.698569477Z" level=info msg="CreateContainer within sandbox 
\"d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 00:16:03.820364 containerd[1864]: time="2025-08-19T00:16:03.820331082Z" level=info msg="Container a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:03.820714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2037559851.mount: Deactivated successfully. Aug 19 00:16:03.959625 containerd[1864]: time="2025-08-19T00:16:03.959531989Z" level=info msg="CreateContainer within sandbox \"d0b3a6821d660a34a9f58d67866378bb745573b681b34d63754d46ebe68ed8a2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\"" Aug 19 00:16:03.960253 containerd[1864]: time="2025-08-19T00:16:03.960027668Z" level=info msg="StartContainer for \"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\"" Aug 19 00:16:03.962019 containerd[1864]: time="2025-08-19T00:16:03.961987335Z" level=info msg="connecting to shim a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a" address="unix:///run/containerd/s/23753cfec5f93c16968895f4c028e24511e0d56293b064a04693b13c92cfbe2b" protocol=ttrpc version=3 Aug 19 00:16:03.980519 systemd[1]: Started cri-containerd-a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a.scope - libcontainer container a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a. 
Aug 19 00:16:04.072812 containerd[1864]: time="2025-08-19T00:16:04.072733001Z" level=info msg="StartContainer for \"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" returns successfully" Aug 19 00:16:04.114220 containerd[1864]: time="2025-08-19T00:16:04.114187730Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"dff5abaf01eac466b56b8f80105cbf239b3077d80625dad113e710e330a173ad\" pid:5198 exit_status:1 exited_at:{seconds:1755562564 nanos:113843224}" Aug 19 00:16:05.110499 containerd[1864]: time="2025-08-19T00:16:05.110441557Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"38b539749f053326c1f4574bec8692932a22f44adc9860a67a8f7cab6cb93edf\" pid:5228 exited_at:{seconds:1755562565 nanos:110052281}" Aug 19 00:16:05.124721 kubelet[3451]: I0819 00:16:05.124304 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-66676b5955-qbll4" podStartSLOduration=45.626357203 podStartE2EDuration="53.124289205s" podCreationTimestamp="2025-08-19 00:15:12 +0000 UTC" firstStartedPulling="2025-08-19 00:15:56.184830206 +0000 UTC m=+64.396542551" lastFinishedPulling="2025-08-19 00:16:03.682762208 +0000 UTC m=+71.894474553" observedRunningTime="2025-08-19 00:16:04.102586493 +0000 UTC m=+72.314298838" watchObservedRunningTime="2025-08-19 00:16:05.124289205 +0000 UTC m=+73.336001550" Aug 19 00:16:05.568326 containerd[1864]: time="2025-08-19T00:16:05.568274456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:05.572600 containerd[1864]: time="2025-08-19T00:16:05.572565241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Aug 19 00:16:05.617588 containerd[1864]: 
time="2025-08-19T00:16:05.617535495Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:05.665478 containerd[1864]: time="2025-08-19T00:16:05.665411812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:05.666073 containerd[1864]: time="2025-08-19T00:16:05.665796680Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.982599971s" Aug 19 00:16:05.666073 containerd[1864]: time="2025-08-19T00:16:05.665824713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Aug 19 00:16:05.674796 containerd[1864]: time="2025-08-19T00:16:05.674776686Z" level=info msg="CreateContainer within sandbox \"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 00:16:05.812913 containerd[1864]: time="2025-08-19T00:16:05.811550433Z" level=info msg="Container 9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:05.976037 containerd[1864]: time="2025-08-19T00:16:05.975953041Z" level=info msg="CreateContainer within sandbox \"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208\"" 
Aug 19 00:16:05.977192 containerd[1864]: time="2025-08-19T00:16:05.977168518Z" level=info msg="StartContainer for \"9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208\"" Aug 19 00:16:05.978028 containerd[1864]: time="2025-08-19T00:16:05.978004847Z" level=info msg="connecting to shim 9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208" address="unix:///run/containerd/s/23a17e56ec8841122a14295f614293f8232be4a82b73e2852af4232b2334ec4c" protocol=ttrpc version=3 Aug 19 00:16:05.998344 systemd[1]: Started cri-containerd-9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208.scope - libcontainer container 9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208. Aug 19 00:16:06.031827 containerd[1864]: time="2025-08-19T00:16:06.031759725Z" level=info msg="StartContainer for \"9d5837b4545f393dd35ab8d487e186652efcbf94c56388a6417e02cff726e208\" returns successfully" Aug 19 00:16:06.033291 containerd[1864]: time="2025-08-19T00:16:06.032948352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 00:16:06.905935 containerd[1864]: time="2025-08-19T00:16:06.905894596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,}" Aug 19 00:16:06.995792 systemd-networkd[1695]: cali28b84b93586: Link UP Aug 19 00:16:06.996864 systemd-networkd[1695]: cali28b84b93586: Gained carrier Aug 19 00:16:07.014334 containerd[1864]: 2025-08-19 00:16:06.942 [INFO][5275] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0 goldmane-768f4c5c69- calico-system e9c36368-8b14-48eb-8a1e-75af084a418b 869 0 2025-08-19 00:15:12 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 goldmane-768f4c5c69-tl6wx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali28b84b93586 [] [] }} ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-" Aug 19 00:16:07.014334 containerd[1864]: 2025-08-19 00:16:06.942 [INFO][5275] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014334 containerd[1864]: 2025-08-19 00:16:06.958 [INFO][5286] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" HandleID="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Workload="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.959 [INFO][5286] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" HandleID="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Workload="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"goldmane-768f4c5c69-tl6wx", "timestamp":"2025-08-19 00:16:06.958893779 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.959 [INFO][5286] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.959 [INFO][5286] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.959 [INFO][5286] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.963 [INFO][5286] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.966 [INFO][5286] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.970 [INFO][5286] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.971 [INFO][5286] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014473 containerd[1864]: 2025-08-19 00:16:06.973 [INFO][5286] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.973 [INFO][5286] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.975 [INFO][5286] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c Aug 19 00:16:07.014614 
containerd[1864]: 2025-08-19 00:16:06.979 [INFO][5286] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.991 [INFO][5286] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.67/26] block=192.168.21.64/26 handle="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.991 [INFO][5286] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.67/26] handle="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.991 [INFO][5286] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:07.014614 containerd[1864]: 2025-08-19 00:16:06.991 [INFO][5286] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.67/26] IPv6=[] ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" HandleID="k8s-pod-network.88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Workload="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014707 containerd[1864]: 2025-08-19 00:16:06.993 [INFO][5275] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", 
UID:"e9c36368-8b14-48eb-8a1e-75af084a418b", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"goldmane-768f4c5c69-tl6wx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali28b84b93586", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:07.014707 containerd[1864]: 2025-08-19 00:16:06.993 [INFO][5275] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.67/32] ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014754 containerd[1864]: 2025-08-19 00:16:06.993 [INFO][5275] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28b84b93586 ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014754 containerd[1864]: 2025-08-19 00:16:06.997 [INFO][5275] cni-plugin/dataplane_linux.go 508: 
Disabling IPv4 forwarding ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.014783 containerd[1864]: 2025-08-19 00:16:06.997 [INFO][5275] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"e9c36368-8b14-48eb-8a1e-75af084a418b", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c", Pod:"goldmane-768f4c5c69-tl6wx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali28b84b93586", 
MAC:"12:11:f8:69:33:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:07.014815 containerd[1864]: 2025-08-19 00:16:07.012 [INFO][5275] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" Namespace="calico-system" Pod="goldmane-768f4c5c69-tl6wx" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-goldmane--768f4c5c69--tl6wx-eth0" Aug 19 00:16:07.906745 containerd[1864]: time="2025-08-19T00:16:07.906696918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:16:07.907151 containerd[1864]: time="2025-08-19T00:16:07.907125451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,}" Aug 19 00:16:08.133376 systemd-networkd[1695]: cali28b84b93586: Gained IPv6LL Aug 19 00:16:08.905620 containerd[1864]: time="2025-08-19T00:16:08.905580951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,}" Aug 19 00:16:09.620813 containerd[1864]: time="2025-08-19T00:16:09.620764713Z" level=info msg="connecting to shim 88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c" address="unix:///run/containerd/s/e93bf397fc3f92170c5222ad4ab56fea84a649fe4d923abaec1cb8db3245da5f" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:09.642349 systemd[1]: Started cri-containerd-88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c.scope - libcontainer container 88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c. 
Aug 19 00:16:09.858889 containerd[1864]: time="2025-08-19T00:16:09.858842950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tl6wx,Uid:e9c36368-8b14-48eb-8a1e-75af084a418b,Namespace:calico-system,Attempt:0,} returns sandbox id \"88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c\"" Aug 19 00:16:09.908837 containerd[1864]: time="2025-08-19T00:16:09.908660086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:16:09.909520 containerd[1864]: time="2025-08-19T00:16:09.909488855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:16:09.909711 containerd[1864]: time="2025-08-19T00:16:09.909664852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,}" Aug 19 00:16:09.993472 systemd-networkd[1695]: cali0996855bf88: Link UP Aug 19 00:16:09.994039 systemd-networkd[1695]: cali0996855bf88: Gained carrier Aug 19 00:16:10.011472 containerd[1864]: 2025-08-19 00:16:09.933 [INFO][5355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0 calico-apiserver-5d78cb7dc4- calico-apiserver 3ee73aa8-152c-47e9-930f-01221caf59a8 874 0 2025-08-19 00:15:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d78cb7dc4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 calico-apiserver-5d78cb7dc4-xtwmv eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] cali0996855bf88 [] [] }} ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-" Aug 19 00:16:10.011472 containerd[1864]: 2025-08-19 00:16:09.933 [INFO][5355] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011472 containerd[1864]: 2025-08-19 00:16:09.953 [INFO][5368] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" HandleID="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.953 [INFO][5368] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" HandleID="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3010), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-440c7464d3", "pod":"calico-apiserver-5d78cb7dc4-xtwmv", "timestamp":"2025-08-19 00:16:09.953869387 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:10.011608 
containerd[1864]: 2025-08-19 00:16:09.954 [INFO][5368] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.954 [INFO][5368] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.954 [INFO][5368] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.959 [INFO][5368] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.961 [INFO][5368] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.964 [INFO][5368] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.966 [INFO][5368] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011608 containerd[1864]: 2025-08-19 00:16:09.967 [INFO][5368] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.967 [INFO][5368] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.969 [INFO][5368] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154 Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.976 [INFO][5368] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.21.64/26 handle="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.985 [INFO][5368] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.68/26] block=192.168.21.64/26 handle="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.986 [INFO][5368] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.68/26] handle="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.986 [INFO][5368] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:10.011752 containerd[1864]: 2025-08-19 00:16:09.986 [INFO][5368] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.68/26] IPv6=[] ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" HandleID="k8s-pod-network.98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011840 containerd[1864]: 2025-08-19 00:16:09.988 [INFO][5355] cni-plugin/k8s.go 418: Populated endpoint ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0", GenerateName:"calico-apiserver-5d78cb7dc4-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ee73aa8-152c-47e9-930f-01221caf59a8", ResourceVersion:"874", 
Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d78cb7dc4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"calico-apiserver-5d78cb7dc4-xtwmv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0996855bf88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.011873 containerd[1864]: 2025-08-19 00:16:09.988 [INFO][5355] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.68/32] ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011873 containerd[1864]: 2025-08-19 00:16:09.988 [INFO][5355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0996855bf88 ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011873 containerd[1864]: 2025-08-19 
00:16:09.994 [INFO][5355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.011915 containerd[1864]: 2025-08-19 00:16:09.994 [INFO][5355] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0", GenerateName:"calico-apiserver-5d78cb7dc4-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ee73aa8-152c-47e9-930f-01221caf59a8", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d78cb7dc4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154", Pod:"calico-apiserver-5d78cb7dc4-xtwmv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.21.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0996855bf88", MAC:"d6:8a:21:dc:0a:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.011947 containerd[1864]: 2025-08-19 00:16:10.006 [INFO][5355] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-xtwmv" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--xtwmv-eth0" Aug 19 00:16:10.100336 systemd-networkd[1695]: cali025ad4b39e9: Link UP Aug 19 00:16:10.102008 systemd-networkd[1695]: cali025ad4b39e9: Gained carrier Aug 19 00:16:10.165630 containerd[1864]: 2025-08-19 00:16:10.011 [INFO][5374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0 coredns-674b8bbfcf- kube-system d2d325d7-1d07-4009-8a42-24c5de39eba0 860 0 2025-08-19 00:14:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 coredns-674b8bbfcf-dvgdj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali025ad4b39e9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-" Aug 19 00:16:10.165630 containerd[1864]: 2025-08-19 00:16:10.012 [INFO][5374] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.165630 containerd[1864]: 2025-08-19 00:16:10.041 [INFO][5399] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" HandleID="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.041 [INFO][5399] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" HandleID="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"coredns-674b8bbfcf-dvgdj", "timestamp":"2025-08-19 00:16:10.041700936 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.041 [INFO][5399] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.041 [INFO][5399] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.041 [INFO][5399] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.061 [INFO][5399] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.065 [INFO][5399] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.069 [INFO][5399] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.070 [INFO][5399] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.165770 containerd[1864]: 2025-08-19 00:16:10.072 [INFO][5399] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.072 [INFO][5399] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.073 [INFO][5399] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644 Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.082 [INFO][5399] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.089 [INFO][5399] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.21.69/26] block=192.168.21.64/26 handle="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.089 [INFO][5399] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.69/26] handle="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.089 [INFO][5399] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:10.167960 containerd[1864]: 2025-08-19 00:16:10.089 [INFO][5399] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.69/26] IPv6=[] ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" HandleID="k8s-pod-network.2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.092 [INFO][5374] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d2d325d7-1d07-4009-8a42-24c5de39eba0", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 14, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"coredns-674b8bbfcf-dvgdj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali025ad4b39e9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.092 [INFO][5374] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.69/32] ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.093 [INFO][5374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali025ad4b39e9 ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.102 [INFO][5374] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.102 [INFO][5374] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d2d325d7-1d07-4009-8a42-24c5de39eba0", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 14, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644", Pod:"coredns-674b8bbfcf-dvgdj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali025ad4b39e9", MAC:"02:c1:84:3d:49:ef", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.168065 containerd[1864]: 2025-08-19 00:16:10.116 [INFO][5374] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" Namespace="kube-system" Pod="coredns-674b8bbfcf-dvgdj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--dvgdj-eth0" Aug 19 00:16:10.200434 systemd-networkd[1695]: cali8869d4de77d: Link UP Aug 19 00:16:10.201977 systemd-networkd[1695]: cali8869d4de77d: Gained carrier Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.043 [INFO][5392] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0 coredns-674b8bbfcf- kube-system 891bd3f0-8d1a-448c-a33f-43deb1ae5105 866 0 2025-08-19 00:14:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 coredns-674b8bbfcf-769sj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8869d4de77d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-" Aug 19 
00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.043 [INFO][5392] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.066 [INFO][5412] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" HandleID="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.066 [INFO][5412] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" HandleID="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"coredns-674b8bbfcf-769sj", "timestamp":"2025-08-19 00:16:10.066004666 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.066 [INFO][5412] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.089 [INFO][5412] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.090 [INFO][5412] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.161 [INFO][5412] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.169 [INFO][5412] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.174 [INFO][5412] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.176 [INFO][5412] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.178 [INFO][5412] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.178 [INFO][5412] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.179 [INFO][5412] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0 Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.183 [INFO][5412] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.193 [INFO][5412] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.21.70/26] block=192.168.21.64/26 handle="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.193 [INFO][5412] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.70/26] handle="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.193 [INFO][5412] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:10.231459 containerd[1864]: 2025-08-19 00:16:10.193 [INFO][5412] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.70/26] IPv6=[] ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" HandleID="k8s-pod-network.c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Workload="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.194 [INFO][5392] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"891bd3f0-8d1a-448c-a33f-43deb1ae5105", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 14, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"coredns-674b8bbfcf-769sj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8869d4de77d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.195 [INFO][5392] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.70/32] ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.195 [INFO][5392] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8869d4de77d ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.206 [INFO][5392] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.210 [INFO][5392] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"891bd3f0-8d1a-448c-a33f-43deb1ae5105", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 14, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0", Pod:"coredns-674b8bbfcf-769sj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8869d4de77d", MAC:"3e:3d:88:a5:b1:3f", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:10.231809 containerd[1864]: 2025-08-19 00:16:10.224 [INFO][5392] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" Namespace="kube-system" Pod="coredns-674b8bbfcf-769sj" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-coredns--674b8bbfcf--769sj-eth0" Aug 19 00:16:11.013376 systemd-networkd[1695]: cali0996855bf88: Gained IPv6LL Aug 19 00:16:12.037405 systemd-networkd[1695]: cali8869d4de77d: Gained IPv6LL Aug 19 00:16:12.165369 systemd-networkd[1695]: cali025ad4b39e9: Gained IPv6LL Aug 19 00:16:16.637979 systemd-networkd[1695]: cali1145d25bd81: Link UP Aug 19 00:16:16.638348 systemd-networkd[1695]: cali1145d25bd81: Gained carrier Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.583 [INFO][5451] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0 calico-apiserver-6c8858db87- calico-apiserver 06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8 871 0 2025-08-19 00:15:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c8858db87 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 calico-apiserver-6c8858db87-7lbsf eth0 calico-apiserver [] 
[] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1145d25bd81 [] [] }} ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.583 [INFO][5451] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.600 [INFO][5464] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.600 [INFO][5464] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002caff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-440c7464d3", "pod":"calico-apiserver-6c8858db87-7lbsf", "timestamp":"2025-08-19 00:16:16.600394488 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 
00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.600 [INFO][5464] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.600 [INFO][5464] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.600 [INFO][5464] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.605 [INFO][5464] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.608 [INFO][5464] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.610 [INFO][5464] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.612 [INFO][5464] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.613 [INFO][5464] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.613 [INFO][5464] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.614 [INFO][5464] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.622 [INFO][5464] ipam/ipam.go 1243: Writing block in 
order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.632 [INFO][5464] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.71/26] block=192.168.21.64/26 handle="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.632 [INFO][5464] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.71/26] handle="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.632 [INFO][5464] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:16.650690 containerd[1864]: 2025-08-19 00:16:16.632 [INFO][5464] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.71/26] IPv6=[] ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.652311 containerd[1864]: 2025-08-19 00:16:16.634 [INFO][5451] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0", GenerateName:"calico-apiserver-6c8858db87-", Namespace:"calico-apiserver", SelfLink:"", UID:"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8", 
ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c8858db87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"calico-apiserver-6c8858db87-7lbsf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1145d25bd81", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:16.652311 containerd[1864]: 2025-08-19 00:16:16.634 [INFO][5451] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.71/32] ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.652311 containerd[1864]: 2025-08-19 00:16:16.634 [INFO][5451] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1145d25bd81 ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.652311 containerd[1864]: 
2025-08-19 00:16:16.635 [INFO][5451] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.652311 containerd[1864]: 2025-08-19 00:16:16.635 [INFO][5451] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0", GenerateName:"calico-apiserver-6c8858db87-", Namespace:"calico-apiserver", SelfLink:"", UID:"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c8858db87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b", Pod:"calico-apiserver-6c8858db87-7lbsf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.21.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1145d25bd81", MAC:"b6:38:c3:e7:cb:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:16.652311 containerd[1864]: 2025-08-19 00:16:16.647 [INFO][5451] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-7lbsf" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:16.872784 systemd-networkd[1695]: cali3dc735b92a3: Link UP Aug 19 00:16:16.873499 systemd-networkd[1695]: cali3dc735b92a3: Gained carrier Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.806 [INFO][5481] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0 csi-node-driver- calico-system 82bb62c5-1662-42fe-899a-c8b29c1bb13e 718 0 2025-08-19 00:15:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 csi-node-driver-c697k eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3dc735b92a3 [] [] }} ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.806 [INFO][5481] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.832 [INFO][5505] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" HandleID="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Workload="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.832 [INFO][5505] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" HandleID="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Workload="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-440c7464d3", "pod":"csi-node-driver-c697k", "timestamp":"2025-08-19 00:16:16.832810404 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.833 [INFO][5505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.833 [INFO][5505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.833 [INFO][5505] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.839 [INFO][5505] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.843 [INFO][5505] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.846 [INFO][5505] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.847 [INFO][5505] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.850 [INFO][5505] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.851 [INFO][5505] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.853 [INFO][5505] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471 Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.857 [INFO][5505] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.866 [INFO][5505] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.21.72/26] block=192.168.21.64/26 handle="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.866 [INFO][5505] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.72/26] handle="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.866 [INFO][5505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:16.887377 containerd[1864]: 2025-08-19 00:16:16.866 [INFO][5505] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.72/26] IPv6=[] ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" HandleID="k8s-pod-network.43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Workload="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887762 containerd[1864]: 2025-08-19 00:16:16.869 [INFO][5481] cni-plugin/k8s.go 418: Populated endpoint ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"82bb62c5-1662-42fe-899a-c8b29c1bb13e", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"csi-node-driver-c697k", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3dc735b92a3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:16.887762 containerd[1864]: 2025-08-19 00:16:16.869 [INFO][5481] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.72/32] ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887762 containerd[1864]: 2025-08-19 00:16:16.869 [INFO][5481] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3dc735b92a3 ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887762 containerd[1864]: 2025-08-19 00:16:16.873 [INFO][5481] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.887762 
containerd[1864]: 2025-08-19 00:16:16.874 [INFO][5481] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"82bb62c5-1662-42fe-899a-c8b29c1bb13e", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471", Pod:"csi-node-driver-c697k", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3dc735b92a3", MAC:"9e:1b:77:a7:ab:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:16.887762 containerd[1864]: 
2025-08-19 00:16:16.884 [INFO][5481] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" Namespace="calico-system" Pod="csi-node-driver-c697k" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-csi--node--driver--c697k-eth0" Aug 19 00:16:16.927774 containerd[1864]: time="2025-08-19T00:16:16.927575348Z" level=info msg="connecting to shim 98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154" address="unix:///run/containerd/s/c72bfb8772a9c7699151cbdba7303f96acb9d95c924dfdff59f94384d15e8a5c" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:16.934939 containerd[1864]: time="2025-08-19T00:16:16.934888903Z" level=info msg="connecting to shim 2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644" address="unix:///run/containerd/s/d598ed079a2cdb294fe5233f31a4bd156f793a0b164e25bd2fa8e60e10e31e5a" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:16.968358 systemd[1]: Started cri-containerd-98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154.scope - libcontainer container 98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154. Aug 19 00:16:16.972554 systemd[1]: Started cri-containerd-2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644.scope - libcontainer container 2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644. 
Aug 19 00:16:16.976199 containerd[1864]: time="2025-08-19T00:16:16.976137627Z" level=info msg="connecting to shim c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0" address="unix:///run/containerd/s/c2fc9726bc8acec63048c4f12ccfa9a80a9adc5ff2294457bc57a5a2357140bd" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:16.988608 containerd[1864]: time="2025-08-19T00:16:16.988155739Z" level=info msg="connecting to shim 8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" address="unix:///run/containerd/s/9c517928a4d0393e23dd4a3a0d2ec0c9cfbd960d4e8e1be64cba0e93c8a84026" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:16.992158 containerd[1864]: time="2025-08-19T00:16:16.991941549Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:16.995911 containerd[1864]: time="2025-08-19T00:16:16.995802784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Aug 19 00:16:17.012728 systemd-networkd[1695]: cali43af0d6c25f: Link UP Aug 19 00:16:17.014151 systemd-networkd[1695]: cali43af0d6c25f: Gained carrier Aug 19 00:16:17.021108 containerd[1864]: time="2025-08-19T00:16:17.021077270Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:17.034026 containerd[1864]: time="2025-08-19T00:16:17.033999345Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:17.034696 systemd[1]: Started cri-containerd-8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b.scope - libcontainer container 8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b. 
Aug 19 00:16:17.039320 systemd[1]: Started cri-containerd-c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0.scope - libcontainer container c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0. Aug 19 00:16:17.040872 containerd[1864]: time="2025-08-19T00:16:17.040749371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 11.007774986s" Aug 19 00:16:17.040872 containerd[1864]: time="2025-08-19T00:16:17.040777484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Aug 19 00:16:17.054988 containerd[1864]: time="2025-08-19T00:16:17.054958181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.807 [INFO][5492] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0 calico-apiserver-6c8858db87- calico-apiserver 48f4c9e3-c928-4ddd-bc8b-c6722f68bb65 872 0 2025-08-19 00:15:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c8858db87 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 calico-apiserver-6c8858db87-l566d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali43af0d6c25f [] [] }} ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" 
Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.807 [INFO][5492] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.835 [INFO][5506] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.835 [INFO][5506] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-440c7464d3", "pod":"calico-apiserver-6c8858db87-l566d", "timestamp":"2025-08-19 00:16:16.835191572 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.835 [INFO][5506] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.867 [INFO][5506] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.867 [INFO][5506] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.940 [INFO][5506] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.947 [INFO][5506] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.954 [INFO][5506] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.957 [INFO][5506] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.966 [INFO][5506] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.967 [INFO][5506] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.973 [INFO][5506] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.986 [INFO][5506] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" 
host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.997 [INFO][5506] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.73/26] block=192.168.21.64/26 handle="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.997 [INFO][5506] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.73/26] handle="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.997 [INFO][5506] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:17.059542 containerd[1864]: 2025-08-19 00:16:16.998 [INFO][5506] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.73/26] IPv6=[] ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.004 [INFO][5492] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0", GenerateName:"calico-apiserver-6c8858db87-", Namespace:"calico-apiserver", SelfLink:"", UID:"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 9, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c8858db87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"calico-apiserver-6c8858db87-l566d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43af0d6c25f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.004 [INFO][5492] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.73/32] ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.004 [INFO][5492] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43af0d6c25f ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.015 [INFO][5492] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.019 [INFO][5492] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0", GenerateName:"calico-apiserver-6c8858db87-", Namespace:"calico-apiserver", SelfLink:"", UID:"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 15, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c8858db87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea", Pod:"calico-apiserver-6c8858db87-l566d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43af0d6c25f", MAC:"7e:3e:37:5d:fd:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:17.059888 containerd[1864]: 2025-08-19 00:16:17.043 [INFO][5492] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Namespace="calico-apiserver" Pod="calico-apiserver-6c8858db87-l566d" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:17.063601 containerd[1864]: time="2025-08-19T00:16:17.063308247Z" level=info msg="connecting to shim 43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471" address="unix:///run/containerd/s/7df30dc771e96ae4cb6d59bbc1ea3382b4e422219759ac8b8551840de16d9f6e" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:17.072535 containerd[1864]: time="2025-08-19T00:16:17.071713659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dvgdj,Uid:d2d325d7-1d07-4009-8a42-24c5de39eba0,Namespace:kube-system,Attempt:0,} returns sandbox id \"2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644\"" Aug 19 00:16:17.072695 containerd[1864]: time="2025-08-19T00:16:17.072085638Z" level=info msg="CreateContainer within sandbox \"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 00:16:17.089452 containerd[1864]: time="2025-08-19T00:16:17.089429710Z" level=info msg="CreateContainer within sandbox \"2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:16:17.116576 systemd[1]: Started cri-containerd-43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471.scope - libcontainer container 
43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471. Aug 19 00:16:17.125222 containerd[1864]: time="2025-08-19T00:16:17.124884684Z" level=info msg="Container 1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:17.153024 containerd[1864]: time="2025-08-19T00:16:17.152991935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-769sj,Uid:891bd3f0-8d1a-448c-a33f-43deb1ae5105,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0\"" Aug 19 00:16:17.163885 containerd[1864]: time="2025-08-19T00:16:17.163863732Z" level=info msg="CreateContainer within sandbox \"c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:16:17.174275 containerd[1864]: time="2025-08-19T00:16:17.174246188Z" level=info msg="Container c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:17.177732 containerd[1864]: time="2025-08-19T00:16:17.177609288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-7lbsf,Uid:06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\"" Aug 19 00:16:17.181326 containerd[1864]: time="2025-08-19T00:16:17.181251677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-xtwmv,Uid:3ee73aa8-152c-47e9-930f-01221caf59a8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154\"" Aug 19 00:16:17.188862 containerd[1864]: time="2025-08-19T00:16:17.188754318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c697k,Uid:82bb62c5-1662-42fe-899a-c8b29c1bb13e,Namespace:calico-system,Attempt:0,} 
returns sandbox id \"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471\"" Aug 19 00:16:17.189268 containerd[1864]: time="2025-08-19T00:16:17.189137554Z" level=info msg="connecting to shim 0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" address="unix:///run/containerd/s/c86dec53bd1ffd45c53366d49e602cc8538b215dc23a79109be399f78530c311" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:17.199080 containerd[1864]: time="2025-08-19T00:16:17.198812148Z" level=info msg="CreateContainer within sandbox \"c66edacc3ee17394bb5afe980ad8dd1b507828408d0b7fd4059b43996b7fc724\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c\"" Aug 19 00:16:17.200153 containerd[1864]: time="2025-08-19T00:16:17.199900268Z" level=info msg="StartContainer for \"1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c\"" Aug 19 00:16:17.202846 containerd[1864]: time="2025-08-19T00:16:17.202806203Z" level=info msg="connecting to shim 1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c" address="unix:///run/containerd/s/23a17e56ec8841122a14295f614293f8232be4a82b73e2852af4232b2334ec4c" protocol=ttrpc version=3 Aug 19 00:16:17.211544 systemd[1]: Started cri-containerd-0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea.scope - libcontainer container 0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea. 
Aug 19 00:16:17.216960 containerd[1864]: time="2025-08-19T00:16:17.216682947Z" level=info msg="CreateContainer within sandbox \"2d6bbaa2f9a3374aaee7e73998dd19aa54847898467c784141e4670bb6a17644\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14\"" Aug 19 00:16:17.219524 containerd[1864]: time="2025-08-19T00:16:17.219490375Z" level=info msg="StartContainer for \"c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14\"" Aug 19 00:16:17.220612 containerd[1864]: time="2025-08-19T00:16:17.220592688Z" level=info msg="connecting to shim c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14" address="unix:///run/containerd/s/d598ed079a2cdb294fe5233f31a4bd156f793a0b164e25bd2fa8e60e10e31e5a" protocol=ttrpc version=3 Aug 19 00:16:17.229470 systemd[1]: Started cri-containerd-1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c.scope - libcontainer container 1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c. Aug 19 00:16:17.238688 containerd[1864]: time="2025-08-19T00:16:17.238661774Z" level=info msg="Container 53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:17.240529 systemd[1]: Started cri-containerd-c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14.scope - libcontainer container c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14. 
Aug 19 00:16:17.368670 containerd[1864]: time="2025-08-19T00:16:17.368642605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c8858db87-l566d,Uid:48f4c9e3-c928-4ddd-bc8b-c6722f68bb65,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\"" Aug 19 00:16:17.369515 containerd[1864]: time="2025-08-19T00:16:17.369472141Z" level=info msg="CreateContainer within sandbox \"c4bfa62dd53227bdfd571d386bd613c83439b30dc0746c090358c51a10d6b3d0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651\"" Aug 19 00:16:17.370186 containerd[1864]: time="2025-08-19T00:16:17.370167242Z" level=info msg="StartContainer for \"c78f0a073def961f3045bd038412c307f8d750ebe7c5c5635367effee6012a14\" returns successfully" Aug 19 00:16:17.371029 containerd[1864]: time="2025-08-19T00:16:17.370987587Z" level=info msg="StartContainer for \"53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651\"" Aug 19 00:16:17.372253 containerd[1864]: time="2025-08-19T00:16:17.372045827Z" level=info msg="StartContainer for \"1f919837242bb82b9e1d193a5b8cd66fdbc00925636bbc955771917023ac183c\" returns successfully" Aug 19 00:16:17.411838 containerd[1864]: time="2025-08-19T00:16:17.411811778Z" level=info msg="connecting to shim 53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651" address="unix:///run/containerd/s/c2fc9726bc8acec63048c4f12ccfa9a80a9adc5ff2294457bc57a5a2357140bd" protocol=ttrpc version=3 Aug 19 00:16:17.426368 systemd[1]: Started cri-containerd-53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651.scope - libcontainer container 53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651. 
Aug 19 00:16:17.468996 containerd[1864]: time="2025-08-19T00:16:17.468902825Z" level=info msg="StartContainer for \"53b1f69556831150e91de301c396a0a72d6a0f0bd40d4d92b574f0caadbf8651\" returns successfully" Aug 19 00:16:18.159380 kubelet[3451]: I0819 00:16:18.159314 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dvgdj" podStartSLOduration=79.159302249 podStartE2EDuration="1m19.159302249s" podCreationTimestamp="2025-08-19 00:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:16:18.139728551 +0000 UTC m=+86.351440936" watchObservedRunningTime="2025-08-19 00:16:18.159302249 +0000 UTC m=+86.371014594" Aug 19 00:16:18.211767 kubelet[3451]: I0819 00:16:18.211724 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-769sj" podStartSLOduration=79.211711668 podStartE2EDuration="1m19.211711668s" podCreationTimestamp="2025-08-19 00:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:16:18.195080793 +0000 UTC m=+86.406793146" watchObservedRunningTime="2025-08-19 00:16:18.211711668 +0000 UTC m=+86.423424013" Aug 19 00:16:18.245346 systemd-networkd[1695]: cali43af0d6c25f: Gained IPv6LL Aug 19 00:16:18.565330 systemd-networkd[1695]: cali1145d25bd81: Gained IPv6LL Aug 19 00:16:18.694494 systemd-networkd[1695]: cali3dc735b92a3: Gained IPv6LL Aug 19 00:16:22.958735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3780027529.mount: Deactivated successfully. 
Aug 19 00:16:23.668867 containerd[1864]: time="2025-08-19T00:16:23.668813883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:23.715876 containerd[1864]: time="2025-08-19T00:16:23.715822868Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Aug 19 00:16:23.762667 containerd[1864]: time="2025-08-19T00:16:23.762616134Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:23.812960 containerd[1864]: time="2025-08-19T00:16:23.812917681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:23.813619 containerd[1864]: time="2025-08-19T00:16:23.813515019Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 6.758530285s" Aug 19 00:16:23.813619 containerd[1864]: time="2025-08-19T00:16:23.813541468Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Aug 19 00:16:23.814689 containerd[1864]: time="2025-08-19T00:16:23.814558050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:16:23.861381 containerd[1864]: time="2025-08-19T00:16:23.861342060Z" level=info msg="CreateContainer within sandbox 
\"88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 00:16:25.815867 containerd[1864]: time="2025-08-19T00:16:25.815279101Z" level=info msg="Container aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:26.028029 containerd[1864]: time="2025-08-19T00:16:26.027983915Z" level=info msg="CreateContainer within sandbox \"88a5f3f7fa4f50f1629ab7ca57cc61a8a830bad57d265fc1684f3e65c452c55c\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\"" Aug 19 00:16:26.028619 containerd[1864]: time="2025-08-19T00:16:26.028601533Z" level=info msg="StartContainer for \"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\"" Aug 19 00:16:26.030058 containerd[1864]: time="2025-08-19T00:16:26.030035272Z" level=info msg="connecting to shim aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620" address="unix:///run/containerd/s/e93bf397fc3f92170c5222ad4ab56fea84a649fe4d923abaec1cb8db3245da5f" protocol=ttrpc version=3 Aug 19 00:16:26.063377 systemd[1]: Started cri-containerd-aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620.scope - libcontainer container aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620. 
Aug 19 00:16:26.094352 containerd[1864]: time="2025-08-19T00:16:26.093841512Z" level=info msg="StartContainer for \"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" returns successfully" Aug 19 00:16:26.177697 kubelet[3451]: I0819 00:16:26.177587 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-79955b9856-vwdq2" podStartSLOduration=9.818440777 podStartE2EDuration="30.177574565s" podCreationTimestamp="2025-08-19 00:15:56 +0000 UTC" firstStartedPulling="2025-08-19 00:15:56.6880508 +0000 UTC m=+64.899763145" lastFinishedPulling="2025-08-19 00:16:17.04718458 +0000 UTC m=+85.258896933" observedRunningTime="2025-08-19 00:16:18.212971769 +0000 UTC m=+86.424684114" watchObservedRunningTime="2025-08-19 00:16:26.177574565 +0000 UTC m=+94.389286910" Aug 19 00:16:26.179057 kubelet[3451]: I0819 00:16:26.177940 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-tl6wx" podStartSLOduration=60.225757654 podStartE2EDuration="1m14.177932776s" podCreationTimestamp="2025-08-19 00:15:12 +0000 UTC" firstStartedPulling="2025-08-19 00:16:09.862270685 +0000 UTC m=+78.073983030" lastFinishedPulling="2025-08-19 00:16:23.814445807 +0000 UTC m=+92.026158152" observedRunningTime="2025-08-19 00:16:26.176698195 +0000 UTC m=+94.388410540" watchObservedRunningTime="2025-08-19 00:16:26.177932776 +0000 UTC m=+94.389645121" Aug 19 00:16:27.135242 containerd[1864]: time="2025-08-19T00:16:27.135167562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"ae81b168beabbb711ac7356b943d4aa41a0a024685359d38a1e912a69c510928\" pid:5991 exited_at:{seconds:1755562587 nanos:134314512}" Aug 19 00:16:27.208515 containerd[1864]: time="2025-08-19T00:16:27.208484343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" 
id:\"0de7eb52b4a156510419a433adc6ceb0962c9f6230c0c686ee245f1f7b72fa8a\" pid:6017 exit_status:1 exited_at:{seconds:1755562587 nanos:208138188}" Aug 19 00:16:28.251628 containerd[1864]: time="2025-08-19T00:16:28.251582981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"3851beaa900029700adc50adf07cf2f6903f87b4adf8703a84af3474a232fbb7\" pid:6040 exit_status:1 exited_at:{seconds:1755562588 nanos:251156849}" Aug 19 00:16:32.414061 containerd[1864]: time="2025-08-19T00:16:32.413567132Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:32.417735 containerd[1864]: time="2025-08-19T00:16:32.417711600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Aug 19 00:16:32.420987 containerd[1864]: time="2025-08-19T00:16:32.420961882Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:32.424827 containerd[1864]: time="2025-08-19T00:16:32.424789780Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:32.425224 containerd[1864]: time="2025-08-19T00:16:32.425055604Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 8.610475465s" Aug 19 00:16:32.425224 containerd[1864]: time="2025-08-19T00:16:32.425083501Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:16:32.425994 containerd[1864]: time="2025-08-19T00:16:32.425973120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:16:32.432129 containerd[1864]: time="2025-08-19T00:16:32.432049021Z" level=info msg="CreateContainer within sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:16:32.454818 containerd[1864]: time="2025-08-19T00:16:32.454333448Z" level=info msg="Container 4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:32.473069 containerd[1864]: time="2025-08-19T00:16:32.473038288Z" level=info msg="CreateContainer within sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\"" Aug 19 00:16:32.473566 containerd[1864]: time="2025-08-19T00:16:32.473545231Z" level=info msg="StartContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\"" Aug 19 00:16:32.474825 containerd[1864]: time="2025-08-19T00:16:32.474762908Z" level=info msg="connecting to shim 4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4" address="unix:///run/containerd/s/9c517928a4d0393e23dd4a3a0d2ec0c9cfbd960d4e8e1be64cba0e93c8a84026" protocol=ttrpc version=3 Aug 19 00:16:32.492345 systemd[1]: Started cri-containerd-4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4.scope - libcontainer container 4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4. 
Aug 19 00:16:32.527063 containerd[1864]: time="2025-08-19T00:16:32.527033096Z" level=info msg="StartContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" returns successfully" Aug 19 00:16:33.009843 containerd[1864]: time="2025-08-19T00:16:33.009791934Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:33.013363 containerd[1864]: time="2025-08-19T00:16:33.013173771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 00:16:33.014699 containerd[1864]: time="2025-08-19T00:16:33.014675816Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 588.575197ms" Aug 19 00:16:33.014699 containerd[1864]: time="2025-08-19T00:16:33.014700440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:16:33.018247 containerd[1864]: time="2025-08-19T00:16:33.017263677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 00:16:33.108392 containerd[1864]: time="2025-08-19T00:16:33.108270672Z" level=info msg="CreateContainer within sandbox \"98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:16:33.199992 kubelet[3451]: I0819 00:16:33.199943 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c8858db87-7lbsf" podStartSLOduration=68.952805917 podStartE2EDuration="1m24.199928367s" 
podCreationTimestamp="2025-08-19 00:15:09 +0000 UTC" firstStartedPulling="2025-08-19 00:16:17.178771555 +0000 UTC m=+85.390483900" lastFinishedPulling="2025-08-19 00:16:32.425894005 +0000 UTC m=+100.637606350" observedRunningTime="2025-08-19 00:16:33.198921345 +0000 UTC m=+101.410633698" watchObservedRunningTime="2025-08-19 00:16:33.199928367 +0000 UTC m=+101.411640712" Aug 19 00:16:34.063396 containerd[1864]: time="2025-08-19T00:16:34.063351668Z" level=info msg="Container 4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:34.067171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2351070665.mount: Deactivated successfully. Aug 19 00:16:34.186296 containerd[1864]: time="2025-08-19T00:16:34.186246145Z" level=info msg="CreateContainer within sandbox \"98d275d488970cc75f2786db7337d21e7b640c6e764ecfa2dbb06d1e8cd26154\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212\"" Aug 19 00:16:34.187490 containerd[1864]: time="2025-08-19T00:16:34.186698647Z" level=info msg="StartContainer for \"4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212\"" Aug 19 00:16:34.187490 containerd[1864]: time="2025-08-19T00:16:34.187433165Z" level=info msg="connecting to shim 4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212" address="unix:///run/containerd/s/c72bfb8772a9c7699151cbdba7303f96acb9d95c924dfdff59f94384d15e8a5c" protocol=ttrpc version=3 Aug 19 00:16:34.207365 systemd[1]: Started cri-containerd-4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212.scope - libcontainer container 4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212. 
Aug 19 00:16:34.247068 containerd[1864]: time="2025-08-19T00:16:34.247045245Z" level=info msg="StartContainer for \"4cd1c59ab3cbbce4e8d547e83de8697b80ac693d6efcd8892ab2a1b9bb455212\" returns successfully" Aug 19 00:16:35.117920 containerd[1864]: time="2025-08-19T00:16:35.117144457Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"1d747d47c887253e12956c51db5a37376079ed6a39b78fea5e0a81325faa611d\" pid:6145 exited_at:{seconds:1755562595 nanos:116859568}" Aug 19 00:16:35.569667 containerd[1864]: time="2025-08-19T00:16:35.569615925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:35.573791 containerd[1864]: time="2025-08-19T00:16:35.573755856Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Aug 19 00:16:35.577724 containerd[1864]: time="2025-08-19T00:16:35.577690718Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:35.583139 containerd[1864]: time="2025-08-19T00:16:35.583039678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:35.583701 containerd[1864]: time="2025-08-19T00:16:35.583641928Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 2.566354402s" Aug 19 00:16:35.583701 containerd[1864]: 
time="2025-08-19T00:16:35.583668337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Aug 19 00:16:35.584989 containerd[1864]: time="2025-08-19T00:16:35.584963416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:16:35.591372 containerd[1864]: time="2025-08-19T00:16:35.591338023Z" level=info msg="CreateContainer within sandbox \"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 00:16:35.621526 containerd[1864]: time="2025-08-19T00:16:35.621427491Z" level=info msg="Container 2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:35.624012 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1588568810.mount: Deactivated successfully. Aug 19 00:16:35.645205 containerd[1864]: time="2025-08-19T00:16:35.645167153Z" level=info msg="CreateContainer within sandbox \"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302\"" Aug 19 00:16:35.646156 containerd[1864]: time="2025-08-19T00:16:35.646130134Z" level=info msg="StartContainer for \"2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302\"" Aug 19 00:16:35.647467 containerd[1864]: time="2025-08-19T00:16:35.647442133Z" level=info msg="connecting to shim 2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302" address="unix:///run/containerd/s/7df30dc771e96ae4cb6d59bbc1ea3382b4e422219759ac8b8551840de16d9f6e" protocol=ttrpc version=3 Aug 19 00:16:35.669574 systemd[1]: Started cri-containerd-2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302.scope - libcontainer container 2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302. 
Aug 19 00:16:35.702807 containerd[1864]: time="2025-08-19T00:16:35.702777373Z" level=info msg="StartContainer for \"2211c6ec12e385673b101377236d0891d88882468e7959064b79dec40c83a302\" returns successfully" Aug 19 00:16:35.920327 containerd[1864]: time="2025-08-19T00:16:35.920210024Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:35.923058 containerd[1864]: time="2025-08-19T00:16:35.923018348Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 00:16:35.924304 containerd[1864]: time="2025-08-19T00:16:35.924279313Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 339.290649ms" Aug 19 00:16:35.924304 containerd[1864]: time="2025-08-19T00:16:35.924304090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:16:35.925908 containerd[1864]: time="2025-08-19T00:16:35.925438924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 00:16:35.932539 containerd[1864]: time="2025-08-19T00:16:35.932513632Z" level=info msg="CreateContainer within sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:16:35.959249 containerd[1864]: time="2025-08-19T00:16:35.959140341Z" level=info msg="Container 3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:35.977319 
containerd[1864]: time="2025-08-19T00:16:35.977290732Z" level=info msg="CreateContainer within sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\"" Aug 19 00:16:35.978024 containerd[1864]: time="2025-08-19T00:16:35.977711232Z" level=info msg="StartContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\"" Aug 19 00:16:35.979071 containerd[1864]: time="2025-08-19T00:16:35.979038176Z" level=info msg="connecting to shim 3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1" address="unix:///run/containerd/s/c86dec53bd1ffd45c53366d49e602cc8538b215dc23a79109be399f78530c311" protocol=ttrpc version=3 Aug 19 00:16:35.995336 systemd[1]: Started cri-containerd-3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1.scope - libcontainer container 3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1. 
Aug 19 00:16:36.027389 containerd[1864]: time="2025-08-19T00:16:36.027337893Z" level=info msg="StartContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" returns successfully" Aug 19 00:16:36.211383 kubelet[3451]: I0819 00:16:36.210556 3451 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:16:36.229728 kubelet[3451]: I0819 00:16:36.229551 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-xtwmv" podStartSLOduration=70.396112158 podStartE2EDuration="1m26.229539312s" podCreationTimestamp="2025-08-19 00:15:10 +0000 UTC" firstStartedPulling="2025-08-19 00:16:17.182724458 +0000 UTC m=+85.394436803" lastFinishedPulling="2025-08-19 00:16:33.016151612 +0000 UTC m=+101.227863957" observedRunningTime="2025-08-19 00:16:35.23400301 +0000 UTC m=+103.445715387" watchObservedRunningTime="2025-08-19 00:16:36.229539312 +0000 UTC m=+104.441251657" Aug 19 00:16:36.230110 kubelet[3451]: I0819 00:16:36.229964 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c8858db87-l566d" podStartSLOduration=68.676527569 podStartE2EDuration="1m27.229954212s" podCreationTimestamp="2025-08-19 00:15:09 +0000 UTC" firstStartedPulling="2025-08-19 00:16:17.371398007 +0000 UTC m=+85.583110352" lastFinishedPulling="2025-08-19 00:16:35.92482465 +0000 UTC m=+104.136536995" observedRunningTime="2025-08-19 00:16:36.225819208 +0000 UTC m=+104.437531553" watchObservedRunningTime="2025-08-19 00:16:36.229954212 +0000 UTC m=+104.441666613" Aug 19 00:16:36.961191 systemd[1]: Created slice kubepods-besteffort-podb181315a_7951_4f89_aca0_84d4e3911a4c.slice - libcontainer container kubepods-besteffort-podb181315a_7951_4f89_aca0_84d4e3911a4c.slice. 
Aug 19 00:16:37.047433 kubelet[3451]: I0819 00:16:37.047355 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b181315a-7951-4f89-aca0-84d4e3911a4c-calico-apiserver-certs\") pod \"calico-apiserver-5d78cb7dc4-fwb9m\" (UID: \"b181315a-7951-4f89-aca0-84d4e3911a4c\") " pod="calico-apiserver/calico-apiserver-5d78cb7dc4-fwb9m" Aug 19 00:16:37.047433 kubelet[3451]: I0819 00:16:37.047400 3451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dblzb\" (UniqueName: \"kubernetes.io/projected/b181315a-7951-4f89-aca0-84d4e3911a4c-kube-api-access-dblzb\") pod \"calico-apiserver-5d78cb7dc4-fwb9m\" (UID: \"b181315a-7951-4f89-aca0-84d4e3911a4c\") " pod="calico-apiserver/calico-apiserver-5d78cb7dc4-fwb9m" Aug 19 00:16:37.213829 kubelet[3451]: I0819 00:16:37.212829 3451 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:16:37.264982 containerd[1864]: time="2025-08-19T00:16:37.264768131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-fwb9m,Uid:b181315a-7951-4f89-aca0-84d4e3911a4c,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:16:37.493694 systemd-networkd[1695]: cali7c39425efc2: Link UP Aug 19 00:16:37.496015 systemd-networkd[1695]: cali7c39425efc2: Gained carrier Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.336 [INFO][6237] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0 calico-apiserver-5d78cb7dc4- calico-apiserver b181315a-7951-4f89-aca0-84d4e3911a4c 1193 0 2025-08-19 00:16:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d78cb7dc4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-440c7464d3 calico-apiserver-5d78cb7dc4-fwb9m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7c39425efc2 [] [] }} ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.337 [INFO][6237] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.391 [INFO][6249] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" HandleID="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.392 [INFO][6249] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" HandleID="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d36b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-440c7464d3", "pod":"calico-apiserver-5d78cb7dc4-fwb9m", "timestamp":"2025-08-19 00:16:37.390454819 +0000 UTC"}, Hostname:"ci-4426.0.0-a-440c7464d3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.392 [INFO][6249] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.392 [INFO][6249] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.392 [INFO][6249] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-440c7464d3' Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.414 [INFO][6249] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.423 [INFO][6249] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.434 [INFO][6249] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.437 [INFO][6249] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.440 [INFO][6249] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.440 [INFO][6249] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.441 [INFO][6249] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223 Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.461 [INFO][6249] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.482 [INFO][6249] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.74/26] block=192.168.21.64/26 handle="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.482 [INFO][6249] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.74/26] handle="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" host="ci-4426.0.0-a-440c7464d3" Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.482 [INFO][6249] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:16:37.533236 containerd[1864]: 2025-08-19 00:16:37.482 [INFO][6249] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.74/26] IPv6=[] ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" HandleID="k8s-pod-network.4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.486 [INFO][6237] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0", GenerateName:"calico-apiserver-5d78cb7dc4-", Namespace:"calico-apiserver", SelfLink:"", UID:"b181315a-7951-4f89-aca0-84d4e3911a4c", ResourceVersion:"1193", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 16, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d78cb7dc4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"", Pod:"calico-apiserver-5d78cb7dc4-fwb9m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.21.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7c39425efc2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.486 [INFO][6237] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.74/32] ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.486 [INFO][6237] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7c39425efc2 ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.496 [INFO][6237] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.498 [INFO][6237] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0", GenerateName:"calico-apiserver-5d78cb7dc4-", Namespace:"calico-apiserver", SelfLink:"", UID:"b181315a-7951-4f89-aca0-84d4e3911a4c", ResourceVersion:"1193", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 16, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d78cb7dc4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-440c7464d3", ContainerID:"4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223", Pod:"calico-apiserver-5d78cb7dc4-fwb9m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7c39425efc2", MAC:"5e:d5:47:1c:be:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:16:37.533620 containerd[1864]: 2025-08-19 00:16:37.529 [INFO][6237] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" Namespace="calico-apiserver" Pod="calico-apiserver-5d78cb7dc4-fwb9m" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--5d78cb7dc4--fwb9m-eth0" Aug 19 00:16:38.214878 kubelet[3451]: I0819 00:16:38.214844 3451 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Aug 19 00:16:38.217038 containerd[1864]: time="2025-08-19T00:16:38.216910352Z" level=info msg="StopContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" with timeout 30 (s)" Aug 19 00:16:39.222184 containerd[1864]: time="2025-08-19T00:16:39.222143866Z" level=info msg="Stop container \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" with signal terminated" Aug 19 00:16:39.237091 systemd[1]: cri-containerd-3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1.scope: Deactivated successfully. Aug 19 00:16:39.243161 containerd[1864]: time="2025-08-19T00:16:39.243133998Z" level=info msg="received exit event container_id:\"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" id:\"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" pid:6203 exit_status:1 exited_at:{seconds:1755562599 nanos:242870046}" Aug 19 00:16:39.243362 containerd[1864]: time="2025-08-19T00:16:39.243134630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" id:\"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" pid:6203 exit_status:1 exited_at:{seconds:1755562599 nanos:242870046}" Aug 19 00:16:39.262312 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1-rootfs.mount: Deactivated successfully. 
Aug 19 00:16:39.301440 systemd-networkd[1695]: cali7c39425efc2: Gained IPv6LL Aug 19 00:16:40.966932 containerd[1864]: time="2025-08-19T00:16:40.966878135Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:43.591188 containerd[1864]: time="2025-08-19T00:16:43.591116764Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"3e84e69da1ec52c917e801d36d3e2906209e8ef0e5cb4cb4ac4409bf808a30fc\" pid:6309 exited_at:{seconds:1755562603 nanos:590747777}" Aug 19 00:16:47.572526 containerd[1864]: time="2025-08-19T00:16:47.572472311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Aug 19 00:16:47.663013 containerd[1864]: time="2025-08-19T00:16:47.662874792Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:47.715486 containerd[1864]: time="2025-08-19T00:16:47.715450171Z" level=info msg="StopContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" returns successfully" Aug 19 00:16:47.717279 containerd[1864]: time="2025-08-19T00:16:47.715956171Z" level=info msg="StopPodSandbox for \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\"" Aug 19 00:16:47.723941 containerd[1864]: time="2025-08-19T00:16:47.723894623Z" level=info msg="Container to stop \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 00:16:47.732850 systemd[1]: cri-containerd-0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea.scope: Deactivated successfully. 
Aug 19 00:16:47.734510 containerd[1864]: time="2025-08-19T00:16:47.733742940Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" id:\"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" pid:5815 exit_status:137 exited_at:{seconds:1755562607 nanos:733033919}" Aug 19 00:16:47.765220 containerd[1864]: time="2025-08-19T00:16:47.765152050Z" level=info msg="shim disconnected" id=0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea namespace=k8s.io Aug 19 00:16:47.765220 containerd[1864]: time="2025-08-19T00:16:47.765173778Z" level=warning msg="cleaning up after shim disconnected" id=0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea namespace=k8s.io Aug 19 00:16:47.765220 containerd[1864]: time="2025-08-19T00:16:47.765197139Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 00:16:47.768197 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea-rootfs.mount: Deactivated successfully. 
Aug 19 00:16:47.807871 containerd[1864]: time="2025-08-19T00:16:47.807840184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:16:47.808831 containerd[1864]: time="2025-08-19T00:16:47.808435921Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 11.882968844s" Aug 19 00:16:47.809176 containerd[1864]: time="2025-08-19T00:16:47.809160079Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Aug 19 00:16:47.821554 containerd[1864]: time="2025-08-19T00:16:47.821469421Z" level=info msg="connecting to shim 4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223" address="unix:///run/containerd/s/3346386e7de85007f774addce8ef788168bbef0cac0fc049e4d25e75aae57318" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:16:47.838512 systemd[1]: Started cri-containerd-4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223.scope - libcontainer container 4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223. 
Aug 19 00:16:48.358486 containerd[1864]: time="2025-08-19T00:16:48.358225746Z" level=info msg="CreateContainer within sandbox \"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 00:16:48.416936 containerd[1864]: time="2025-08-19T00:16:48.416899523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d78cb7dc4-fwb9m,Uid:b181315a-7951-4f89-aca0-84d4e3911a4c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223\"" Aug 19 00:16:48.759874 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea-shm.mount: Deactivated successfully. Aug 19 00:16:48.760759 containerd[1864]: time="2025-08-19T00:16:48.760713758Z" level=info msg="CreateContainer within sandbox \"4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:16:48.770146 containerd[1864]: time="2025-08-19T00:16:48.769781643Z" level=info msg="received exit event sandbox_id:\"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" exit_status:137 exited_at:{seconds:1755562607 nanos:733033919}" Aug 19 00:16:48.800263 systemd-networkd[1695]: cali43af0d6c25f: Link DOWN Aug 19 00:16:48.800736 systemd-networkd[1695]: cali43af0d6c25f: Lost carrier Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.796 [INFO][6417] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.798 [INFO][6417] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" iface="eth0" netns="/var/run/netns/cni-2f0f1f89-4411-7104-9b14-bc5269c17b1c" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.799 [INFO][6417] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" iface="eth0" netns="/var/run/netns/cni-2f0f1f89-4411-7104-9b14-bc5269c17b1c" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.808 [INFO][6417] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" after=9.47565ms iface="eth0" netns="/var/run/netns/cni-2f0f1f89-4411-7104-9b14-bc5269c17b1c" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.808 [INFO][6417] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.808 [INFO][6417] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.826 [INFO][6430] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.826 [INFO][6430] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.826 [INFO][6430] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.859 [INFO][6430] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.859 [INFO][6430] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.860 [INFO][6430] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:48.863198 containerd[1864]: 2025-08-19 00:16:48.861 [INFO][6417] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:48.865102 containerd[1864]: time="2025-08-19T00:16:48.864983195Z" level=info msg="TearDown network for sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" successfully" Aug 19 00:16:48.865102 containerd[1864]: time="2025-08-19T00:16:48.865020612Z" level=info msg="StopPodSandbox for \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" returns successfully" Aug 19 00:16:48.865927 systemd[1]: run-netns-cni\x2d2f0f1f89\x2d4411\x2d7104\x2d9b14\x2dbc5269c17b1c.mount: Deactivated successfully. 
Aug 19 00:16:48.920800 containerd[1864]: time="2025-08-19T00:16:48.919362284Z" level=info msg="Container c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:48.920024 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3153666266.mount: Deactivated successfully. Aug 19 00:16:49.022765 kubelet[3451]: I0819 00:16:49.021609 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-calico-apiserver-certs\") pod \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\" (UID: \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\") " Aug 19 00:16:49.022765 kubelet[3451]: I0819 00:16:49.021648 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqx59\" (UniqueName: \"kubernetes.io/projected/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-kube-api-access-bqx59\") pod \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\" (UID: \"48f4c9e3-c928-4ddd-bc8b-c6722f68bb65\") " Aug 19 00:16:49.025133 systemd[1]: var-lib-kubelet-pods-48f4c9e3\x2dc928\x2d4ddd\x2dbc8b\x2dc6722f68bb65-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbqx59.mount: Deactivated successfully. Aug 19 00:16:49.028660 systemd[1]: var-lib-kubelet-pods-48f4c9e3\x2dc928\x2d4ddd\x2dbc8b\x2dc6722f68bb65-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 19 00:16:49.029320 kubelet[3451]: I0819 00:16:49.028906 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-kube-api-access-bqx59" (OuterVolumeSpecName: "kube-api-access-bqx59") pod "48f4c9e3-c928-4ddd-bc8b-c6722f68bb65" (UID: "48f4c9e3-c928-4ddd-bc8b-c6722f68bb65"). InnerVolumeSpecName "kube-api-access-bqx59". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 00:16:49.029320 kubelet[3451]: I0819 00:16:49.028886 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "48f4c9e3-c928-4ddd-bc8b-c6722f68bb65" (UID: "48f4c9e3-c928-4ddd-bc8b-c6722f68bb65"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 00:16:49.121856 containerd[1864]: time="2025-08-19T00:16:49.121322404Z" level=info msg="Container 11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:16:49.122204 kubelet[3451]: I0819 00:16:49.122164 3451 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-calico-apiserver-certs\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:16:49.122204 kubelet[3451]: I0819 00:16:49.122188 3451 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqx59\" (UniqueName: \"kubernetes.io/projected/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65-kube-api-access-bqx59\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:16:49.130933 containerd[1864]: time="2025-08-19T00:16:49.130904113Z" level=info msg="CreateContainer within sandbox \"43486921da9877bc4ed24b3888418758c304b3b959579789a987153f7e1da471\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb\"" Aug 19 00:16:49.132092 containerd[1864]: time="2025-08-19T00:16:49.131427568Z" level=info msg="StartContainer for \"c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb\"" Aug 19 00:16:49.133269 containerd[1864]: time="2025-08-19T00:16:49.133209357Z" level=info msg="connecting to shim 
c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb" address="unix:///run/containerd/s/7df30dc771e96ae4cb6d59bbc1ea3382b4e422219759ac8b8551840de16d9f6e" protocol=ttrpc version=3 Aug 19 00:16:49.147129 containerd[1864]: time="2025-08-19T00:16:49.147079378Z" level=info msg="CreateContainer within sandbox \"4dd90889bca341c883ff278e3bc04762f11086879fd66fc58f20da8876cf0223\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36\"" Aug 19 00:16:49.147898 containerd[1864]: time="2025-08-19T00:16:49.147869185Z" level=info msg="StartContainer for \"11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36\"" Aug 19 00:16:49.149091 containerd[1864]: time="2025-08-19T00:16:49.149065525Z" level=info msg="connecting to shim 11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36" address="unix:///run/containerd/s/3346386e7de85007f774addce8ef788168bbef0cac0fc049e4d25e75aae57318" protocol=ttrpc version=3 Aug 19 00:16:49.150432 systemd[1]: Started cri-containerd-c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb.scope - libcontainer container c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb. Aug 19 00:16:49.172565 systemd[1]: Started cri-containerd-11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36.scope - libcontainer container 11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36. 
Aug 19 00:16:49.210407 containerd[1864]: time="2025-08-19T00:16:49.210379421Z" level=info msg="StartContainer for \"c83152e239906fa136232566341523508181e7d090e148a69cc96ea68729b0fb\" returns successfully" Aug 19 00:16:49.243938 kubelet[3451]: I0819 00:16:49.243912 3451 scope.go:117] "RemoveContainer" containerID="3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1" Aug 19 00:16:49.250818 systemd[1]: Removed slice kubepods-besteffort-pod48f4c9e3_c928_4ddd_bc8b_c6722f68bb65.slice - libcontainer container kubepods-besteffort-pod48f4c9e3_c928_4ddd_bc8b_c6722f68bb65.slice. Aug 19 00:16:49.253442 containerd[1864]: time="2025-08-19T00:16:49.253414309Z" level=info msg="RemoveContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\"" Aug 19 00:16:49.263316 containerd[1864]: time="2025-08-19T00:16:49.263284506Z" level=info msg="RemoveContainer for \"3ce1efce33dc4845a612dbba06ce1dfbfc31ce45f88d93d4a3b09245e7931bb1\" returns successfully" Aug 19 00:16:49.282275 kubelet[3451]: I0819 00:16:49.281517 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-c697k" podStartSLOduration=66.661263685 podStartE2EDuration="1m37.281504872s" podCreationTimestamp="2025-08-19 00:15:12 +0000 UTC" firstStartedPulling="2025-08-19 00:16:17.190441337 +0000 UTC m=+85.402153682" lastFinishedPulling="2025-08-19 00:16:47.810682524 +0000 UTC m=+116.022394869" observedRunningTime="2025-08-19 00:16:49.272564286 +0000 UTC m=+117.484276639" watchObservedRunningTime="2025-08-19 00:16:49.281504872 +0000 UTC m=+117.493217217" Aug 19 00:16:49.327812 containerd[1864]: time="2025-08-19T00:16:49.327771320Z" level=info msg="StartContainer for \"11ddd518c123daf11c3aaa95aa633efc91ffe5a65207fdae7a6e59f183779c36\" returns successfully" Aug 19 00:16:49.907905 kubelet[3451]: I0819 00:16:49.907868 3451 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f4c9e3-c928-4ddd-bc8b-c6722f68bb65" 
path="/var/lib/kubelet/pods/48f4c9e3-c928-4ddd-bc8b-c6722f68bb65/volumes" Aug 19 00:16:50.028827 kubelet[3451]: I0819 00:16:50.028791 3451 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 00:16:50.028827 kubelet[3451]: I0819 00:16:50.028829 3451 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 00:16:51.432446 kubelet[3451]: I0819 00:16:51.432195 3451 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d78cb7dc4-fwb9m" podStartSLOduration=15.432179978 podStartE2EDuration="15.432179978s" podCreationTimestamp="2025-08-19 00:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:16:50.28027686 +0000 UTC m=+118.491989253" watchObservedRunningTime="2025-08-19 00:16:51.432179978 +0000 UTC m=+119.643892323" Aug 19 00:16:51.476964 containerd[1864]: time="2025-08-19T00:16:51.476487144Z" level=info msg="StopContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" with timeout 30 (s)" Aug 19 00:16:51.477736 containerd[1864]: time="2025-08-19T00:16:51.477584073Z" level=info msg="Stop container \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" with signal terminated" Aug 19 00:16:51.503344 systemd[1]: cri-containerd-4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4.scope: Deactivated successfully. 
Aug 19 00:16:51.504889 containerd[1864]: time="2025-08-19T00:16:51.504730184Z" level=info msg="received exit event container_id:\"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" id:\"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" pid:6073 exit_status:1 exited_at:{seconds:1755562611 nanos:504159495}" Aug 19 00:16:51.505537 containerd[1864]: time="2025-08-19T00:16:51.505259984Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" id:\"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" pid:6073 exit_status:1 exited_at:{seconds:1755562611 nanos:504159495}" Aug 19 00:16:51.539831 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4-rootfs.mount: Deactivated successfully. Aug 19 00:16:51.931693 containerd[1864]: time="2025-08-19T00:16:51.931618193Z" level=info msg="StopPodSandbox for \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\"" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.956 [WARNING][6547] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.956 [INFO][6547] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.956 [INFO][6547] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" iface="eth0" netns="" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.957 [INFO][6547] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.957 [INFO][6547] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.972 [INFO][6554] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.972 [INFO][6554] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.972 [INFO][6554] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.979 [WARNING][6554] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.979 [INFO][6554] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.981 [INFO][6554] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:51.983 [INFO][6547] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:51.984660355Z" level=info msg="TearDown network for sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" successfully" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:51.984738541Z" level=info msg="StopPodSandbox for \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" returns successfully" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:51.985277485Z" level=info msg="RemovePodSandbox for \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\"" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:51.985303590Z" level=info msg="Forcibly stopping sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\"" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.010 [WARNING][6569] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.010 [INFO][6569] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.010 [INFO][6569] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" iface="eth0" netns="" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.010 [INFO][6569] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.010 [INFO][6569] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.023 [INFO][6576] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.023 [INFO][6576] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.023 [INFO][6576] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.027 [WARNING][6576] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.027 [INFO][6576] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" HandleID="k8s-pod-network.0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--l566d-eth0" Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.028 [INFO][6576] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:52.720693 containerd[1864]: 2025-08-19 00:16:52.029 [INFO][6569] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:52.030916163Z" level=info msg="TearDown network for sandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" successfully" Aug 19 00:16:52.720693 containerd[1864]: time="2025-08-19T00:16:52.032123103Z" level=info msg="Ensure that sandbox 0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea in task-service has been cleanup successfully" Aug 19 00:16:53.617377 containerd[1864]: time="2025-08-19T00:16:53.617264577Z" level=info msg="RemovePodSandbox \"0ac238a7d9a3a45897170256fa81aa9e5cef9501b5e86d43e63f7e63178769ea\" returns successfully" Aug 19 00:16:54.114252 containerd[1864]: time="2025-08-19T00:16:54.113737905Z" level=info msg="StopContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" returns successfully" Aug 19 00:16:54.115461 containerd[1864]: time="2025-08-19T00:16:54.114811713Z" level=info msg="StopPodSandbox for 
\"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\"" Aug 19 00:16:54.115461 containerd[1864]: time="2025-08-19T00:16:54.114863506Z" level=info msg="Container to stop \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 00:16:54.132418 systemd[1]: cri-containerd-8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b.scope: Deactivated successfully. Aug 19 00:16:54.135623 containerd[1864]: time="2025-08-19T00:16:54.135454592Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" id:\"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" pid:5686 exit_status:137 exited_at:{seconds:1755562614 nanos:134601199}" Aug 19 00:16:54.173456 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b-rootfs.mount: Deactivated successfully. 
Aug 19 00:16:54.176513 containerd[1864]: time="2025-08-19T00:16:54.176343875Z" level=info msg="shim disconnected" id=8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b namespace=k8s.io Aug 19 00:16:54.176513 containerd[1864]: time="2025-08-19T00:16:54.176371612Z" level=warning msg="cleaning up after shim disconnected" id=8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b namespace=k8s.io Aug 19 00:16:54.176513 containerd[1864]: time="2025-08-19T00:16:54.176395388Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 00:16:54.757721 containerd[1864]: time="2025-08-19T00:16:54.757525367Z" level=info msg="received exit event sandbox_id:\"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" exit_status:137 exited_at:{seconds:1755562614 nanos:134601199}" Aug 19 00:16:54.759890 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b-shm.mount: Deactivated successfully. Aug 19 00:16:54.800973 systemd-networkd[1695]: cali1145d25bd81: Link DOWN Aug 19 00:16:54.800977 systemd-networkd[1695]: cali1145d25bd81: Lost carrier Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.798 [INFO][6630] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.798 [INFO][6630] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" iface="eth0" netns="/var/run/netns/cni-8a3523c7-19e0-d7e2-b35a-b25f76bf88df" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.799 [INFO][6630] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" iface="eth0" netns="/var/run/netns/cni-8a3523c7-19e0-d7e2-b35a-b25f76bf88df" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.806 [INFO][6630] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" after=7.590074ms iface="eth0" netns="/var/run/netns/cni-8a3523c7-19e0-d7e2-b35a-b25f76bf88df" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.806 [INFO][6630] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.806 [INFO][6630] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.826 [INFO][6642] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.826 [INFO][6642] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.826 [INFO][6642] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.860 [INFO][6642] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.860 [INFO][6642] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.862 [INFO][6642] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:16:54.864309 containerd[1864]: 2025-08-19 00:16:54.863 [INFO][6630] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:16:54.866532 containerd[1864]: time="2025-08-19T00:16:54.864669209Z" level=info msg="TearDown network for sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" successfully" Aug 19 00:16:54.866532 containerd[1864]: time="2025-08-19T00:16:54.864692474Z" level=info msg="StopPodSandbox for \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" returns successfully" Aug 19 00:16:54.866152 systemd[1]: run-netns-cni\x2d8a3523c7\x2d19e0\x2dd7e2\x2db35a\x2db25f76bf88df.mount: Deactivated successfully. 
Aug 19 00:16:54.962789 kubelet[3451]: I0819 00:16:54.962590 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzv99\" (UniqueName: \"kubernetes.io/projected/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-kube-api-access-qzv99\") pod \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\" (UID: \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\") " Aug 19 00:16:54.962789 kubelet[3451]: I0819 00:16:54.962627 3451 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-calico-apiserver-certs\") pod \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\" (UID: \"06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8\") " Aug 19 00:16:54.966196 systemd[1]: var-lib-kubelet-pods-06fcbf06\x2db9a6\x2d4a75\x2d8c26\x2d7cb8c6c491d8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqzv99.mount: Deactivated successfully. Aug 19 00:16:54.966700 kubelet[3451]: I0819 00:16:54.966403 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8" (UID: "06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 00:16:54.966830 kubelet[3451]: I0819 00:16:54.966407 3451 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-kube-api-access-qzv99" (OuterVolumeSpecName: "kube-api-access-qzv99") pod "06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8" (UID: "06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8"). InnerVolumeSpecName "kube-api-access-qzv99". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 00:16:54.968903 systemd[1]: var-lib-kubelet-pods-06fcbf06\x2db9a6\x2d4a75\x2d8c26\x2d7cb8c6c491d8-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 19 00:16:55.062884 kubelet[3451]: I0819 00:16:55.062792 3451 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzv99\" (UniqueName: \"kubernetes.io/projected/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-kube-api-access-qzv99\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:16:55.062884 kubelet[3451]: I0819 00:16:55.062819 3451 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8-calico-apiserver-certs\") on node \"ci-4426.0.0-a-440c7464d3\" DevicePath \"\"" Aug 19 00:16:55.282852 kubelet[3451]: I0819 00:16:55.282824 3451 scope.go:117] "RemoveContainer" containerID="4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4" Aug 19 00:16:55.284905 containerd[1864]: time="2025-08-19T00:16:55.284861503Z" level=info msg="RemoveContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\"" Aug 19 00:16:55.288181 systemd[1]: Removed slice kubepods-besteffort-pod06fcbf06_b9a6_4a75_8c26_7cb8c6c491d8.slice - libcontainer container kubepods-besteffort-pod06fcbf06_b9a6_4a75_8c26_7cb8c6c491d8.slice. 
Aug 19 00:16:55.313274 containerd[1864]: time="2025-08-19T00:16:55.313070176Z" level=info msg="RemoveContainer for \"4b7796647368f407d4ee6a7d94f2f955b1b8da0e112028a265aaa4540a953ff4\" returns successfully" Aug 19 00:16:55.906821 kubelet[3451]: I0819 00:16:55.906773 3451 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8" path="/var/lib/kubelet/pods/06fcbf06-b9a6-4a75-8c26-7cb8c6c491d8/volumes" Aug 19 00:16:57.181956 containerd[1864]: time="2025-08-19T00:16:57.181914677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"bd14fda6b31fe84a0e14e42946918890b68b378a9b0f58936e1911190eafb076\" pid:6670 exited_at:{seconds:1755562617 nanos:181679366}" Aug 19 00:16:58.224471 containerd[1864]: time="2025-08-19T00:16:58.224430314Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"4c02ca1e59b7af682c38ba1c17a8186c5e7830603b65168e5b93c45f03c6e9ee\" pid:6694 exited_at:{seconds:1755562618 nanos:224041246}" Aug 19 00:17:05.112744 containerd[1864]: time="2025-08-19T00:17:05.112710987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"c6755d7a39051e3ca7c74ba79b29e8907557073004b4ba6c8e7ac155476041a5\" pid:6726 exited_at:{seconds:1755562625 nanos:112434955}" Aug 19 00:17:06.188200 containerd[1864]: time="2025-08-19T00:17:06.188148858Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"10e9ed3a0b08f9a473f860441033e838a8b3b02853065c241916cd9f913e40d1\" pid:6747 exited_at:{seconds:1755562626 nanos:187675244}" Aug 19 00:17:13.107813 systemd[1]: Started sshd@7-10.200.20.41:22-10.200.16.10:40848.service - OpenSSH per-connection server daemon (10.200.16.10:40848). 
Aug 19 00:17:13.604016 sshd[6760]: Accepted publickey for core from 10.200.16.10 port 40848 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:13.606581 sshd-session[6760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:13.613077 systemd-logind[1846]: New session 10 of user core. Aug 19 00:17:13.619545 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 19 00:17:14.018784 sshd[6763]: Connection closed by 10.200.16.10 port 40848 Aug 19 00:17:14.018328 sshd-session[6760]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:14.021361 systemd[1]: sshd@7-10.200.20.41:22-10.200.16.10:40848.service: Deactivated successfully. Aug 19 00:17:14.023336 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 00:17:14.023956 systemd-logind[1846]: Session 10 logged out. Waiting for processes to exit. Aug 19 00:17:14.025039 systemd-logind[1846]: Removed session 10. Aug 19 00:17:19.129409 systemd[1]: Started sshd@8-10.200.20.41:22-10.200.16.10:40856.service - OpenSSH per-connection server daemon (10.200.16.10:40856). Aug 19 00:17:19.623904 sshd[6784]: Accepted publickey for core from 10.200.16.10 port 40856 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:19.625401 sshd-session[6784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:19.629689 systemd-logind[1846]: New session 11 of user core. Aug 19 00:17:19.636343 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 19 00:17:20.035527 sshd[6789]: Connection closed by 10.200.16.10 port 40856 Aug 19 00:17:20.036213 sshd-session[6784]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:20.040494 systemd-logind[1846]: Session 11 logged out. Waiting for processes to exit. Aug 19 00:17:20.040713 systemd[1]: sshd@8-10.200.20.41:22-10.200.16.10:40856.service: Deactivated successfully. 
Aug 19 00:17:20.042518 systemd[1]: session-11.scope: Deactivated successfully. Aug 19 00:17:20.045429 systemd-logind[1846]: Removed session 11. Aug 19 00:17:25.133974 systemd[1]: Started sshd@9-10.200.20.41:22-10.200.16.10:48176.service - OpenSSH per-connection server daemon (10.200.16.10:48176). Aug 19 00:17:25.630365 sshd[6801]: Accepted publickey for core from 10.200.16.10 port 48176 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:25.631436 sshd-session[6801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:25.635561 systemd-logind[1846]: New session 12 of user core. Aug 19 00:17:25.639348 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 00:17:26.024869 sshd[6804]: Connection closed by 10.200.16.10 port 48176 Aug 19 00:17:26.025347 sshd-session[6801]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:26.028496 systemd[1]: sshd@9-10.200.20.41:22-10.200.16.10:48176.service: Deactivated successfully. Aug 19 00:17:26.031037 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 00:17:26.032222 systemd-logind[1846]: Session 12 logged out. Waiting for processes to exit. Aug 19 00:17:26.033902 systemd-logind[1846]: Removed session 12. Aug 19 00:17:26.115422 systemd[1]: Started sshd@10-10.200.20.41:22-10.200.16.10:48184.service - OpenSSH per-connection server daemon (10.200.16.10:48184). Aug 19 00:17:26.568756 sshd[6816]: Accepted publickey for core from 10.200.16.10 port 48184 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:26.569855 sshd-session[6816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:26.573954 systemd-logind[1846]: New session 13 of user core. Aug 19 00:17:26.577335 systemd[1]: Started session-13.scope - Session 13 of User core. 
Aug 19 00:17:26.968870 sshd[6819]: Connection closed by 10.200.16.10 port 48184 Aug 19 00:17:26.969359 sshd-session[6816]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:26.972641 systemd[1]: sshd@10-10.200.20.41:22-10.200.16.10:48184.service: Deactivated successfully. Aug 19 00:17:26.974801 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 00:17:26.975592 systemd-logind[1846]: Session 13 logged out. Waiting for processes to exit. Aug 19 00:17:26.977118 systemd-logind[1846]: Removed session 13. Aug 19 00:17:27.052930 systemd[1]: Started sshd@11-10.200.20.41:22-10.200.16.10:48192.service - OpenSSH per-connection server daemon (10.200.16.10:48192). Aug 19 00:17:27.133401 containerd[1864]: time="2025-08-19T00:17:27.133341460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"a7ee7c0e193197fb96dc0826f844c0f2b9cffaecb6200ca6cee0472a815c1532\" pid:6843 exited_at:{seconds:1755562647 nanos:133060980}" Aug 19 00:17:27.529344 sshd[6828]: Accepted publickey for core from 10.200.16.10 port 48192 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:27.530476 sshd-session[6828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:27.534402 systemd-logind[1846]: New session 14 of user core. Aug 19 00:17:27.538255 systemd[1]: Started session-14.scope - Session 14 of User core. Aug 19 00:17:27.920039 sshd[6854]: Connection closed by 10.200.16.10 port 48192 Aug 19 00:17:27.920524 sshd-session[6828]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:27.923610 systemd[1]: sshd@11-10.200.20.41:22-10.200.16.10:48192.service: Deactivated successfully. Aug 19 00:17:27.925186 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 00:17:27.925933 systemd-logind[1846]: Session 14 logged out. Waiting for processes to exit. Aug 19 00:17:27.926978 systemd-logind[1846]: Removed session 14. 
Aug 19 00:17:28.227259 containerd[1864]: time="2025-08-19T00:17:28.227115554Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"a94894c11ed8d98b1907b758901b50dd5d34f0e6593027dc4d901f03355408eb\" pid:6887 exited_at:{seconds:1755562648 nanos:226780224}" Aug 19 00:17:33.012425 systemd[1]: Started sshd@12-10.200.20.41:22-10.200.16.10:35844.service - OpenSSH per-connection server daemon (10.200.16.10:35844). Aug 19 00:17:33.489112 sshd[6902]: Accepted publickey for core from 10.200.16.10 port 35844 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:33.490516 sshd-session[6902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:33.497502 systemd-logind[1846]: New session 15 of user core. Aug 19 00:17:33.502424 systemd[1]: Started session-15.scope - Session 15 of User core. Aug 19 00:17:33.869102 sshd[6905]: Connection closed by 10.200.16.10 port 35844 Aug 19 00:17:33.869430 sshd-session[6902]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:33.873222 systemd-logind[1846]: Session 15 logged out. Waiting for processes to exit. Aug 19 00:17:33.873507 systemd[1]: sshd@12-10.200.20.41:22-10.200.16.10:35844.service: Deactivated successfully. Aug 19 00:17:33.876037 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 00:17:33.877439 systemd-logind[1846]: Removed session 15. Aug 19 00:17:35.111650 containerd[1864]: time="2025-08-19T00:17:35.111602153Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"dc19c128b5567e771575f12e93490effeb2e9ca6c6b0ea49fae0281db3b9e066\" pid:6930 exited_at:{seconds:1755562655 nanos:111421435}" Aug 19 00:17:38.955849 systemd[1]: Started sshd@13-10.200.20.41:22-10.200.16.10:35852.service - OpenSSH per-connection server daemon (10.200.16.10:35852). 
Aug 19 00:17:39.433000 sshd[6954]: Accepted publickey for core from 10.200.16.10 port 35852 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:39.434065 sshd-session[6954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:39.437991 systemd-logind[1846]: New session 16 of user core. Aug 19 00:17:39.444412 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 19 00:17:39.825256 sshd[6957]: Connection closed by 10.200.16.10 port 35852 Aug 19 00:17:39.825844 sshd-session[6954]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:39.829056 systemd[1]: sshd@13-10.200.20.41:22-10.200.16.10:35852.service: Deactivated successfully. Aug 19 00:17:39.830661 systemd[1]: session-16.scope: Deactivated successfully. Aug 19 00:17:39.833193 systemd-logind[1846]: Session 16 logged out. Waiting for processes to exit. Aug 19 00:17:39.834247 systemd-logind[1846]: Removed session 16. Aug 19 00:17:42.291868 containerd[1864]: time="2025-08-19T00:17:42.291828712Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"a3b916a400ff3a4d811ed38c4d5d44eac941b79d9061243224a81ac86cdbc91d\" pid:6981 exited_at:{seconds:1755562662 nanos:291497446}" Aug 19 00:17:44.912472 systemd[1]: Started sshd@14-10.200.20.41:22-10.200.16.10:41708.service - OpenSSH per-connection server daemon (10.200.16.10:41708). Aug 19 00:17:45.392642 sshd[6992]: Accepted publickey for core from 10.200.16.10 port 41708 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:45.393681 sshd-session[6992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:45.397394 systemd-logind[1846]: New session 17 of user core. Aug 19 00:17:45.409356 systemd[1]: Started session-17.scope - Session 17 of User core. 
Aug 19 00:17:45.779089 sshd[6995]: Connection closed by 10.200.16.10 port 41708 Aug 19 00:17:45.779649 sshd-session[6992]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:45.783217 systemd-logind[1846]: Session 17 logged out. Waiting for processes to exit. Aug 19 00:17:45.783393 systemd[1]: sshd@14-10.200.20.41:22-10.200.16.10:41708.service: Deactivated successfully. Aug 19 00:17:45.784879 systemd[1]: session-17.scope: Deactivated successfully. Aug 19 00:17:45.786472 systemd-logind[1846]: Removed session 17. Aug 19 00:17:45.861833 systemd[1]: Started sshd@15-10.200.20.41:22-10.200.16.10:41724.service - OpenSSH per-connection server daemon (10.200.16.10:41724). Aug 19 00:17:46.299917 sshd[7007]: Accepted publickey for core from 10.200.16.10 port 41724 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:46.300923 sshd-session[7007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:46.304954 systemd-logind[1846]: New session 18 of user core. Aug 19 00:17:46.310334 systemd[1]: Started session-18.scope - Session 18 of User core. Aug 19 00:17:46.809684 sshd[7010]: Connection closed by 10.200.16.10 port 41724 Aug 19 00:17:46.810547 sshd-session[7007]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:46.813576 systemd[1]: sshd@15-10.200.20.41:22-10.200.16.10:41724.service: Deactivated successfully. Aug 19 00:17:46.815701 systemd[1]: session-18.scope: Deactivated successfully. Aug 19 00:17:46.816493 systemd-logind[1846]: Session 18 logged out. Waiting for processes to exit. Aug 19 00:17:46.817541 systemd-logind[1846]: Removed session 18. Aug 19 00:17:46.899749 systemd[1]: Started sshd@16-10.200.20.41:22-10.200.16.10:41736.service - OpenSSH per-connection server daemon (10.200.16.10:41736). 
Aug 19 00:17:47.393072 sshd[7020]: Accepted publickey for core from 10.200.16.10 port 41736 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:47.394108 sshd-session[7020]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:47.397997 systemd-logind[1846]: New session 19 of user core. Aug 19 00:17:47.402365 systemd[1]: Started session-19.scope - Session 19 of User core. Aug 19 00:17:48.342103 sshd[7023]: Connection closed by 10.200.16.10 port 41736 Aug 19 00:17:48.342427 sshd-session[7020]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:48.345965 systemd[1]: sshd@16-10.200.20.41:22-10.200.16.10:41736.service: Deactivated successfully. Aug 19 00:17:48.348051 systemd[1]: session-19.scope: Deactivated successfully. Aug 19 00:17:48.349110 systemd-logind[1846]: Session 19 logged out. Waiting for processes to exit. Aug 19 00:17:48.351097 systemd-logind[1846]: Removed session 19. Aug 19 00:17:48.442039 systemd[1]: Started sshd@17-10.200.20.41:22-10.200.16.10:41744.service - OpenSSH per-connection server daemon (10.200.16.10:41744). Aug 19 00:17:48.894756 sshd[7040]: Accepted publickey for core from 10.200.16.10 port 41744 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:48.895872 sshd-session[7040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:48.900156 systemd-logind[1846]: New session 20 of user core. Aug 19 00:17:48.904566 systemd[1]: Started session-20.scope - Session 20 of User core. Aug 19 00:17:49.370250 sshd[7043]: Connection closed by 10.200.16.10 port 41744 Aug 19 00:17:49.369821 sshd-session[7040]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:49.374721 systemd[1]: sshd@17-10.200.20.41:22-10.200.16.10:41744.service: Deactivated successfully. Aug 19 00:17:49.379634 systemd[1]: session-20.scope: Deactivated successfully. Aug 19 00:17:49.381206 systemd-logind[1846]: Session 20 logged out. Waiting for processes to exit.
Aug 19 00:17:49.384161 systemd-logind[1846]: Removed session 20. Aug 19 00:17:49.456333 systemd[1]: Started sshd@18-10.200.20.41:22-10.200.16.10:41752.service - OpenSSH per-connection server daemon (10.200.16.10:41752). Aug 19 00:17:49.953261 sshd[7053]: Accepted publickey for core from 10.200.16.10 port 41752 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:49.954108 sshd-session[7053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:49.958002 systemd-logind[1846]: New session 21 of user core. Aug 19 00:17:49.966348 systemd[1]: Started session-21.scope - Session 21 of User core. Aug 19 00:17:50.340697 sshd[7056]: Connection closed by 10.200.16.10 port 41752 Aug 19 00:17:50.340625 sshd-session[7053]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:50.342973 systemd[1]: sshd@18-10.200.20.41:22-10.200.16.10:41752.service: Deactivated successfully. Aug 19 00:17:50.344603 systemd[1]: session-21.scope: Deactivated successfully. Aug 19 00:17:50.345843 systemd-logind[1846]: Session 21 logged out. Waiting for processes to exit. Aug 19 00:17:50.347782 systemd-logind[1846]: Removed session 21.
Aug 19 00:17:53.620484 containerd[1864]: time="2025-08-19T00:17:53.620386262Z" level=info msg="StopPodSandbox for \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\"" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.643 [WARNING][7079] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.643 [INFO][7079] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.643 [INFO][7079] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" iface="eth0" netns="" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.643 [INFO][7079] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.643 [INFO][7079] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.655 [INFO][7086] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.655 [INFO][7086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.655 [INFO][7086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.659 [WARNING][7086] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.659 [INFO][7086] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.660 [INFO][7086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:17:53.662489 containerd[1864]: 2025-08-19 00:17:53.661 [INFO][7079] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b"
Aug 19 00:17:53.662819 containerd[1864]: time="2025-08-19T00:17:53.662526114Z" level=info msg="TearDown network for sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" successfully" Aug 19 00:17:53.662819 containerd[1864]: time="2025-08-19T00:17:53.662547882Z" level=info msg="StopPodSandbox for \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" returns successfully" Aug 19 00:17:53.663269 containerd[1864]: time="2025-08-19T00:17:53.663246959Z" level=info msg="RemovePodSandbox for \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\"" Aug 19 00:17:53.663313 containerd[1864]: time="2025-08-19T00:17:53.663277120Z" level=info msg="Forcibly stopping sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\"" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.685 [WARNING][7100] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" WorkloadEndpoint="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.685 [INFO][7100] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.685 [INFO][7100] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" iface="eth0" netns=""
Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.685 [INFO][7100] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.685 [INFO][7100] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.697 [INFO][7107] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.697 [INFO][7107] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.697 [INFO][7107] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.701 [WARNING][7107] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0"
Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.701 [INFO][7107] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" HandleID="k8s-pod-network.8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Workload="ci--4426.0.0--a--440c7464d3-k8s-calico--apiserver--6c8858db87--7lbsf-eth0" Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.702 [INFO][7107] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:17:53.704551 containerd[1864]: 2025-08-19 00:17:53.703 [INFO][7100] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b" Aug 19 00:17:53.705002 containerd[1864]: time="2025-08-19T00:17:53.704534930Z" level=info msg="TearDown network for sandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" successfully" Aug 19 00:17:53.706121 containerd[1864]: time="2025-08-19T00:17:53.706058775Z" level=info msg="Ensure that sandbox 8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b in task-service has been cleanup successfully" Aug 19 00:17:53.723795 containerd[1864]: time="2025-08-19T00:17:53.723745492Z" level=info msg="RemovePodSandbox \"8a037fe2b540874b16a322fdeadeec02a14a62f95897f963bede916119eb427b\" returns successfully" Aug 19 00:17:55.428923 systemd[1]: Started sshd@19-10.200.20.41:22-10.200.16.10:60062.service - OpenSSH per-connection server daemon (10.200.16.10:60062).
Aug 19 00:17:55.923443 sshd[7114]: Accepted publickey for core from 10.200.16.10 port 60062 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:17:55.924583 sshd-session[7114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:55.929112 systemd-logind[1846]: New session 22 of user core. Aug 19 00:17:55.934351 systemd[1]: Started session-22.scope - Session 22 of User core. Aug 19 00:17:56.312276 sshd[7117]: Connection closed by 10.200.16.10 port 60062 Aug 19 00:17:56.312739 sshd-session[7114]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:56.315458 systemd[1]: sshd@19-10.200.20.41:22-10.200.16.10:60062.service: Deactivated successfully. Aug 19 00:17:56.317320 systemd[1]: session-22.scope: Deactivated successfully. Aug 19 00:17:56.318360 systemd-logind[1846]: Session 22 logged out. Waiting for processes to exit. Aug 19 00:17:56.319406 systemd-logind[1846]: Removed session 22. Aug 19 00:17:57.124928 containerd[1864]: time="2025-08-19T00:17:57.124876148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fac6f3e5f76eeb5c21a290eb43a00cdeb51c3284e5fb7d56cfd310831e5fa438\" id:\"938bf174bb2ac017fb9d080a1c8ea6e5b0dee83208434d8273434b6c2c770bd1\" pid:7139 exited_at:{seconds:1755562677 nanos:124260954}" Aug 19 00:17:58.219807 containerd[1864]: time="2025-08-19T00:17:58.219768786Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aedaaef0b6e9860ec579aeca34031d580eb51d9155789ed51ee18cc14204b620\" id:\"3c481b7b9bd39b0c7c87f75f5874d0e43bedbebabeab2a9d68c04463e6967ac9\" pid:7162 exited_at:{seconds:1755562678 nanos:219555044}" Aug 19 00:18:01.388668 systemd[1]: Started sshd@20-10.200.20.41:22-10.200.16.10:43802.service - OpenSSH per-connection server daemon (10.200.16.10:43802). 
Aug 19 00:18:01.827480 sshd[7175]: Accepted publickey for core from 10.200.16.10 port 43802 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:18:01.828525 sshd-session[7175]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:18:01.832282 systemd-logind[1846]: New session 23 of user core. Aug 19 00:18:01.835344 systemd[1]: Started session-23.scope - Session 23 of User core. Aug 19 00:18:02.206112 sshd[7178]: Connection closed by 10.200.16.10 port 43802 Aug 19 00:18:02.204951 sshd-session[7175]: pam_unix(sshd:session): session closed for user core Aug 19 00:18:02.210320 systemd[1]: sshd@20-10.200.20.41:22-10.200.16.10:43802.service: Deactivated successfully. Aug 19 00:18:02.210932 systemd-logind[1846]: Session 23 logged out. Waiting for processes to exit. Aug 19 00:18:02.215158 systemd[1]: session-23.scope: Deactivated successfully. Aug 19 00:18:02.217263 systemd-logind[1846]: Removed session 23. Aug 19 00:18:05.123207 containerd[1864]: time="2025-08-19T00:18:05.123159967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"4e09e64fdc64d7e89ebcfef9f4144ffc43ea6ff9730159afbf9d6b9898659d71\" pid:7202 exited_at:{seconds:1755562685 nanos:121105369}" Aug 19 00:18:06.186666 containerd[1864]: time="2025-08-19T00:18:06.186617628Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1dff065ab2a6ab8435771129de6cafcf22c01c692089b46e48b4cd3bd12189a\" id:\"94d09dac6d266374068c4c9cbaee03527b5a8d3d00a73af32d76a510bb206349\" pid:7223 exited_at:{seconds:1755562686 nanos:186327964}" Aug 19 00:18:07.300341 systemd[1]: Started sshd@21-10.200.20.41:22-10.200.16.10:43814.service - OpenSSH per-connection server daemon (10.200.16.10:43814). 
Aug 19 00:18:07.793534 sshd[7234]: Accepted publickey for core from 10.200.16.10 port 43814 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:18:07.794621 sshd-session[7234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:18:07.798307 systemd-logind[1846]: New session 24 of user core. Aug 19 00:18:07.803343 systemd[1]: Started session-24.scope - Session 24 of User core. Aug 19 00:18:08.186675 sshd[7237]: Connection closed by 10.200.16.10 port 43814 Aug 19 00:18:08.187290 sshd-session[7234]: pam_unix(sshd:session): session closed for user core Aug 19 00:18:08.190796 systemd-logind[1846]: Session 24 logged out. Waiting for processes to exit. Aug 19 00:18:08.190936 systemd[1]: sshd@21-10.200.20.41:22-10.200.16.10:43814.service: Deactivated successfully. Aug 19 00:18:08.193516 systemd[1]: session-24.scope: Deactivated successfully. Aug 19 00:18:08.195118 systemd-logind[1846]: Removed session 24. Aug 19 00:18:13.277186 systemd[1]: Started sshd@22-10.200.20.41:22-10.200.16.10:58124.service - OpenSSH per-connection server daemon (10.200.16.10:58124). Aug 19 00:18:13.773906 sshd[7250]: Accepted publickey for core from 10.200.16.10 port 58124 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:18:13.774904 sshd-session[7250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:18:13.780575 systemd-logind[1846]: New session 25 of user core. Aug 19 00:18:13.782419 systemd[1]: Started session-25.scope - Session 25 of User core. Aug 19 00:18:14.166373 sshd[7256]: Connection closed by 10.200.16.10 port 58124 Aug 19 00:18:14.166859 sshd-session[7250]: pam_unix(sshd:session): session closed for user core Aug 19 00:18:14.169722 systemd[1]: sshd@22-10.200.20.41:22-10.200.16.10:58124.service: Deactivated successfully. Aug 19 00:18:14.172602 systemd[1]: session-25.scope: Deactivated successfully. Aug 19 00:18:14.174047 systemd-logind[1846]: Session 25 logged out. Waiting for processes to exit.
Aug 19 00:18:14.175151 systemd-logind[1846]: Removed session 25. Aug 19 00:18:19.250797 systemd[1]: Started sshd@23-10.200.20.41:22-10.200.16.10:58138.service - OpenSSH per-connection server daemon (10.200.16.10:58138). Aug 19 00:18:19.705567 sshd[7272]: Accepted publickey for core from 10.200.16.10 port 58138 ssh2: RSA SHA256:tHXpZ9Uix+1Ly7F4ljpaMBfkvX5B0AqptkjWKRNXLmA Aug 19 00:18:19.706065 sshd-session[7272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:18:19.710224 systemd-logind[1846]: New session 26 of user core. Aug 19 00:18:19.714359 systemd[1]: Started session-26.scope - Session 26 of User core. Aug 19 00:18:20.081728 sshd[7275]: Connection closed by 10.200.16.10 port 58138 Aug 19 00:18:20.082215 sshd-session[7272]: pam_unix(sshd:session): session closed for user core Aug 19 00:18:20.084942 systemd-logind[1846]: Session 26 logged out. Waiting for processes to exit. Aug 19 00:18:20.085787 systemd[1]: sshd@23-10.200.20.41:22-10.200.16.10:58138.service: Deactivated successfully. Aug 19 00:18:20.087366 systemd[1]: session-26.scope: Deactivated successfully. Aug 19 00:18:20.088860 systemd-logind[1846]: Removed session 26.