Mar 12 23:48:29.130821 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Mar 12 23:48:29.130840 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 12 22:07:21 -00 2026
Mar 12 23:48:29.130846 kernel: KASLR enabled
Mar 12 23:48:29.130850 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 12 23:48:29.130854 kernel: printk: legacy bootconsole [pl11] enabled
Mar 12 23:48:29.130859 kernel: efi: EFI v2.7 by EDK II
Mar 12 23:48:29.130864 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89c018 RNG=0x3f979998 MEMRESERVE=0x3db83598
Mar 12 23:48:29.130868 kernel: random: crng init done
Mar 12 23:48:29.130872 kernel: secureboot: Secure boot disabled
Mar 12 23:48:29.130876 kernel: ACPI: Early table checksum verification disabled
Mar 12 23:48:29.130880 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Mar 12 23:48:29.130884 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130888 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130892 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 12 23:48:29.130898 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130902 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130907 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130911 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130915 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130920 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130925 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 12 23:48:29.130929 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 12 23:48:29.130933 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 12 23:48:29.130937 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 12 23:48:29.130941 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 12 23:48:29.130945 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Mar 12 23:48:29.130949 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Mar 12 23:48:29.130954 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 12 23:48:29.130958 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 12 23:48:29.130962 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 12 23:48:29.130967 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 12 23:48:29.130971 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 12 23:48:29.130975 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 12 23:48:29.130979 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 12 23:48:29.130984 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 12 23:48:29.130988 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 12 23:48:29.130992 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Mar 12 23:48:29.130996 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Mar 12 23:48:29.131000 kernel: Zone ranges:
Mar 12 23:48:29.131004 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 12 23:48:29.131012 kernel: DMA32 empty
Mar 12 23:48:29.131016 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 12 23:48:29.131020 kernel: Device empty
Mar 12 23:48:29.131025 kernel: Movable zone start for each node
Mar 12 23:48:29.131029 kernel: Early memory node ranges
Mar 12 23:48:29.131033 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 12 23:48:29.131039 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Mar 12 23:48:29.131043 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Mar 12 23:48:29.131047 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Mar 12 23:48:29.131052 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Mar 12 23:48:29.131056 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Mar 12 23:48:29.131060 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 12 23:48:29.131065 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 12 23:48:29.131069 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 12 23:48:29.131073 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Mar 12 23:48:29.131078 kernel: psci: probing for conduit method from ACPI.
Mar 12 23:48:29.131082 kernel: psci: PSCIv1.3 detected in firmware.
Mar 12 23:48:29.131098 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 12 23:48:29.131103 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 12 23:48:29.131107 kernel: psci: SMC Calling Convention v1.4
Mar 12 23:48:29.131112 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 12 23:48:29.131116 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 12 23:48:29.131121 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 12 23:48:29.131125 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 12 23:48:29.131129 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 12 23:48:29.131134 kernel: Detected PIPT I-cache on CPU0
Mar 12 23:48:29.131138 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Mar 12 23:48:29.131143 kernel: CPU features: detected: GIC system register CPU interface
Mar 12 23:48:29.131147 kernel: CPU features: detected: Spectre-v4
Mar 12 23:48:29.131151 kernel: CPU features: detected: Spectre-BHB
Mar 12 23:48:29.131157 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 12 23:48:29.131161 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 12 23:48:29.131166 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Mar 12 23:48:29.131170 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 12 23:48:29.131174 kernel: alternatives: applying boot alternatives
Mar 12 23:48:29.131180 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:48:29.131184 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 12 23:48:29.131189 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 12 23:48:29.131193 kernel: Fallback order for Node 0: 0
Mar 12 23:48:29.131198 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Mar 12 23:48:29.131203 kernel: Policy zone: Normal
Mar 12 23:48:29.131207 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 23:48:29.131211 kernel: software IO TLB: area num 2.
Mar 12 23:48:29.131216 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Mar 12 23:48:29.131220 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 12 23:48:29.131225 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 23:48:29.131230 kernel: rcu: RCU event tracing is enabled.
Mar 12 23:48:29.131234 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 12 23:48:29.131239 kernel: Trampoline variant of Tasks RCU enabled.
Mar 12 23:48:29.131243 kernel: Tracing variant of Tasks RCU enabled.
Mar 12 23:48:29.131248 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 23:48:29.131252 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 12 23:48:29.131257 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 12 23:48:29.131262 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 12 23:48:29.131266 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 12 23:48:29.131271 kernel: GICv3: 960 SPIs implemented
Mar 12 23:48:29.131275 kernel: GICv3: 0 Extended SPIs implemented
Mar 12 23:48:29.131279 kernel: Root IRQ handler: gic_handle_irq
Mar 12 23:48:29.131284 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 12 23:48:29.131288 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Mar 12 23:48:29.131293 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 12 23:48:29.131297 kernel: ITS: No ITS available, not enabling LPIs
Mar 12 23:48:29.131301 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 23:48:29.131307 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Mar 12 23:48:29.131311 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 12 23:48:29.131316 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Mar 12 23:48:29.131320 kernel: Console: colour dummy device 80x25
Mar 12 23:48:29.131325 kernel: printk: legacy console [tty1] enabled
Mar 12 23:48:29.131330 kernel: ACPI: Core revision 20240827
Mar 12 23:48:29.131335 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Mar 12 23:48:29.131339 kernel: pid_max: default: 32768 minimum: 301
Mar 12 23:48:29.131344 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 12 23:48:29.131348 kernel: landlock: Up and running.
Mar 12 23:48:29.131353 kernel: SELinux: Initializing.
Mar 12 23:48:29.131358 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 12 23:48:29.131363 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 12 23:48:29.131367 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Mar 12 23:48:29.131372 kernel: Hyper-V: Host Build 10.0.26102.1212-1-0
Mar 12 23:48:29.131380 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 12 23:48:29.131386 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 23:48:29.131391 kernel: rcu: Max phase no-delay instances is 400.
Mar 12 23:48:29.131395 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 12 23:48:29.131400 kernel: Remapping and enabling EFI services.
Mar 12 23:48:29.131405 kernel: smp: Bringing up secondary CPUs ...
Mar 12 23:48:29.131409 kernel: Detected PIPT I-cache on CPU1
Mar 12 23:48:29.131415 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 12 23:48:29.131420 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Mar 12 23:48:29.131425 kernel: smp: Brought up 1 node, 2 CPUs
Mar 12 23:48:29.131429 kernel: SMP: Total of 2 processors activated.
Mar 12 23:48:29.131434 kernel: CPU: All CPU(s) started at EL1
Mar 12 23:48:29.131440 kernel: CPU features: detected: 32-bit EL0 Support
Mar 12 23:48:29.131445 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 12 23:48:29.131449 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 12 23:48:29.131454 kernel: CPU features: detected: Common not Private translations
Mar 12 23:48:29.131459 kernel: CPU features: detected: CRC32 instructions
Mar 12 23:48:29.131464 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Mar 12 23:48:29.131468 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 12 23:48:29.131473 kernel: CPU features: detected: LSE atomic instructions
Mar 12 23:48:29.131478 kernel: CPU features: detected: Privileged Access Never
Mar 12 23:48:29.131483 kernel: CPU features: detected: Speculation barrier (SB)
Mar 12 23:48:29.131488 kernel: CPU features: detected: TLB range maintenance instructions
Mar 12 23:48:29.131493 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 12 23:48:29.131498 kernel: CPU features: detected: Scalable Vector Extension
Mar 12 23:48:29.131502 kernel: alternatives: applying system-wide alternatives
Mar 12 23:48:29.131507 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 12 23:48:29.131512 kernel: SVE: maximum available vector length 16 bytes per vector
Mar 12 23:48:29.131517 kernel: SVE: default vector length 16 bytes per vector
Mar 12 23:48:29.131522 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Mar 12 23:48:29.131528 kernel: devtmpfs: initialized
Mar 12 23:48:29.131533 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 23:48:29.131537 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 12 23:48:29.131542 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 12 23:48:29.131547 kernel: 0 pages in range for non-PLT usage
Mar 12 23:48:29.131551 kernel: 508400 pages in range for PLT usage
Mar 12 23:48:29.131556 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 23:48:29.131561 kernel: SMBIOS 3.1.0 present.
Mar 12 23:48:29.131566 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Mar 12 23:48:29.131571 kernel: DMI: Memory slots populated: 2/2
Mar 12 23:48:29.131576 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 23:48:29.131581 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 12 23:48:29.131586 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 12 23:48:29.131591 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 12 23:48:29.131596 kernel: audit: initializing netlink subsys (disabled)
Mar 12 23:48:29.131600 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Mar 12 23:48:29.131605 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 23:48:29.131610 kernel: cpuidle: using governor menu
Mar 12 23:48:29.131615 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 12 23:48:29.131620 kernel: ASID allocator initialised with 32768 entries
Mar 12 23:48:29.131625 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 23:48:29.131629 kernel: Serial: AMBA PL011 UART driver
Mar 12 23:48:29.131634 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 23:48:29.131639 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 23:48:29.131644 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 12 23:48:29.131648 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 12 23:48:29.131654 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 23:48:29.131659 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 23:48:29.131664 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 12 23:48:29.131668 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 12 23:48:29.131673 kernel: ACPI: Added _OSI(Module Device)
Mar 12 23:48:29.131678 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 23:48:29.131682 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 23:48:29.131687 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 23:48:29.131692 kernel: ACPI: Interpreter enabled
Mar 12 23:48:29.131697 kernel: ACPI: Using GIC for interrupt routing
Mar 12 23:48:29.131702 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 12 23:48:29.131707 kernel: printk: legacy console [ttyAMA0] enabled
Mar 12 23:48:29.131712 kernel: printk: legacy bootconsole [pl11] disabled
Mar 12 23:48:29.131716 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 12 23:48:29.131721 kernel: ACPI: CPU0 has been hot-added
Mar 12 23:48:29.131726 kernel: ACPI: CPU1 has been hot-added
Mar 12 23:48:29.131731 kernel: iommu: Default domain type: Translated
Mar 12 23:48:29.131735 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 12 23:48:29.131740 kernel: efivars: Registered efivars operations
Mar 12 23:48:29.131746 kernel: vgaarb: loaded
Mar 12 23:48:29.131751 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 12 23:48:29.131755 kernel: VFS: Disk quotas dquot_6.6.0
Mar 12 23:48:29.131760 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 12 23:48:29.131765 kernel: pnp: PnP ACPI init
Mar 12 23:48:29.131769 kernel: pnp: PnP ACPI: found 0 devices
Mar 12 23:48:29.131774 kernel: NET: Registered PF_INET protocol family
Mar 12 23:48:29.131779 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 12 23:48:29.131784 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 12 23:48:29.131790 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 12 23:48:29.131794 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 23:48:29.131799 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 12 23:48:29.131804 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 12 23:48:29.131809 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 12 23:48:29.131814 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 12 23:48:29.131818 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 12 23:48:29.131823 kernel: PCI: CLS 0 bytes, default 64
Mar 12 23:48:29.131828 kernel: kvm [1]: HYP mode not available
Mar 12 23:48:29.131833 kernel: Initialise system trusted keyrings
Mar 12 23:48:29.131838 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 12 23:48:29.131843 kernel: Key type asymmetric registered
Mar 12 23:48:29.131847 kernel: Asymmetric key parser 'x509' registered
Mar 12 23:48:29.131852 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Mar 12 23:48:29.131857 kernel: io scheduler mq-deadline registered
Mar 12 23:48:29.131862 kernel: io scheduler kyber registered
Mar 12 23:48:29.131866 kernel: io scheduler bfq registered
Mar 12 23:48:29.131871 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 12 23:48:29.131877 kernel: thunder_xcv, ver 1.0
Mar 12 23:48:29.131881 kernel: thunder_bgx, ver 1.0
Mar 12 23:48:29.131886 kernel: nicpf, ver 1.0
Mar 12 23:48:29.131891 kernel: nicvf, ver 1.0
Mar 12 23:48:29.132010 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 12 23:48:29.132062 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-12T23:48:28 UTC (1773359308)
Mar 12 23:48:29.132069 kernel: efifb: probing for efifb
Mar 12 23:48:29.132076 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 12 23:48:29.132081 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 12 23:48:29.132095 kernel: efifb: scrolling: redraw
Mar 12 23:48:29.132100 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 12 23:48:29.132105 kernel: Console: switching to colour frame buffer device 128x48
Mar 12 23:48:29.132109 kernel: fb0: EFI VGA frame buffer device
Mar 12 23:48:29.132114 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 12 23:48:29.132119 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 12 23:48:29.132124 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 12 23:48:29.132130 kernel: NET: Registered PF_INET6 protocol family
Mar 12 23:48:29.132134 kernel: watchdog: NMI not fully supported
Mar 12 23:48:29.132139 kernel: watchdog: Hard watchdog permanently disabled
Mar 12 23:48:29.132144 kernel: Segment Routing with IPv6
Mar 12 23:48:29.132149 kernel: In-situ OAM (IOAM) with IPv6
Mar 12 23:48:29.132153 kernel: NET: Registered PF_PACKET protocol family
Mar 12 23:48:29.132158 kernel: Key type dns_resolver registered
Mar 12 23:48:29.132163 kernel: registered taskstats version 1
Mar 12 23:48:29.132167 kernel: Loading compiled-in X.509 certificates
Mar 12 23:48:29.132172 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 653709f5ad64856a37b70c07139630123477ee1c'
Mar 12 23:48:29.132178 kernel: Demotion targets for Node 0: null
Mar 12 23:48:29.132182 kernel: Key type .fscrypt registered
Mar 12 23:48:29.132187 kernel: Key type fscrypt-provisioning registered
Mar 12 23:48:29.132192 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 12 23:48:29.132196 kernel: ima: Allocated hash algorithm: sha1
Mar 12 23:48:29.132201 kernel: ima: No architecture policies found
Mar 12 23:48:29.132206 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 12 23:48:29.132211 kernel: clk: Disabling unused clocks
Mar 12 23:48:29.132216 kernel: PM: genpd: Disabling unused power domains
Mar 12 23:48:29.132222 kernel: Warning: unable to open an initial console.
Mar 12 23:48:29.132226 kernel: Freeing unused kernel memory: 39552K
Mar 12 23:48:29.132231 kernel: Run /init as init process
Mar 12 23:48:29.132236 kernel: with arguments:
Mar 12 23:48:29.132241 kernel: /init
Mar 12 23:48:29.132245 kernel: with environment:
Mar 12 23:48:29.132250 kernel: HOME=/
Mar 12 23:48:29.132254 kernel: TERM=linux
Mar 12 23:48:29.132260 systemd[1]: Successfully made /usr/ read-only.
Mar 12 23:48:29.132268 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 12 23:48:29.132274 systemd[1]: Detected virtualization microsoft.
Mar 12 23:48:29.132279 systemd[1]: Detected architecture arm64.
Mar 12 23:48:29.132284 systemd[1]: Running in initrd.
Mar 12 23:48:29.132289 systemd[1]: No hostname configured, using default hostname.
Mar 12 23:48:29.132294 systemd[1]: Hostname set to .
Mar 12 23:48:29.132299 systemd[1]: Initializing machine ID from random generator.
Mar 12 23:48:29.132305 systemd[1]: Queued start job for default target initrd.target.
Mar 12 23:48:29.132311 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 23:48:29.132316 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:48:29.132322 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 12 23:48:29.132327 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 12 23:48:29.132333 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 12 23:48:29.132338 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 12 23:48:29.132345 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 12 23:48:29.132351 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 12 23:48:29.132356 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 23:48:29.132361 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 23:48:29.132366 systemd[1]: Reached target paths.target - Path Units.
Mar 12 23:48:29.132371 systemd[1]: Reached target slices.target - Slice Units.
Mar 12 23:48:29.132376 systemd[1]: Reached target swap.target - Swaps.
Mar 12 23:48:29.132382 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 23:48:29.132388 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 12 23:48:29.132394 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 12 23:48:29.132399 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 12 23:48:29.132404 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 12 23:48:29.132409 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:48:29.132415 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:48:29.132420 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:48:29.132425 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 23:48:29.132430 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 12 23:48:29.132437 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 23:48:29.132442 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 12 23:48:29.132447 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 12 23:48:29.132453 systemd[1]: Starting systemd-fsck-usr.service...
Mar 12 23:48:29.132458 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 23:48:29.132463 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 23:48:29.132480 systemd-journald[225]: Collecting audit messages is disabled.
Mar 12 23:48:29.132496 systemd-journald[225]: Journal started
Mar 12 23:48:29.132510 systemd-journald[225]: Runtime Journal (/run/log/journal/396347f79f274bf3b10ba890f374efb2) is 8M, max 78.3M, 70.3M free.
Mar 12 23:48:29.140128 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:48:29.145700 systemd-modules-load[227]: Inserted module 'overlay'
Mar 12 23:48:29.169109 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 12 23:48:29.173201 systemd-modules-load[227]: Inserted module 'br_netfilter'
Mar 12 23:48:29.181482 kernel: Bridge firewalling registered
Mar 12 23:48:29.181512 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 23:48:29.187389 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 12 23:48:29.192592 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 23:48:29.203113 systemd[1]: Finished systemd-fsck-usr.service.
Mar 12 23:48:29.211946 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:48:29.221247 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:48:29.233474 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 12 23:48:29.254565 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 23:48:29.270925 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 12 23:48:29.278661 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 23:48:29.293113 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:48:29.305642 systemd-tmpfiles[255]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 12 23:48:29.319114 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 23:48:29.331130 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 23:48:29.338491 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:48:29.352934 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 12 23:48:29.372880 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 23:48:29.385477 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 23:48:29.404599 dracut-cmdline[264]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:48:29.430950 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:48:29.447627 systemd-resolved[266]: Positive Trust Anchors:
Mar 12 23:48:29.447641 systemd-resolved[266]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 23:48:29.447660 systemd-resolved[266]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 23:48:29.449345 systemd-resolved[266]: Defaulting to hostname 'linux'.
Mar 12 23:48:29.451116 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 23:48:29.456196 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 23:48:29.563115 kernel: SCSI subsystem initialized
Mar 12 23:48:29.568096 kernel: Loading iSCSI transport class v2.0-870.
Mar 12 23:48:29.576250 kernel: iscsi: registered transport (tcp)
Mar 12 23:48:29.589338 kernel: iscsi: registered transport (qla4xxx)
Mar 12 23:48:29.589353 kernel: QLogic iSCSI HBA Driver
Mar 12 23:48:29.603629 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 23:48:29.624978 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:48:29.631931 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 23:48:29.684480 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 12 23:48:29.690696 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 12 23:48:29.755107 kernel: raid6: neonx8 gen() 18543 MB/s
Mar 12 23:48:29.774095 kernel: raid6: neonx4 gen() 18540 MB/s
Mar 12 23:48:29.793096 kernel: raid6: neonx2 gen() 17087 MB/s
Mar 12 23:48:29.814096 kernel: raid6: neonx1 gen() 15117 MB/s
Mar 12 23:48:29.833093 kernel: raid6: int64x8 gen() 10543 MB/s
Mar 12 23:48:29.852198 kernel: raid6: int64x4 gen() 10612 MB/s
Mar 12 23:48:29.872117 kernel: raid6: int64x2 gen() 8997 MB/s
Mar 12 23:48:29.894035 kernel: raid6: int64x1 gen() 7056 MB/s
Mar 12 23:48:29.894135 kernel: raid6: using algorithm neonx8 gen() 18543 MB/s
Mar 12 23:48:29.915663 kernel: raid6: .... xor() 14912 MB/s, rmw enabled
Mar 12 23:48:29.915671 kernel: raid6: using neon recovery algorithm
Mar 12 23:48:29.924411 kernel: xor: measuring software checksum speed
Mar 12 23:48:29.924420 kernel: 8regs : 28615 MB/sec
Mar 12 23:48:29.927985 kernel: 32regs : 28765 MB/sec
Mar 12 23:48:29.930728 kernel: arm64_neon : 37638 MB/sec
Mar 12 23:48:29.934372 kernel: xor: using function: arm64_neon (37638 MB/sec)
Mar 12 23:48:29.975112 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 12 23:48:29.980487 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 12 23:48:29.992713 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:48:30.021826 systemd-udevd[477]: Using default interface naming scheme 'v255'.
Mar 12 23:48:30.028953 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:48:30.043242 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 12 23:48:30.068712 dracut-pre-trigger[491]: rd.md=0: removing MD RAID activation
Mar 12 23:48:30.090621 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 12 23:48:30.098208 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 23:48:30.148867 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 23:48:30.163239 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 12 23:48:30.222109 kernel: hv_vmbus: Vmbus version:5.3 Mar 12 23:48:30.250575 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 12 23:48:30.250631 kernel: hv_vmbus: registering driver hid_hyperv Mar 12 23:48:30.250640 kernel: hv_vmbus: registering driver hv_storvsc Mar 12 23:48:30.250646 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 12 23:48:30.250653 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 12 23:48:30.258117 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 23:48:30.271407 kernel: scsi host0: storvsc_host_t Mar 12 23:48:30.271449 kernel: hv_vmbus: registering driver hv_netvsc Mar 12 23:48:30.271457 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Mar 12 23:48:30.258506 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:48:30.311740 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Mar 12 23:48:30.311758 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 12 23:48:30.311898 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 12 23:48:30.311917 kernel: scsi host1: storvsc_host_t Mar 12 23:48:30.311997 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Mar 12 23:48:30.309251 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:48:30.318341 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:48:30.335389 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 12 23:48:30.351524 kernel: PTP clock support registered Mar 12 23:48:30.351545 kernel: hv_utils: Registering HyperV Utility Driver Mar 12 23:48:30.361318 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 12 23:48:30.361514 kernel: hv_vmbus: registering driver hv_utils Mar 12 23:48:30.361523 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Mar 12 23:48:30.374420 kernel: hv_utils: Heartbeat IC version 3.0 Mar 12 23:48:30.374468 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 12 23:48:30.374624 kernel: hv_utils: Shutdown IC version 3.2 Mar 12 23:48:30.374641 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 12 23:48:30.378047 kernel: hv_utils: TimeSync IC version 4.0 Mar 12 23:48:30.378059 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 12 23:48:30.460817 systemd-resolved[266]: Clock change detected. Flushing caches. Mar 12 23:48:30.476474 kernel: hv_netvsc 7ced8d88-f908-7ced-8d88-f9087ced8d88 eth0: VF slot 1 added Mar 12 23:48:30.477118 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 12 23:48:30.494368 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:48:30.494417 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 12 23:48:30.502817 kernel: hv_vmbus: registering driver hv_pci Mar 12 23:48:30.502867 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 12 23:48:30.510071 kernel: hv_pci 16f6dbb0-cec6-4675-9efe-04232bfe3b11: PCI VMBus probing: Using version 0x10004 Mar 12 23:48:30.510285 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 12 23:48:30.523151 kernel: hv_pci 16f6dbb0-cec6-4675-9efe-04232bfe3b11: PCI host bridge to bus cec6:00 Mar 12 23:48:30.523375 kernel: pci_bus cec6:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 12 23:48:30.524823 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 12 23:48:30.524974 kernel: pci_bus cec6:00: No busn resource found for root bus, will use [bus 00-ff] Mar 12 23:48:30.535907 kernel: pci cec6:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Mar 12 23:48:30.541819 kernel: pci cec6:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 12 23:48:30.546811 kernel: pci cec6:00:02.0: enabling Extended Tags Mar 12 23:48:30.565169 kernel: pci cec6:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at cec6:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Mar 12 23:48:30.577598 kernel: pci_bus cec6:00: busn_res: [bus 00-ff] end is updated to 00 Mar 12 23:48:30.577779 kernel: pci cec6:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Mar 12 23:48:30.587825 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#216 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 12 23:48:30.613884 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#233 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 12 23:48:30.653675 kernel: mlx5_core cec6:00:02.0: enabling device (0000 -> 0002) Mar 12 23:48:30.663440 kernel: mlx5_core cec6:00:02.0: PTM is not supported by PCIe Mar 12 23:48:30.663559 kernel: mlx5_core cec6:00:02.0: firmware 
version: 16.30.5026 Mar 12 23:48:30.845673 kernel: hv_netvsc 7ced8d88-f908-7ced-8d88-f9087ced8d88 eth0: VF registering: eth1 Mar 12 23:48:30.845891 kernel: mlx5_core cec6:00:02.0 eth1: joined to eth0 Mar 12 23:48:30.851812 kernel: mlx5_core cec6:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 12 23:48:30.862830 kernel: mlx5_core cec6:00:02.0 enP52934s1: renamed from eth1 Mar 12 23:48:31.021350 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 12 23:48:31.122007 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 12 23:48:31.184029 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Mar 12 23:48:31.190892 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 12 23:48:31.197466 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 12 23:48:31.216951 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 12 23:48:31.232765 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 12 23:48:31.239124 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:48:31.249796 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:48:31.261627 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 23:48:31.268142 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 12 23:48:31.298744 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:48:31.315074 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 12 23:48:32.318869 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:48:32.319179 disk-uuid[659]: The operation has completed successfully. Mar 12 23:48:32.400745 systemd[1]: disk-uuid.service: Deactivated successfully. 
Mar 12 23:48:32.402821 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 12 23:48:32.423955 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 12 23:48:32.441239 sh[825]: Success Mar 12 23:48:32.476184 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 12 23:48:32.476250 kernel: device-mapper: uevent: version 1.0.3 Mar 12 23:48:32.482750 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 12 23:48:32.492825 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 12 23:48:32.753319 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 12 23:48:32.774025 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 12 23:48:32.779937 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 12 23:48:32.808974 kernel: BTRFS: device fsid fcbb17b2-5053-44fc-82f0-b24e4919d6d8 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (843) Mar 12 23:48:32.820735 kernel: BTRFS info (device dm-0): first mount of filesystem fcbb17b2-5053-44fc-82f0-b24e4919d6d8 Mar 12 23:48:32.820784 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:48:33.135043 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 12 23:48:33.135122 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 12 23:48:33.167690 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 12 23:48:33.172266 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:48:33.180535 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 12 23:48:33.181364 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Mar 12 23:48:33.204516 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 12 23:48:33.236842 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (866) Mar 12 23:48:33.248516 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:48:33.248577 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:48:33.275281 kernel: BTRFS info (device sda6): turning on async discard Mar 12 23:48:33.275357 kernel: BTRFS info (device sda6): enabling free space tree Mar 12 23:48:33.286835 kernel: BTRFS info (device sda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:48:33.287688 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 12 23:48:33.295966 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 12 23:48:33.343589 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 23:48:33.356592 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 12 23:48:33.390226 systemd-networkd[1012]: lo: Link UP Mar 12 23:48:33.390240 systemd-networkd[1012]: lo: Gained carrier Mar 12 23:48:33.391084 systemd-networkd[1012]: Enumeration completed Mar 12 23:48:33.391173 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 12 23:48:33.398516 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:48:33.398520 systemd-networkd[1012]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 23:48:33.398788 systemd[1]: Reached target network.target - Network. 
Mar 12 23:48:33.479823 kernel: mlx5_core cec6:00:02.0 enP52934s1: Link up Mar 12 23:48:33.522819 kernel: hv_netvsc 7ced8d88-f908-7ced-8d88-f9087ced8d88 eth0: Data path switched to VF: enP52934s1 Mar 12 23:48:33.523200 systemd-networkd[1012]: enP52934s1: Link UP Mar 12 23:48:33.523271 systemd-networkd[1012]: eth0: Link UP Mar 12 23:48:33.523405 systemd-networkd[1012]: eth0: Gained carrier Mar 12 23:48:33.523421 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:48:33.545586 systemd-networkd[1012]: enP52934s1: Gained carrier Mar 12 23:48:33.555843 systemd-networkd[1012]: eth0: DHCPv4 address 10.200.20.40/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 12 23:48:34.499184 ignition[951]: Ignition 2.22.0 Mar 12 23:48:34.499198 ignition[951]: Stage: fetch-offline Mar 12 23:48:34.499310 ignition[951]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:34.506630 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 23:48:34.499318 ignition[951]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:34.515562 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 12 23:48:34.499406 ignition[951]: parsed url from cmdline: "" Mar 12 23:48:34.499408 ignition[951]: no config URL provided Mar 12 23:48:34.499412 ignition[951]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 23:48:34.499417 ignition[951]: no config at "/usr/lib/ignition/user.ign" Mar 12 23:48:34.499421 ignition[951]: failed to fetch config: resource requires networking Mar 12 23:48:34.502382 ignition[951]: Ignition finished successfully Mar 12 23:48:34.557459 ignition[1023]: Ignition 2.22.0 Mar 12 23:48:34.557478 ignition[1023]: Stage: fetch Mar 12 23:48:34.557841 ignition[1023]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:34.557851 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:34.557938 ignition[1023]: parsed url from cmdline: "" Mar 12 23:48:34.557941 ignition[1023]: no config URL provided Mar 12 23:48:34.557945 ignition[1023]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 23:48:34.557950 ignition[1023]: no config at "/usr/lib/ignition/user.ign" Mar 12 23:48:34.557968 ignition[1023]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 12 23:48:34.691501 ignition[1023]: GET result: OK Mar 12 23:48:34.691576 ignition[1023]: config has been read from IMDS userdata Mar 12 23:48:34.691598 ignition[1023]: parsing config with SHA512: 4ab47783fd8044dc45df2d3abac2b69ebf41ca049dc59b5edfa648d611ebe8040824dd8c6b78b6bc5660dbf18e65f6f3bf49cf3cb4586496b042d85b67bd80f0 Mar 12 23:48:34.697922 unknown[1023]: fetched base config from "system" Mar 12 23:48:34.698227 ignition[1023]: fetch: fetch complete Mar 12 23:48:34.697931 unknown[1023]: fetched base config from "system" Mar 12 23:48:34.698231 ignition[1023]: fetch: fetch passed Mar 12 23:48:34.697936 unknown[1023]: fetched user config from "azure" Mar 12 23:48:34.698271 ignition[1023]: Ignition finished successfully Mar 12 23:48:34.700038 systemd[1]: Finished ignition-fetch.service 
- Ignition (fetch). Mar 12 23:48:34.709039 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 12 23:48:34.751631 ignition[1030]: Ignition 2.22.0 Mar 12 23:48:34.751644 ignition[1030]: Stage: kargs Mar 12 23:48:34.751856 ignition[1030]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:34.751864 ignition[1030]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:34.758851 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 12 23:48:34.752476 ignition[1030]: kargs: kargs passed Mar 12 23:48:34.768956 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 12 23:48:34.752522 ignition[1030]: Ignition finished successfully Mar 12 23:48:34.801748 ignition[1037]: Ignition 2.22.0 Mar 12 23:48:34.801764 ignition[1037]: Stage: disks Mar 12 23:48:34.807838 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 12 23:48:34.801967 ignition[1037]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:34.813502 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 12 23:48:34.801974 ignition[1037]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:34.823964 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 12 23:48:34.802465 ignition[1037]: disks: disks passed Mar 12 23:48:34.833788 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 23:48:34.802508 ignition[1037]: Ignition finished successfully Mar 12 23:48:34.843495 systemd[1]: Reached target sysinit.target - System Initialization. Mar 12 23:48:34.852592 systemd[1]: Reached target basic.target - Basic System. Mar 12 23:48:34.863686 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 12 23:48:34.955322 systemd-fsck[1045]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Mar 12 23:48:34.964180 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. 
Mar 12 23:48:34.971386 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 12 23:48:35.211822 kernel: EXT4-fs (sda9): mounted filesystem 4b09db19-3beb-48c2-8dcb-3eec5602206c r/w with ordered data mode. Quota mode: none. Mar 12 23:48:35.212736 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 12 23:48:35.217132 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 12 23:48:35.242683 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 23:48:35.248983 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 12 23:48:35.269231 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 12 23:48:35.281523 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 12 23:48:35.281578 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 23:48:35.297612 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 12 23:48:35.306693 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 12 23:48:35.334393 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1059) Mar 12 23:48:35.334439 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:48:35.344168 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:48:35.355226 kernel: BTRFS info (device sda6): turning on async discard Mar 12 23:48:35.355273 kernel: BTRFS info (device sda6): enabling free space tree Mar 12 23:48:35.357493 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 12 23:48:35.556931 systemd-networkd[1012]: eth0: Gained IPv6LL Mar 12 23:48:35.752865 coreos-metadata[1061]: Mar 12 23:48:35.752 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 12 23:48:35.762661 coreos-metadata[1061]: Mar 12 23:48:35.762 INFO Fetch successful Mar 12 23:48:35.766863 coreos-metadata[1061]: Mar 12 23:48:35.766 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 12 23:48:35.776189 coreos-metadata[1061]: Mar 12 23:48:35.775 INFO Fetch successful Mar 12 23:48:35.788778 coreos-metadata[1061]: Mar 12 23:48:35.788 INFO wrote hostname ci-4459.2.4-n-6470b86a4c to /sysroot/etc/hostname Mar 12 23:48:35.796284 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 12 23:48:36.075313 initrd-setup-root[1089]: cut: /sysroot/etc/passwd: No such file or directory Mar 12 23:48:36.138226 initrd-setup-root[1096]: cut: /sysroot/etc/group: No such file or directory Mar 12 23:48:36.145388 initrd-setup-root[1103]: cut: /sysroot/etc/shadow: No such file or directory Mar 12 23:48:36.153657 initrd-setup-root[1110]: cut: /sysroot/etc/gshadow: No such file or directory Mar 12 23:48:37.042832 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 12 23:48:37.049252 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 12 23:48:37.067219 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 12 23:48:37.080920 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 12 23:48:37.092177 kernel: BTRFS info (device sda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:48:37.116923 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 12 23:48:37.121360 ignition[1178]: INFO : Ignition 2.22.0 Mar 12 23:48:37.121360 ignition[1178]: INFO : Stage: mount Mar 12 23:48:37.121360 ignition[1178]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:37.121360 ignition[1178]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:37.121360 ignition[1178]: INFO : mount: mount passed Mar 12 23:48:37.121360 ignition[1178]: INFO : Ignition finished successfully Mar 12 23:48:37.125782 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 12 23:48:37.135189 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 12 23:48:37.168155 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 23:48:37.194977 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1190) Mar 12 23:48:37.204778 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:48:37.204824 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:48:37.214312 kernel: BTRFS info (device sda6): turning on async discard Mar 12 23:48:37.214368 kernel: BTRFS info (device sda6): enabling free space tree Mar 12 23:48:37.215908 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 12 23:48:37.245369 ignition[1207]: INFO : Ignition 2.22.0 Mar 12 23:48:37.249634 ignition[1207]: INFO : Stage: files Mar 12 23:48:37.249634 ignition[1207]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:37.249634 ignition[1207]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:37.249634 ignition[1207]: DEBUG : files: compiled without relabeling support, skipping Mar 12 23:48:37.267656 ignition[1207]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 12 23:48:37.267656 ignition[1207]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 12 23:48:37.315031 ignition[1207]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 12 23:48:37.321584 ignition[1207]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 12 23:48:37.321584 ignition[1207]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 12 23:48:37.315974 unknown[1207]: wrote ssh authorized keys file for user: core Mar 12 23:48:37.386983 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 12 23:48:37.396074 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 12 23:48:37.514432 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 12 23:48:38.148220 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 12 23:48:38.148220 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:48:38.166112 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:38.221063 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1 Mar 12 23:48:38.541967 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 12 23:48:38.952196 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:38.952196 ignition[1207]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 12 23:48:38.968075 ignition[1207]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:48:38.981552 ignition[1207]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:48:38.981552 ignition[1207]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 12 23:48:39.008074 ignition[1207]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 12 23:48:39.008074 ignition[1207]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 12 23:48:39.008074 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:48:39.008074 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:48:39.008074 ignition[1207]: INFO : files: files passed Mar 12 23:48:39.008074 ignition[1207]: INFO : Ignition finished successfully Mar 12 23:48:38.983667 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 12 23:48:38.995884 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 12 23:48:39.030410 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 12 23:48:39.050512 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 12 23:48:39.052373 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 12 23:48:39.089497 initrd-setup-root-after-ignition[1236]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:39.089497 initrd-setup-root-after-ignition[1236]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:39.085006 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:48:39.127046 initrd-setup-root-after-ignition[1240]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:39.095157 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 12 23:48:39.107659 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 12 23:48:39.164449 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 12 23:48:39.164558 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 12 23:48:39.174505 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 12 23:48:39.184335 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 12 23:48:39.193187 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 12 23:48:39.194026 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 12 23:48:39.229651 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:48:39.237446 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 12 23:48:39.270674 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:48:39.276462 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 12 23:48:39.286699 systemd[1]: Stopped target timers.target - Timer Units. Mar 12 23:48:39.295945 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 12 23:48:39.296066 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:48:39.309172 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 12 23:48:39.313980 systemd[1]: Stopped target basic.target - Basic System. Mar 12 23:48:39.322916 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 12 23:48:39.331627 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 23:48:39.340298 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 12 23:48:39.349667 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:48:39.359606 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 12 23:48:39.369374 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:48:39.379829 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 12 23:48:39.388776 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 12 23:48:39.398157 systemd[1]: Stopped target swap.target - Swaps. Mar 12 23:48:39.406579 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 12 23:48:39.406689 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 12 23:48:39.420537 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:48:39.425494 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:48:39.434881 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 12 23:48:39.434944 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:48:39.444188 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Mar 12 23:48:39.444296 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 12 23:48:39.458840 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 12 23:48:39.458948 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:48:39.465137 systemd[1]: ignition-files.service: Deactivated successfully. Mar 12 23:48:39.465222 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 12 23:48:39.529115 ignition[1260]: INFO : Ignition 2.22.0 Mar 12 23:48:39.529115 ignition[1260]: INFO : Stage: umount Mar 12 23:48:39.529115 ignition[1260]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:39.529115 ignition[1260]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 12 23:48:39.529115 ignition[1260]: INFO : umount: umount passed Mar 12 23:48:39.529115 ignition[1260]: INFO : Ignition finished successfully Mar 12 23:48:39.473263 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 12 23:48:39.473333 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 12 23:48:39.484603 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 12 23:48:39.499127 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 12 23:48:39.499289 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:48:39.522039 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 12 23:48:39.533052 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 12 23:48:39.533350 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 23:48:39.541252 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 12 23:48:39.541376 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 23:48:39.556573 systemd[1]: ignition-mount.service: Deactivated successfully. 
Mar 12 23:48:39.556679 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 12 23:48:39.571611 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 12 23:48:39.571707 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 12 23:48:39.578178 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 12 23:48:39.578247 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 12 23:48:39.586465 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 12 23:48:39.586497 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 12 23:48:39.600379 systemd[1]: Stopped target network.target - Network.
Mar 12 23:48:39.613856 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 12 23:48:39.613950 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 12 23:48:39.619202 systemd[1]: Stopped target paths.target - Path Units.
Mar 12 23:48:39.628474 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 12 23:48:39.633038 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:48:39.638318 systemd[1]: Stopped target slices.target - Slice Units.
Mar 12 23:48:39.647043 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 12 23:48:39.655873 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 12 23:48:39.655921 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 12 23:48:39.664796 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 12 23:48:39.664833 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 12 23:48:39.673826 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 12 23:48:39.673882 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 12 23:48:39.683049 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 12 23:48:39.683083 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 12 23:48:39.692231 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 12 23:48:39.701218 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 12 23:48:39.710924 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 12 23:48:39.711431 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 12 23:48:39.711524 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 12 23:48:39.725703 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 12 23:48:39.925853 kernel: hv_netvsc 7ced8d88-f908-7ced-8d88-f9087ced8d88 eth0: Data path switched from VF: enP52934s1
Mar 12 23:48:39.725783 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 12 23:48:39.740545 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 12 23:48:39.740751 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 12 23:48:39.740874 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 12 23:48:39.755429 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 12 23:48:39.757766 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 12 23:48:39.765695 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 12 23:48:39.765745 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:48:39.775174 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 12 23:48:39.794195 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 12 23:48:39.794283 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 12 23:48:39.804109 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 12 23:48:39.804158 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:48:39.817008 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 12 23:48:39.817059 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:48:39.822074 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 12 23:48:39.822116 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:48:39.835679 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:48:39.844704 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 12 23:48:39.844767 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:48:39.873734 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 12 23:48:39.873914 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:48:39.879831 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 12 23:48:39.879874 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:48:39.888951 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 12 23:48:39.888986 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:48:39.893568 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 12 23:48:39.893617 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 12 23:48:39.905870 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 12 23:48:39.905916 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 12 23:48:39.925971 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 12 23:48:39.926027 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 23:48:39.937079 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 12 23:48:39.954879 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 12 23:48:39.954975 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:48:39.970850 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 12 23:48:39.970910 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:48:39.982255 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:48:39.982312 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:48:39.997033 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 12 23:48:39.997085 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 12 23:48:39.997114 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:48:39.997375 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 12 23:48:39.997500 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 12 23:48:40.034878 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 12 23:48:40.035012 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 12 23:48:40.177487 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 12 23:48:40.177621 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 12 23:48:40.186145 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 12 23:48:40.195521 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 12 23:48:40.195594 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 12 23:48:40.206569 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 12 23:48:40.251110 systemd[1]: Switching root.
Mar 12 23:48:40.360829 systemd-journald[225]: Journal stopped
Mar 12 23:48:44.460337 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 12 23:48:44.460357 kernel: SELinux: policy capability network_peer_controls=1
Mar 12 23:48:44.460364 kernel: SELinux: policy capability open_perms=1
Mar 12 23:48:44.460370 kernel: SELinux: policy capability extended_socket_class=1
Mar 12 23:48:44.460376 kernel: SELinux: policy capability always_check_network=0
Mar 12 23:48:44.460381 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 12 23:48:44.460387 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 12 23:48:44.460393 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 12 23:48:44.460398 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 12 23:48:44.460403 kernel: SELinux: policy capability userspace_initial_context=0
Mar 12 23:48:44.460409 systemd[1]: Successfully loaded SELinux policy in 188.944ms.
Mar 12 23:48:44.460416 kernel: audit: type=1403 audit(1773359321.047:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 12 23:48:44.460424 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.478ms.
Mar 12 23:48:44.460431 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 12 23:48:44.460438 systemd[1]: Detected virtualization microsoft.
Mar 12 23:48:44.460445 systemd[1]: Detected architecture arm64.
Mar 12 23:48:44.460450 systemd[1]: Detected first boot.
Mar 12 23:48:44.460457 systemd[1]: Hostname set to .
Mar 12 23:48:44.460463 systemd[1]: Initializing machine ID from random generator.
Mar 12 23:48:44.460468 zram_generator::config[1303]: No configuration found.
Mar 12 23:48:44.460475 kernel: NET: Registered PF_VSOCK protocol family
Mar 12 23:48:44.460480 systemd[1]: Populated /etc with preset unit settings.
Mar 12 23:48:44.460486 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 12 23:48:44.460493 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 12 23:48:44.460499 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 12 23:48:44.460505 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 12 23:48:44.460511 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 12 23:48:44.460517 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 12 23:48:44.460523 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 12 23:48:44.460529 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 12 23:48:44.460536 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 12 23:48:44.460542 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 12 23:48:44.460548 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 12 23:48:44.460555 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 12 23:48:44.460561 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 23:48:44.460567 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:48:44.460573 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 12 23:48:44.460579 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 12 23:48:44.460586 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 12 23:48:44.460592 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 12 23:48:44.460600 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 12 23:48:44.460606 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 23:48:44.460613 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 23:48:44.460619 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 12 23:48:44.460625 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 12 23:48:44.460631 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 12 23:48:44.460638 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 12 23:48:44.460644 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 12 23:48:44.460650 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 12 23:48:44.460656 systemd[1]: Reached target slices.target - Slice Units.
Mar 12 23:48:44.460662 systemd[1]: Reached target swap.target - Swaps.
Mar 12 23:48:44.460668 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 12 23:48:44.460674 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 12 23:48:44.460682 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 12 23:48:44.460688 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:48:44.460695 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:48:44.460701 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:48:44.460707 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 12 23:48:44.460713 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 12 23:48:44.460720 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 12 23:48:44.460726 systemd[1]: Mounting media.mount - External Media Directory...
Mar 12 23:48:44.460732 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 12 23:48:44.460738 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 12 23:48:44.460744 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 12 23:48:44.460751 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 12 23:48:44.460757 systemd[1]: Reached target machines.target - Containers.
Mar 12 23:48:44.460763 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 12 23:48:44.460771 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:48:44.460777 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 23:48:44.460783 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 12 23:48:44.460789 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:48:44.460796 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 23:48:44.460816 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:48:44.460822 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 12 23:48:44.460828 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:48:44.460837 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 12 23:48:44.460843 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 12 23:48:44.460850 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 12 23:48:44.460856 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 12 23:48:44.460862 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 12 23:48:44.460868 kernel: fuse: init (API version 7.41)
Mar 12 23:48:44.460874 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:48:44.460880 kernel: loop: module loaded
Mar 12 23:48:44.460886 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 23:48:44.460893 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 23:48:44.460899 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 23:48:44.460905 kernel: ACPI: bus type drm_connector registered
Mar 12 23:48:44.460926 systemd-journald[1407]: Collecting audit messages is disabled.
Mar 12 23:48:44.460941 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 12 23:48:44.460948 systemd-journald[1407]: Journal started
Mar 12 23:48:44.460963 systemd-journald[1407]: Runtime Journal (/run/log/journal/9c74f5b083814b82b7a8f9359fbc4a75) is 8M, max 78.3M, 70.3M free.
Mar 12 23:48:43.652350 systemd[1]: Queued start job for default target multi-user.target.
Mar 12 23:48:43.673403 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 12 23:48:43.673838 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 12 23:48:43.674126 systemd[1]: systemd-journald.service: Consumed 2.614s CPU time.
Mar 12 23:48:44.485928 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 12 23:48:44.500942 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 23:48:44.508523 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 12 23:48:44.508574 systemd[1]: Stopped verity-setup.service.
Mar 12 23:48:44.523341 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 23:48:44.524170 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 12 23:48:44.528961 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 12 23:48:44.534282 systemd[1]: Mounted media.mount - External Media Directory.
Mar 12 23:48:44.538628 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 12 23:48:44.543290 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 12 23:48:44.548469 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 12 23:48:44.553007 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 12 23:48:44.558134 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 23:48:44.563445 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 12 23:48:44.563659 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 12 23:48:44.569980 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:48:44.570197 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:48:44.575283 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 23:48:44.575483 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 23:48:44.580600 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:48:44.580796 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:48:44.586188 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 12 23:48:44.586387 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 12 23:48:44.591212 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:48:44.591400 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:48:44.596480 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:48:44.601720 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:48:44.607531 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 12 23:48:44.613701 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 12 23:48:44.620731 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 23:48:44.633636 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 23:48:44.639657 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 12 23:48:44.650896 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 12 23:48:44.655983 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 12 23:48:44.656017 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 23:48:44.661448 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 12 23:48:44.668941 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 12 23:48:44.674170 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:48:44.675690 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 12 23:48:44.681601 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 12 23:48:44.686777 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 23:48:44.687773 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 12 23:48:44.693307 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 23:48:44.696445 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 23:48:44.722949 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 12 23:48:44.730418 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 12 23:48:44.735920 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 12 23:48:44.741658 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 12 23:48:44.748441 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 12 23:48:44.755301 systemd-journald[1407]: Time spent on flushing to /var/log/journal/9c74f5b083814b82b7a8f9359fbc4a75 is 21.494ms for 928 entries.
Mar 12 23:48:44.755301 systemd-journald[1407]: System Journal (/var/log/journal/9c74f5b083814b82b7a8f9359fbc4a75) is 8M, max 2.6G, 2.6G free.
Mar 12 23:48:44.789614 systemd-journald[1407]: Received client request to flush runtime journal.
Mar 12 23:48:44.762326 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 12 23:48:44.769983 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 12 23:48:44.785006 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:48:44.791187 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 12 23:48:44.804487 kernel: loop0: detected capacity change from 0 to 27936
Mar 12 23:48:44.910830 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 12 23:48:44.912167 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 12 23:48:44.956755 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 12 23:48:44.966031 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 23:48:45.052823 systemd-tmpfiles[1457]: ACLs are not supported, ignoring.
Mar 12 23:48:45.052837 systemd-tmpfiles[1457]: ACLs are not supported, ignoring.
Mar 12 23:48:45.055869 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:48:45.268838 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 12 23:48:45.278007 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 12 23:48:45.286294 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:48:45.314583 systemd-udevd[1463]: Using default interface naming scheme 'v255'.
Mar 12 23:48:45.328825 kernel: loop1: detected capacity change from 0 to 200864
Mar 12 23:48:45.379831 kernel: loop2: detected capacity change from 0 to 100632
Mar 12 23:48:45.467016 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:48:45.476945 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 23:48:45.517145 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 12 23:48:45.525497 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 12 23:48:45.631834 kernel: mousedev: PS/2 mouse device common for all mice
Mar 12 23:48:45.639829 kernel: hv_vmbus: registering driver hyperv_fb
Mar 12 23:48:45.651225 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 12 23:48:45.651339 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#247 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 12 23:48:45.651579 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 12 23:48:45.664825 kernel: Console: switching to colour dummy device 80x25
Mar 12 23:48:45.663918 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 12 23:48:45.676103 kernel: Console: switching to colour frame buffer device 128x48
Mar 12 23:48:45.694485 kernel: hv_vmbus: registering driver hv_balloon
Mar 12 23:48:45.694585 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 12 23:48:45.694600 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 12 23:48:45.821840 kernel: loop3: detected capacity change from 0 to 119840
Mar 12 23:48:45.822457 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:48:45.832484 systemd-networkd[1479]: lo: Link UP
Mar 12 23:48:45.833507 systemd-networkd[1479]: lo: Gained carrier
Mar 12 23:48:45.834665 systemd-networkd[1479]: Enumeration completed
Mar 12 23:48:45.835504 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 23:48:45.841281 systemd-networkd[1479]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:48:45.842848 systemd-networkd[1479]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:48:45.845099 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 12 23:48:45.854312 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 12 23:48:45.862874 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:48:45.863070 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:48:45.872133 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:48:45.875040 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:48:45.886842 kernel: MACsec IEEE 802.1AE
Mar 12 23:48:45.916995 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:48:45.917469 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:48:45.918823 kernel: mlx5_core cec6:00:02.0 enP52934s1: Link up
Mar 12 23:48:45.923768 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:48:45.926824 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 12 23:48:45.935002 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 12 23:48:45.948080 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:48:45.948812 kernel: hv_netvsc 7ced8d88-f908-7ced-8d88-f9087ced8d88 eth0: Data path switched to VF: enP52934s1
Mar 12 23:48:45.953593 systemd-networkd[1479]: enP52934s1: Link UP
Mar 12 23:48:45.954146 systemd-networkd[1479]: eth0: Link UP
Mar 12 23:48:45.954260 systemd-networkd[1479]: eth0: Gained carrier
Mar 12 23:48:45.954320 systemd-networkd[1479]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:48:45.955360 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 12 23:48:45.962784 systemd-networkd[1479]: enP52934s1: Gained carrier
Mar 12 23:48:45.970880 systemd-networkd[1479]: eth0: DHCPv4 address 10.200.20.40/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 12 23:48:46.010749 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 12 23:48:46.187833 kernel: loop4: detected capacity change from 0 to 27936
Mar 12 23:48:46.201829 kernel: loop5: detected capacity change from 0 to 200864
Mar 12 23:48:46.219830 kernel: loop6: detected capacity change from 0 to 100632
Mar 12 23:48:46.233826 kernel: loop7: detected capacity change from 0 to 119840
Mar 12 23:48:46.242897 (sd-merge)[1614]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 12 23:48:46.243315 (sd-merge)[1614]: Merged extensions into '/usr'.
Mar 12 23:48:46.246072 systemd[1]: Reload requested from client PID 1443 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 12 23:48:46.246194 systemd[1]: Reloading...
Mar 12 23:48:46.307860 zram_generator::config[1647]: No configuration found.
Mar 12 23:48:46.478117 systemd[1]: Reloading finished in 231 ms.
Mar 12 23:48:46.496076 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:48:46.502221 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 12 23:48:46.519983 systemd[1]: Starting ensure-sysext.service...
Mar 12 23:48:46.526021 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 23:48:46.537652 systemd[1]: Reload requested from client PID 1702 ('systemctl') (unit ensure-sysext.service)...
Mar 12 23:48:46.537782 systemd[1]: Reloading...
Mar 12 23:48:46.572962 systemd-tmpfiles[1703]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 12 23:48:46.573387 systemd-tmpfiles[1703]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 12 23:48:46.573646 systemd-tmpfiles[1703]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 12 23:48:46.573819 systemd-tmpfiles[1703]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 12 23:48:46.574263 systemd-tmpfiles[1703]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 12 23:48:46.574405 systemd-tmpfiles[1703]: ACLs are not supported, ignoring.
Mar 12 23:48:46.574436 systemd-tmpfiles[1703]: ACLs are not supported, ignoring.
Mar 12 23:48:46.592218 systemd-tmpfiles[1703]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 23:48:46.592233 systemd-tmpfiles[1703]: Skipping /boot
Mar 12 23:48:46.598044 systemd-tmpfiles[1703]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 23:48:46.598056 systemd-tmpfiles[1703]: Skipping /boot
Mar 12 23:48:46.616823 zram_generator::config[1743]: No configuration found.
Mar 12 23:48:46.761990 systemd[1]: Reloading finished in 223 ms.
Mar 12 23:48:46.786832 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:48:46.800714 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 12 23:48:46.835858 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 12 23:48:46.841134 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:48:46.850083 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:48:46.859119 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:48:46.874538 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:48:46.880273 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:48:46.880560 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:48:46.882843 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 12 23:48:46.892021 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 23:48:46.898410 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 12 23:48:46.908001 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:48:46.908207 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:48:46.914148 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:48:46.914630 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:48:46.921538 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:48:46.921698 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:48:46.933297 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:48:46.936961 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:48:46.944032 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:48:46.951062 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:48:46.957716 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:48:46.958122 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:48:46.964293 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 12 23:48:46.971261 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:48:46.971423 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:48:46.977266 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:48:46.977398 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:48:46.984861 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:48:46.985008 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:48:46.996420 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:48:46.998216 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:48:47.009028 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 23:48:47.014811 augenrules[1832]: No rules
Mar 12 23:48:47.015099 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:48:47.023371 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:48:47.029001 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:48:47.029119 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:48:47.029230 systemd[1]: Reached target time-set.target - System Time Set.
Mar 12 23:48:47.034525 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 23:48:47.036852 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 12 23:48:47.042575 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 12 23:48:47.049011 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:48:47.049137 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:48:47.049137 systemd-resolved[1804]: Positive Trust Anchors:
Mar 12 23:48:47.049416 systemd-resolved[1804]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 23:48:47.049482 systemd-resolved[1804]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 23:48:47.054657 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 23:48:47.054792 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 23:48:47.061227 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:48:47.061348 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:48:47.062121 systemd-resolved[1804]: Using system hostname 'ci-4459.2.4-n-6470b86a4c'.
Mar 12 23:48:47.067309 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 23:48:47.072521 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:48:47.072683 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:48:47.081982 systemd[1]: Finished ensure-sysext.service.
Mar 12 23:48:47.088394 systemd[1]: Reached target network.target - Network.
Mar 12 23:48:47.093028 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 23:48:47.098200 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 23:48:47.098271 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 23:48:47.750570 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 12 23:48:47.756458 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 12 23:48:47.844960 systemd-networkd[1479]: eth0: Gained IPv6LL
Mar 12 23:48:47.847499 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 12 23:48:47.855382 systemd[1]: Reached target network-online.target - Network is Online.
Mar 12 23:48:51.283485 ldconfig[1437]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 12 23:48:51.296586 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 12 23:48:51.303950 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 12 23:48:51.325637 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 12 23:48:51.331130 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 23:48:51.335687 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 12 23:48:51.341145 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 12 23:48:51.347386 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 12 23:48:51.352430 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 12 23:48:51.358101 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 12 23:48:51.363861 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 12 23:48:51.363893 systemd[1]: Reached target paths.target - Path Units.
Mar 12 23:48:51.368197 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 23:48:51.374460 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 12 23:48:51.381328 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 12 23:48:51.387916 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 12 23:48:51.394221 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 12 23:48:51.400381 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 12 23:48:51.407204 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 12 23:48:51.413065 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 12 23:48:51.420435 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 12 23:48:51.426179 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 23:48:51.430781 systemd[1]: Reached target basic.target - Basic System.
Mar 12 23:48:51.435349 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 12 23:48:51.435372 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 12 23:48:51.454849 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 12 23:48:51.469309 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 12 23:48:51.475953 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 12 23:48:51.483000 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 12 23:48:51.488826 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 12 23:48:51.500933 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 12 23:48:51.506295 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 12 23:48:51.510751 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 12 23:48:51.512931 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Mar 12 23:48:51.518542 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Mar 12 23:48:51.519401 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:51.524219 jq[1858]: false
Mar 12 23:48:51.524778 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 12 23:48:51.535939 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 12 23:48:51.541465 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 12 23:48:51.547870 extend-filesystems[1862]: Found /dev/sda6
Mar 12 23:48:51.548202 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 12 23:48:51.563067 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 12 23:48:51.570794 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 12 23:48:51.575392 KVP[1863]: KVP starting; pid is:1863
Mar 12 23:48:51.578123 extend-filesystems[1862]: Found /dev/sda9
Mar 12 23:48:51.592893 kernel: hv_utils: KVP IC version 4.0
Mar 12 23:48:51.578768 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 12 23:48:51.583933 chronyd[1853]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Mar 12 23:48:51.593154 extend-filesystems[1862]: Checking size of /dev/sda9
Mar 12 23:48:51.579123 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 12 23:48:51.587526 KVP[1863]: KVP LIC Version: 3.1
Mar 12 23:48:51.579555 systemd[1]: Starting update-engine.service - Update Engine...
Mar 12 23:48:51.601459 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 12 23:48:51.608321 jq[1889]: true
Mar 12 23:48:51.610364 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 12 23:48:51.617945 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 12 23:48:51.618097 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 12 23:48:51.619374 systemd[1]: motdgen.service: Deactivated successfully.
Mar 12 23:48:51.619516 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 12 23:48:51.621047 chronyd[1853]: Timezone right/UTC failed leap second check, ignoring
Mar 12 23:48:51.628038 chronyd[1853]: Loaded seccomp filter (level 2)
Mar 12 23:48:51.630175 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 12 23:48:51.636860 systemd[1]: Started chronyd.service - NTP client/server.
Mar 12 23:48:51.645669 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 12 23:48:51.645897 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 12 23:48:51.651118 extend-filesystems[1862]: Old size kept for /dev/sda9
Mar 12 23:48:51.663265 update_engine[1881]: I20260312 23:48:51.661572 1881 main.cc:92] Flatcar Update Engine starting
Mar 12 23:48:51.665058 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 12 23:48:51.668564 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 12 23:48:51.675244 (ntainerd)[1902]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 12 23:48:51.682717 jq[1901]: true
Mar 12 23:48:51.710255 systemd-logind[1877]: New seat seat0.
Mar 12 23:48:51.711954 systemd-logind[1877]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Mar 12 23:48:51.712211 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 12 23:48:51.736003 tar[1899]: linux-arm64/LICENSE
Mar 12 23:48:51.736375 tar[1899]: linux-arm64/helm
Mar 12 23:48:51.787669 dbus-daemon[1856]: [system] SELinux support is enabled
Mar 12 23:48:51.788120 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 12 23:48:51.795611 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 12 23:48:51.795641 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 12 23:48:51.797816 bash[1940]: Updated "/home/core/.ssh/authorized_keys"
Mar 12 23:48:51.802148 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 12 23:48:51.802172 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 12 23:48:51.802548 update_engine[1881]: I20260312 23:48:51.802202 1881 update_check_scheduler.cc:74] Next update check in 3m38s
Mar 12 23:48:51.809979 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 12 23:48:51.817139 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 12 23:48:51.818309 systemd[1]: Started update-engine.service - Update Engine.
Mar 12 23:48:51.820534 dbus-daemon[1856]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 12 23:48:51.835912 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 12 23:48:51.894309 coreos-metadata[1855]: Mar 12 23:48:51.893 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 12 23:48:51.899274 coreos-metadata[1855]: Mar 12 23:48:51.897 INFO Fetch successful
Mar 12 23:48:51.899274 coreos-metadata[1855]: Mar 12 23:48:51.898 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Mar 12 23:48:51.903414 coreos-metadata[1855]: Mar 12 23:48:51.903 INFO Fetch successful
Mar 12 23:48:51.903414 coreos-metadata[1855]: Mar 12 23:48:51.903 INFO Fetching http://168.63.129.16/machine/6f17a268-9320-4804-8ba6-2c5e43d3b786/e11d2718%2Dd67f%2D4913%2D8f34%2Daccb8ab6b79e.%5Fci%2D4459.2.4%2Dn%2D6470b86a4c?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Mar 12 23:48:51.906131 coreos-metadata[1855]: Mar 12 23:48:51.906 INFO Fetch successful
Mar 12 23:48:51.906411 coreos-metadata[1855]: Mar 12 23:48:51.906 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Mar 12 23:48:51.920717 coreos-metadata[1855]: Mar 12 23:48:51.919 INFO Fetch successful
Mar 12 23:48:51.966792 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 12 23:48:51.976724 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 12 23:48:52.041861 sshd_keygen[1891]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 12 23:48:52.068635 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 12 23:48:52.078729 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 12 23:48:52.092150 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Mar 12 23:48:52.103836 systemd[1]: issuegen.service: Deactivated successfully.
Mar 12 23:48:52.107966 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 12 23:48:52.121133 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 12 23:48:52.138050 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 12 23:48:52.146744 locksmithd[1959]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 12 23:48:52.147934 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Mar 12 23:48:52.166044 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 12 23:48:52.173661 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 12 23:48:52.182712 systemd[1]: Reached target getty.target - Login Prompts.
Mar 12 23:48:52.254332 tar[1899]: linux-arm64/README.md
Mar 12 23:48:52.269870 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 12 23:48:52.456876 containerd[1902]: time="2026-03-12T23:48:52Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 12 23:48:52.459508 containerd[1902]: time="2026-03-12T23:48:52.459465296Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.467818224Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.336µs"
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.467850232Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.467865560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468012048Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468022744Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468040048Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468080000Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468086680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468269744Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468279600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468287144Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 12 23:48:52.468823 containerd[1902]: time="2026-03-12T23:48:52.468292080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468338432Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468477960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468496376Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468502800Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468532320Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468684416Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 12 23:48:52.469051 containerd[1902]: time="2026-03-12T23:48:52.468735144Z" level=info msg="metadata content store policy set" policy=shared
Mar 12 23:48:52.488280 containerd[1902]: time="2026-03-12T23:48:52.488193360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 12 23:48:52.488593 containerd[1902]: time="2026-03-12T23:48:52.488567144Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.488774712Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489719024Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489744408Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489753184Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489774368Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489789792Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489805008Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489818048Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489824280Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489833480Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489959616Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489974760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489983776Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 12 23:48:52.488792 containerd[1902]: time="2026-03-12T23:48:52.489992736Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.489999712Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490007032Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490014648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490021576Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490028760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490037368Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490043792Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490092312Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490103000Z" level=info msg="Start snapshots syncer"
Mar 12 23:48:52.496350 containerd[1902]: time="2026-03-12T23:48:52.490129952Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 12 23:48:52.490971 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:52.496704 containerd[1902]: time="2026-03-12T23:48:52.490312272Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 12 23:48:52.496704 containerd[1902]: time="2026-03-12T23:48:52.490354376Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490399728Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490519104Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490533784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490541040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490549656Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490557184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490565496Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490572152Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490590168Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490599048Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490605520Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490634560Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490647744Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 12 23:48:52.496829 containerd[1902]: time="2026-03-12T23:48:52.490653344Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 12 23:48:52.496902 (kubelet)[2043]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490659112Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490664776Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490670352Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490676776Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490690480Z" level=info msg="runtime interface created"
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490693640Z" level=info msg="created NRI interface"
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490698752Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490708168Z" level=info msg="Connect containerd service"
Mar 12 23:48:52.497206 containerd[1902]: time="2026-03-12T23:48:52.490727224Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 12 23:48:52.498047 containerd[1902]: time="2026-03-12T23:48:52.498025528Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 12 23:48:52.805837 kubelet[2043]: E0312 23:48:52.805710 2043 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:52.808002 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:52.808224 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:52.809886 systemd[1]: kubelet.service: Consumed 511ms CPU time, 248.9M memory peak.
Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091452112Z" level=info msg="Start subscribing containerd event" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091525640Z" level=info msg="Start recovering state" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091611848Z" level=info msg="Start event monitor" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091622680Z" level=info msg="Start cni network conf syncer for default" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091630840Z" level=info msg="Start streaming server" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091638016Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091642912Z" level=info msg="runtime interface starting up..." Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091647616Z" level=info msg="starting plugins..." Mar 12 23:48:53.091700 containerd[1902]: time="2026-03-12T23:48:53.091659256Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 12 23:48:53.097351 containerd[1902]: time="2026-03-12T23:48:53.091612672Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 12 23:48:53.097351 containerd[1902]: time="2026-03-12T23:48:53.091775936Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 12 23:48:53.097351 containerd[1902]: time="2026-03-12T23:48:53.091832616Z" level=info msg="containerd successfully booted in 0.635477s" Mar 12 23:48:53.091952 systemd[1]: Started containerd.service - containerd container runtime. Mar 12 23:48:53.099897 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 12 23:48:53.105192 systemd[1]: Startup finished in 1.770s (kernel) + 12.141s (initrd) + 12.245s (userspace) = 26.156s. 
Mar 12 23:48:53.429371 login[2027]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Mar 12 23:48:53.429569 login[2026]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:48:53.440156 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 12 23:48:53.442030 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 12 23:48:53.443650 systemd-logind[1877]: New session 2 of user core. Mar 12 23:48:53.471624 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 12 23:48:53.475410 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 12 23:48:53.486851 (systemd)[2066]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 12 23:48:53.489187 systemd-logind[1877]: New session c1 of user core. Mar 12 23:48:53.617774 systemd[2066]: Queued start job for default target default.target. Mar 12 23:48:53.624566 systemd[2066]: Created slice app.slice - User Application Slice. Mar 12 23:48:53.624591 systemd[2066]: Reached target paths.target - Paths. Mar 12 23:48:53.624621 systemd[2066]: Reached target timers.target - Timers. Mar 12 23:48:53.625637 systemd[2066]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 12 23:48:53.635628 systemd[2066]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 12 23:48:53.635795 systemd[2066]: Reached target sockets.target - Sockets. Mar 12 23:48:53.635866 systemd[2066]: Reached target basic.target - Basic System. Mar 12 23:48:53.635886 systemd[2066]: Reached target default.target - Main User Target. Mar 12 23:48:53.635904 systemd[2066]: Startup finished in 142ms. Mar 12 23:48:53.636270 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 12 23:48:53.644965 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 12 23:48:54.076947 waagent[2024]: 2026-03-12T23:48:54.076873Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Mar 12 23:48:54.081604 waagent[2024]: 2026-03-12T23:48:54.081561Z INFO Daemon Daemon OS: flatcar 4459.2.4 Mar 12 23:48:54.085191 waagent[2024]: 2026-03-12T23:48:54.085157Z INFO Daemon Daemon Python: 3.11.13 Mar 12 23:48:54.088771 waagent[2024]: 2026-03-12T23:48:54.088715Z INFO Daemon Daemon Run daemon Mar 12 23:48:54.091972 waagent[2024]: 2026-03-12T23:48:54.091940Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.4' Mar 12 23:48:54.098829 waagent[2024]: 2026-03-12T23:48:54.098771Z INFO Daemon Daemon Using waagent for provisioning Mar 12 23:48:54.103211 waagent[2024]: 2026-03-12T23:48:54.103175Z INFO Daemon Daemon Activate resource disk Mar 12 23:48:54.107309 waagent[2024]: 2026-03-12T23:48:54.107280Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 12 23:48:54.116041 waagent[2024]: 2026-03-12T23:48:54.116001Z INFO Daemon Daemon Found device: None Mar 12 23:48:54.119855 waagent[2024]: 2026-03-12T23:48:54.119822Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 12 23:48:54.127584 waagent[2024]: 2026-03-12T23:48:54.127554Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 12 23:48:54.136381 waagent[2024]: 2026-03-12T23:48:54.136339Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 12 23:48:54.141033 waagent[2024]: 2026-03-12T23:48:54.140995Z INFO Daemon Daemon Running default provisioning handler Mar 12 23:48:54.149984 waagent[2024]: 2026-03-12T23:48:54.149937Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Mar 12 23:48:54.160720 waagent[2024]: 2026-03-12T23:48:54.160675Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 12 23:48:54.168400 waagent[2024]: 2026-03-12T23:48:54.168365Z INFO Daemon Daemon cloud-init is enabled: False Mar 12 23:48:54.172608 waagent[2024]: 2026-03-12T23:48:54.172580Z INFO Daemon Daemon Copying ovf-env.xml Mar 12 23:48:54.223631 waagent[2024]: 2026-03-12T23:48:54.221032Z INFO Daemon Daemon Successfully mounted dvd Mar 12 23:48:54.261481 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 12 23:48:54.262429 waagent[2024]: 2026-03-12T23:48:54.262363Z INFO Daemon Daemon Detect protocol endpoint Mar 12 23:48:54.266767 waagent[2024]: 2026-03-12T23:48:54.266719Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 12 23:48:54.271203 waagent[2024]: 2026-03-12T23:48:54.271167Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Mar 12 23:48:54.276913 waagent[2024]: 2026-03-12T23:48:54.276885Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 12 23:48:54.281889 waagent[2024]: 2026-03-12T23:48:54.281856Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 12 23:48:54.285944 waagent[2024]: 2026-03-12T23:48:54.285918Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 12 23:48:54.328810 waagent[2024]: 2026-03-12T23:48:54.328712Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 12 23:48:54.334594 waagent[2024]: 2026-03-12T23:48:54.334572Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 12 23:48:54.338918 waagent[2024]: 2026-03-12T23:48:54.338891Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 12 23:48:54.430572 login[2027]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:48:54.438666 systemd-logind[1877]: New session 1 of user core. Mar 12 23:48:54.443954 systemd[1]: Started session-1.scope - Session 1 of User core. 
Mar 12 23:48:54.447027 waagent[2024]: 2026-03-12T23:48:54.446963Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 12 23:48:54.457192 waagent[2024]: 2026-03-12T23:48:54.457130Z INFO Daemon Daemon Forcing an update of the goal state. Mar 12 23:48:54.466821 waagent[2024]: 2026-03-12T23:48:54.464844Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 12 23:48:54.484138 waagent[2024]: 2026-03-12T23:48:54.484092Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 12 23:48:54.488691 waagent[2024]: 2026-03-12T23:48:54.488649Z INFO Daemon Mar 12 23:48:54.491021 waagent[2024]: 2026-03-12T23:48:54.490980Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 340dc136-f9e8-45b3-8076-8b236dc469ef eTag: 2991776517738330964 source: Fabric] Mar 12 23:48:54.499666 waagent[2024]: 2026-03-12T23:48:54.499629Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Mar 12 23:48:54.505053 waagent[2024]: 2026-03-12T23:48:54.505014Z INFO Daemon Mar 12 23:48:54.507242 waagent[2024]: 2026-03-12T23:48:54.507213Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 12 23:48:54.516811 waagent[2024]: 2026-03-12T23:48:54.516772Z INFO Daemon Daemon Downloading artifacts profile blob Mar 12 23:48:54.579880 waagent[2024]: 2026-03-12T23:48:54.579731Z INFO Daemon Downloaded certificate {'thumbprint': '8192294A8FB0EEC8646DC9752794A4A60ABD632B', 'hasPrivateKey': True} Mar 12 23:48:54.587638 waagent[2024]: 2026-03-12T23:48:54.587594Z INFO Daemon Fetch goal state completed Mar 12 23:48:54.598457 waagent[2024]: 2026-03-12T23:48:54.598422Z INFO Daemon Daemon Starting provisioning Mar 12 23:48:54.602787 waagent[2024]: 2026-03-12T23:48:54.602753Z INFO Daemon Daemon Handle ovf-env.xml. 
Mar 12 23:48:54.606959 waagent[2024]: 2026-03-12T23:48:54.606930Z INFO Daemon Daemon Set hostname [ci-4459.2.4-n-6470b86a4c] Mar 12 23:48:54.613963 waagent[2024]: 2026-03-12T23:48:54.613919Z INFO Daemon Daemon Publish hostname [ci-4459.2.4-n-6470b86a4c] Mar 12 23:48:54.619295 waagent[2024]: 2026-03-12T23:48:54.619258Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 12 23:48:54.624153 waagent[2024]: 2026-03-12T23:48:54.624118Z INFO Daemon Daemon Primary interface is [eth0] Mar 12 23:48:54.635280 systemd-networkd[1479]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:48:54.635287 systemd-networkd[1479]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 23:48:54.635320 systemd-networkd[1479]: eth0: DHCP lease lost Mar 12 23:48:54.636592 waagent[2024]: 2026-03-12T23:48:54.636554Z INFO Daemon Daemon Create user account if not exists Mar 12 23:48:54.641588 waagent[2024]: 2026-03-12T23:48:54.641554Z INFO Daemon Daemon User core already exists, skip useradd Mar 12 23:48:54.646203 waagent[2024]: 2026-03-12T23:48:54.646167Z INFO Daemon Daemon Configure sudoer Mar 12 23:48:54.654407 waagent[2024]: 2026-03-12T23:48:54.654364Z INFO Daemon Daemon Configure sshd Mar 12 23:48:54.662178 waagent[2024]: 2026-03-12T23:48:54.662135Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 12 23:48:54.673133 waagent[2024]: 2026-03-12T23:48:54.673088Z INFO Daemon Daemon Deploy ssh public key. 
Mar 12 23:48:54.673876 systemd-networkd[1479]: eth0: DHCPv4 address 10.200.20.40/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 12 23:48:55.772299 waagent[2024]: 2026-03-12T23:48:55.768488Z INFO Daemon Daemon Provisioning complete Mar 12 23:48:55.783970 waagent[2024]: 2026-03-12T23:48:55.783933Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 12 23:48:55.789245 waagent[2024]: 2026-03-12T23:48:55.789205Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Mar 12 23:48:55.796855 waagent[2024]: 2026-03-12T23:48:55.796818Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Mar 12 23:48:55.898841 waagent[2116]: 2026-03-12T23:48:55.897820Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Mar 12 23:48:55.898841 waagent[2116]: 2026-03-12T23:48:55.897950Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.4 Mar 12 23:48:55.898841 waagent[2116]: 2026-03-12T23:48:55.897988Z INFO ExtHandler ExtHandler Python: 3.11.13 Mar 12 23:48:55.898841 waagent[2116]: 2026-03-12T23:48:55.898021Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Mar 12 23:48:55.938768 waagent[2116]: 2026-03-12T23:48:55.938707Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.4; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Mar 12 23:48:55.939116 waagent[2116]: 2026-03-12T23:48:55.939085Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 12 23:48:55.939233 waagent[2116]: 2026-03-12T23:48:55.939210Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 12 23:48:55.945503 waagent[2116]: 2026-03-12T23:48:55.945452Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 12 23:48:55.950994 waagent[2116]: 2026-03-12T23:48:55.950955Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 12 
23:48:55.951487 waagent[2116]: 2026-03-12T23:48:55.951454Z INFO ExtHandler Mar 12 23:48:55.951616 waagent[2116]: 2026-03-12T23:48:55.951592Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 76d0bb89-070f-4147-a027-6c2533d44eb0 eTag: 2991776517738330964 source: Fabric] Mar 12 23:48:55.951981 waagent[2116]: 2026-03-12T23:48:55.951948Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 12 23:48:55.952494 waagent[2116]: 2026-03-12T23:48:55.952461Z INFO ExtHandler Mar 12 23:48:55.952629 waagent[2116]: 2026-03-12T23:48:55.952604Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 12 23:48:55.956500 waagent[2116]: 2026-03-12T23:48:55.956472Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 12 23:48:56.013830 waagent[2116]: 2026-03-12T23:48:56.013436Z INFO ExtHandler Downloaded certificate {'thumbprint': '8192294A8FB0EEC8646DC9752794A4A60ABD632B', 'hasPrivateKey': True} Mar 12 23:48:56.013930 waagent[2116]: 2026-03-12T23:48:56.013863Z INFO ExtHandler Fetch goal state completed Mar 12 23:48:56.028250 waagent[2116]: 2026-03-12T23:48:56.028158Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.4 27 Jan 2026 (Library: OpenSSL 3.4.4 27 Jan 2026) Mar 12 23:48:56.031767 waagent[2116]: 2026-03-12T23:48:56.031711Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2116 Mar 12 23:48:56.031899 waagent[2116]: 2026-03-12T23:48:56.031869Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 12 23:48:56.032159 waagent[2116]: 2026-03-12T23:48:56.032129Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Mar 12 23:48:56.033278 waagent[2116]: 2026-03-12T23:48:56.033239Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.4', '', 'Flatcar Container Linux by Kinvolk'] Mar 12 23:48:56.033604 waagent[2116]: 
2026-03-12T23:48:56.033571Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.4', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 12 23:48:56.033724 waagent[2116]: 2026-03-12T23:48:56.033699Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 12 23:48:56.034192 waagent[2116]: 2026-03-12T23:48:56.034158Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 12 23:48:56.072447 waagent[2116]: 2026-03-12T23:48:56.072407Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 12 23:48:56.072641 waagent[2116]: 2026-03-12T23:48:56.072610Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 12 23:48:56.077229 waagent[2116]: 2026-03-12T23:48:56.077190Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 12 23:48:56.081708 systemd[1]: Reload requested from client PID 2131 ('systemctl') (unit waagent.service)... Mar 12 23:48:56.081722 systemd[1]: Reloading... Mar 12 23:48:56.154828 zram_generator::config[2176]: No configuration found. Mar 12 23:48:56.297299 systemd[1]: Reloading finished in 215 ms. Mar 12 23:48:56.322830 waagent[2116]: 2026-03-12T23:48:56.321544Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 12 23:48:56.322830 waagent[2116]: 2026-03-12T23:48:56.321685Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 12 23:48:56.568217 waagent[2116]: 2026-03-12T23:48:56.568084Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 12 23:48:56.568428 waagent[2116]: 2026-03-12T23:48:56.568392Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. 
configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 12 23:48:56.569092 waagent[2116]: 2026-03-12T23:48:56.569043Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 12 23:48:56.569410 waagent[2116]: 2026-03-12T23:48:56.569334Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 12 23:48:56.570245 waagent[2116]: 2026-03-12T23:48:56.569580Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 12 23:48:56.570245 waagent[2116]: 2026-03-12T23:48:56.569653Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 12 23:48:56.570245 waagent[2116]: 2026-03-12T23:48:56.569845Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 12 23:48:56.570245 waagent[2116]: 2026-03-12T23:48:56.570001Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 12 23:48:56.570245 waagent[2116]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 12 23:48:56.570245 waagent[2116]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 12 23:48:56.570245 waagent[2116]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 12 23:48:56.570245 waagent[2116]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 12 23:48:56.570245 waagent[2116]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 12 23:48:56.570245 waagent[2116]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 12 23:48:56.570541 waagent[2116]: 2026-03-12T23:48:56.570499Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 12 23:48:56.570601 waagent[2116]: 2026-03-12T23:48:56.570554Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Mar 12 23:48:56.570956 waagent[2116]: 2026-03-12T23:48:56.570918Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 12 23:48:56.570998 waagent[2116]: 2026-03-12T23:48:56.570968Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 12 23:48:56.571081 waagent[2116]: 2026-03-12T23:48:56.571056Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 12 23:48:56.571481 waagent[2116]: 2026-03-12T23:48:56.571453Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 12 23:48:56.571622 waagent[2116]: 2026-03-12T23:48:56.571588Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 12 23:48:56.571761 waagent[2116]: 2026-03-12T23:48:56.571724Z INFO EnvHandler ExtHandler Configure routes Mar 12 23:48:56.571808 waagent[2116]: 2026-03-12T23:48:56.571784Z INFO EnvHandler ExtHandler Gateway:None Mar 12 23:48:56.571853 waagent[2116]: 2026-03-12T23:48:56.571836Z INFO EnvHandler ExtHandler Routes:None Mar 12 23:48:56.577407 waagent[2116]: 2026-03-12T23:48:56.577360Z INFO ExtHandler ExtHandler Mar 12 23:48:56.577464 waagent[2116]: 2026-03-12T23:48:56.577431Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: cc19f630-1eeb-4aab-8a60-97c4baf30086 correlation e91705fe-6224-4ca1-9794-146af0913396 created: 2026-03-12T23:47:54.889908Z] Mar 12 23:48:56.577721 waagent[2116]: 2026-03-12T23:48:56.577688Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 12 23:48:56.578145 waagent[2116]: 2026-03-12T23:48:56.578117Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Mar 12 23:48:56.603388 waagent[2116]: 2026-03-12T23:48:56.603324Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Mar 12 23:48:56.603388 waagent[2116]: Try `iptables -h' or 'iptables --help' for more information.) Mar 12 23:48:56.603748 waagent[2116]: 2026-03-12T23:48:56.603711Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: EBA53730-07FE-4EE0-9F4B-1C3517E1AD79;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Mar 12 23:48:56.617141 waagent[2116]: 2026-03-12T23:48:56.616764Z INFO MonitorHandler ExtHandler Network interfaces: Mar 12 23:48:56.617141 waagent[2116]: Executing ['ip', '-a', '-o', 'link']: Mar 12 23:48:56.617141 waagent[2116]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 12 23:48:56.617141 waagent[2116]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:88:f9:08 brd ff:ff:ff:ff:ff:ff Mar 12 23:48:56.617141 waagent[2116]: 3: enP52934s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:88:f9:08 brd ff:ff:ff:ff:ff:ff\ altname enP52934p0s2 Mar 12 23:48:56.617141 waagent[2116]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 12 23:48:56.617141 waagent[2116]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 12 23:48:56.617141 waagent[2116]: 2: eth0 inet 10.200.20.40/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 12 23:48:56.617141 waagent[2116]: Executing ['ip', '-6', '-a', 
'-o', 'address']: Mar 12 23:48:56.617141 waagent[2116]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 12 23:48:56.617141 waagent[2116]: 2: eth0 inet6 fe80::7eed:8dff:fe88:f908/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 12 23:48:56.800068 waagent[2116]: 2026-03-12T23:48:56.800011Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 12 23:48:56.800068 waagent[2116]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.800068 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.800068 waagent[2116]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.800068 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.800068 waagent[2116]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.800068 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.800068 waagent[2116]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 12 23:48:56.800068 waagent[2116]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 12 23:48:56.800068 waagent[2116]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 12 23:48:56.802833 waagent[2116]: 2026-03-12T23:48:56.802725Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 12 23:48:56.802833 waagent[2116]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.802833 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.802833 waagent[2116]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.802833 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.802833 waagent[2116]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 12 23:48:56.802833 waagent[2116]: pkts bytes target prot opt in out source destination Mar 12 23:48:56.802833 waagent[2116]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp 
dpt:53 Mar 12 23:48:56.802833 waagent[2116]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 12 23:48:56.802833 waagent[2116]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 12 23:48:56.803217 waagent[2116]: 2026-03-12T23:48:56.803192Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 12 23:49:03.058997 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 12 23:49:03.060643 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:49:03.162016 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:49:03.165256 (kubelet)[2265]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:49:03.286648 kubelet[2265]: E0312 23:49:03.286572 2265 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:49:03.288946 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:49:03.289062 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:49:03.289354 systemd[1]: kubelet.service: Consumed 107ms CPU time, 107.2M memory peak. Mar 12 23:49:10.994973 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 12 23:49:10.995861 systemd[1]: Started sshd@0-10.200.20.40:22-10.200.16.10:35860.service - OpenSSH per-connection server daemon (10.200.16.10:35860). 
Mar 12 23:49:11.551882 sshd[2272]: Accepted publickey for core from 10.200.16.10 port 35860 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY Mar 12 23:49:11.552958 sshd-session[2272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:49:11.556866 systemd-logind[1877]: New session 3 of user core. Mar 12 23:49:11.565934 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 12 23:49:11.855460 systemd[1]: Started sshd@1-10.200.20.40:22-10.200.16.10:35872.service - OpenSSH per-connection server daemon (10.200.16.10:35872). Mar 12 23:49:12.279173 sshd[2278]: Accepted publickey for core from 10.200.16.10 port 35872 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY Mar 12 23:49:12.280220 sshd-session[2278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:49:12.283831 systemd-logind[1877]: New session 4 of user core. Mar 12 23:49:12.295113 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 12 23:49:12.514279 sshd[2281]: Connection closed by 10.200.16.10 port 35872 Mar 12 23:49:12.514871 sshd-session[2278]: pam_unix(sshd:session): session closed for user core Mar 12 23:49:12.517679 systemd-logind[1877]: Session 4 logged out. Waiting for processes to exit. Mar 12 23:49:12.517848 systemd[1]: sshd@1-10.200.20.40:22-10.200.16.10:35872.service: Deactivated successfully. Mar 12 23:49:12.519296 systemd[1]: session-4.scope: Deactivated successfully. Mar 12 23:49:12.522222 systemd-logind[1877]: Removed session 4. Mar 12 23:49:12.613351 systemd[1]: Started sshd@2-10.200.20.40:22-10.200.16.10:35880.service - OpenSSH per-connection server daemon (10.200.16.10:35880). 
Mar 12 23:49:13.036709 sshd[2287]: Accepted publickey for core from 10.200.16.10 port 35880 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY Mar 12 23:49:13.037717 sshd-session[2287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:49:13.041105 systemd-logind[1877]: New session 5 of user core. Mar 12 23:49:13.049095 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 12 23:49:13.268584 sshd[2290]: Connection closed by 10.200.16.10 port 35880 Mar 12 23:49:13.268488 sshd-session[2287]: pam_unix(sshd:session): session closed for user core Mar 12 23:49:13.271575 systemd[1]: sshd@2-10.200.20.40:22-10.200.16.10:35880.service: Deactivated successfully. Mar 12 23:49:13.273098 systemd[1]: session-5.scope: Deactivated successfully. Mar 12 23:49:13.273838 systemd-logind[1877]: Session 5 logged out. Waiting for processes to exit. Mar 12 23:49:13.275036 systemd-logind[1877]: Removed session 5. Mar 12 23:49:13.313377 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 12 23:49:13.314811 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:49:13.363016 systemd[1]: Started sshd@3-10.200.20.40:22-10.200.16.10:35884.service - OpenSSH per-connection server daemon (10.200.16.10:35884). Mar 12 23:49:13.416822 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:49:13.421066 (kubelet)[2307]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:49:13.537729 kubelet[2307]: E0312 23:49:13.537679 2307 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:49:13.539882 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:49:13.539996 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:49:13.541893 systemd[1]: kubelet.service: Consumed 104ms CPU time, 107.2M memory peak. Mar 12 23:49:13.781411 sshd[2299]: Accepted publickey for core from 10.200.16.10 port 35884 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY Mar 12 23:49:13.782391 sshd-session[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:49:13.786107 systemd-logind[1877]: New session 6 of user core. Mar 12 23:49:13.795935 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 12 23:49:14.015994 sshd[2314]: Connection closed by 10.200.16.10 port 35884 Mar 12 23:49:14.015351 sshd-session[2299]: pam_unix(sshd:session): session closed for user core Mar 12 23:49:14.018341 systemd[1]: sshd@3-10.200.20.40:22-10.200.16.10:35884.service: Deactivated successfully. Mar 12 23:49:14.019636 systemd[1]: session-6.scope: Deactivated successfully. Mar 12 23:49:14.021565 systemd-logind[1877]: Session 6 logged out. Waiting for processes to exit. Mar 12 23:49:14.022532 systemd-logind[1877]: Removed session 6. Mar 12 23:49:14.110016 systemd[1]: Started sshd@4-10.200.20.40:22-10.200.16.10:35886.service - OpenSSH per-connection server daemon (10.200.16.10:35886). 
Mar 12 23:49:14.528902 sshd[2320]: Accepted publickey for core from 10.200.16.10 port 35886 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:49:14.529623 sshd-session[2320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:49:14.533225 systemd-logind[1877]: New session 7 of user core.
Mar 12 23:49:14.547929 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 12 23:49:14.820778 sudo[2324]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 12 23:49:14.821015 sudo[2324]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:49:14.847102 sudo[2324]: pam_unix(sudo:session): session closed for user root
Mar 12 23:49:14.925919 sshd[2323]: Connection closed by 10.200.16.10 port 35886
Mar 12 23:49:14.925200 sshd-session[2320]: pam_unix(sshd:session): session closed for user core
Mar 12 23:49:14.928234 systemd-logind[1877]: Session 7 logged out. Waiting for processes to exit.
Mar 12 23:49:14.928410 systemd[1]: sshd@4-10.200.20.40:22-10.200.16.10:35886.service: Deactivated successfully.
Mar 12 23:49:14.929982 systemd[1]: session-7.scope: Deactivated successfully.
Mar 12 23:49:14.933502 systemd-logind[1877]: Removed session 7.
Mar 12 23:49:15.006569 systemd[1]: Started sshd@5-10.200.20.40:22-10.200.16.10:35890.service - OpenSSH per-connection server daemon (10.200.16.10:35890).
Mar 12 23:49:15.407843 sshd[2330]: Accepted publickey for core from 10.200.16.10 port 35890 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:49:15.408581 sshd-session[2330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:49:15.411967 systemd-logind[1877]: New session 8 of user core.
Mar 12 23:49:15.422106 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 12 23:49:15.445091 chronyd[1853]: Selected source PHC0
Mar 12 23:49:15.556627 sudo[2335]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 12 23:49:15.557335 sudo[2335]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:49:15.565602 sudo[2335]: pam_unix(sudo:session): session closed for user root
Mar 12 23:49:15.569552 sudo[2334]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 12 23:49:15.569760 sudo[2334]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:49:15.577082 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 12 23:49:15.603112 augenrules[2357]: No rules
Mar 12 23:49:15.604267 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 23:49:15.605859 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 12 23:49:15.607197 sudo[2334]: pam_unix(sudo:session): session closed for user root
Mar 12 23:49:15.679925 sshd[2333]: Connection closed by 10.200.16.10 port 35890
Mar 12 23:49:15.680608 sshd-session[2330]: pam_unix(sshd:session): session closed for user core
Mar 12 23:49:15.684188 systemd[1]: sshd@5-10.200.20.40:22-10.200.16.10:35890.service: Deactivated successfully.
Mar 12 23:49:15.685719 systemd[1]: session-8.scope: Deactivated successfully.
Mar 12 23:49:15.686443 systemd-logind[1877]: Session 8 logged out. Waiting for processes to exit.
Mar 12 23:49:15.688182 systemd-logind[1877]: Removed session 8.
Mar 12 23:49:15.778546 systemd[1]: Started sshd@6-10.200.20.40:22-10.200.16.10:35902.service - OpenSSH per-connection server daemon (10.200.16.10:35902).
Mar 12 23:49:16.192561 sshd[2366]: Accepted publickey for core from 10.200.16.10 port 35902 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:49:16.193559 sshd-session[2366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:49:16.196906 systemd-logind[1877]: New session 9 of user core.
Mar 12 23:49:16.204035 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 12 23:49:16.349305 sudo[2370]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 12 23:49:16.349516 sudo[2370]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:49:17.719147 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 12 23:49:17.729053 (dockerd)[2387]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 12 23:49:19.275645 dockerd[2387]: time="2026-03-12T23:49:19.275589601Z" level=info msg="Starting up"
Mar 12 23:49:19.276295 dockerd[2387]: time="2026-03-12T23:49:19.276273313Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 12 23:49:19.284658 dockerd[2387]: time="2026-03-12T23:49:19.284629553Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 12 23:49:19.347728 dockerd[2387]: time="2026-03-12T23:49:19.347691401Z" level=info msg="Loading containers: start."
Mar 12 23:49:19.386849 kernel: Initializing XFRM netlink socket
Mar 12 23:49:19.691541 systemd-networkd[1479]: docker0: Link UP
Mar 12 23:49:19.707390 dockerd[2387]: time="2026-03-12T23:49:19.707341441Z" level=info msg="Loading containers: done."
Mar 12 23:49:19.728091 dockerd[2387]: time="2026-03-12T23:49:19.727766241Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 12 23:49:19.728091 dockerd[2387]: time="2026-03-12T23:49:19.727869105Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 12 23:49:19.728091 dockerd[2387]: time="2026-03-12T23:49:19.727953065Z" level=info msg="Initializing buildkit"
Mar 12 23:49:19.775926 dockerd[2387]: time="2026-03-12T23:49:19.775892185Z" level=info msg="Completed buildkit initialization"
Mar 12 23:49:19.781423 dockerd[2387]: time="2026-03-12T23:49:19.781382361Z" level=info msg="Daemon has completed initialization"
Mar 12 23:49:19.781598 dockerd[2387]: time="2026-03-12T23:49:19.781499273Z" level=info msg="API listen on /run/docker.sock"
Mar 12 23:49:19.781672 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 12 23:49:20.158677 containerd[1902]: time="2026-03-12T23:49:20.158386769Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 12 23:49:21.211221 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1871905838.mount: Deactivated successfully.
Mar 12 23:49:22.339608 containerd[1902]: time="2026-03-12T23:49:22.339547977Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:22.348789 containerd[1902]: time="2026-03-12T23:49:22.348754537Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583252"
Mar 12 23:49:22.351964 containerd[1902]: time="2026-03-12T23:49:22.351938817Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:22.357427 containerd[1902]: time="2026-03-12T23:49:22.356825889Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:22.357427 containerd[1902]: time="2026-03-12T23:49:22.357230265Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 2.198806648s"
Mar 12 23:49:22.357427 containerd[1902]: time="2026-03-12T23:49:22.357260033Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\""
Mar 12 23:49:22.357988 containerd[1902]: time="2026-03-12T23:49:22.357959473Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 12 23:49:23.563278 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 12 23:49:23.566075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:23.675409 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:23.683059 (kubelet)[2664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:49:23.775175 kubelet[2664]: E0312 23:49:23.775104 2664 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:49:23.777218 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:49:23.777435 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:49:23.779873 systemd[1]: kubelet.service: Consumed 108ms CPU time, 107.1M memory peak.
Mar 12 23:49:24.145836 containerd[1902]: time="2026-03-12T23:49:24.145193157Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:24.148689 containerd[1902]: time="2026-03-12T23:49:24.148665394Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139641"
Mar 12 23:49:24.152010 containerd[1902]: time="2026-03-12T23:49:24.151988130Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:24.156163 containerd[1902]: time="2026-03-12T23:49:24.156137906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:24.156602 containerd[1902]: time="2026-03-12T23:49:24.156572302Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 1.798585139s"
Mar 12 23:49:24.156602 containerd[1902]: time="2026-03-12T23:49:24.156602192Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\""
Mar 12 23:49:24.157283 containerd[1902]: time="2026-03-12T23:49:24.157016469Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 12 23:49:25.301639 containerd[1902]: time="2026-03-12T23:49:25.301573092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:25.328689 containerd[1902]: time="2026-03-12T23:49:25.327283217Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195544"
Mar 12 23:49:25.731327 containerd[1902]: time="2026-03-12T23:49:25.731280539Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:25.737544 containerd[1902]: time="2026-03-12T23:49:25.737141113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:25.738030 containerd[1902]: time="2026-03-12T23:49:25.738005029Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 1.580960912s"
Mar 12 23:49:25.738125 containerd[1902]: time="2026-03-12T23:49:25.738111401Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\""
Mar 12 23:49:25.738625 containerd[1902]: time="2026-03-12T23:49:25.738599717Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 12 23:49:26.747300 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1047885981.mount: Deactivated successfully.
Mar 12 23:49:26.969656 containerd[1902]: time="2026-03-12T23:49:26.969596433Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:26.972444 containerd[1902]: time="2026-03-12T23:49:26.972416356Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697088"
Mar 12 23:49:26.975794 containerd[1902]: time="2026-03-12T23:49:26.975691529Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:26.979674 containerd[1902]: time="2026-03-12T23:49:26.979651506Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:26.980008 containerd[1902]: time="2026-03-12T23:49:26.979935789Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 1.241307815s"
Mar 12 23:49:26.980008 containerd[1902]: time="2026-03-12T23:49:26.979963703Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\""
Mar 12 23:49:26.980478 containerd[1902]: time="2026-03-12T23:49:26.980447114Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 12 23:49:27.640467 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount285369291.mount: Deactivated successfully.
Mar 12 23:49:28.617089 containerd[1902]: time="2026-03-12T23:49:28.617029290Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:28.620286 containerd[1902]: time="2026-03-12T23:49:28.620079590Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395406"
Mar 12 23:49:28.623486 containerd[1902]: time="2026-03-12T23:49:28.623460487Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:28.627675 containerd[1902]: time="2026-03-12T23:49:28.627642065Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:28.628335 containerd[1902]: time="2026-03-12T23:49:28.628307020Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.647830217s"
Mar 12 23:49:28.628428 containerd[1902]: time="2026-03-12T23:49:28.628414921Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Mar 12 23:49:28.629041 containerd[1902]: time="2026-03-12T23:49:28.629016209Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 12 23:49:29.206502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4097198100.mount: Deactivated successfully.
Mar 12 23:49:29.226678 containerd[1902]: time="2026-03-12T23:49:29.226199375Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:29.229943 containerd[1902]: time="2026-03-12T23:49:29.229916462Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
Mar 12 23:49:29.233886 containerd[1902]: time="2026-03-12T23:49:29.233861526Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:29.238837 containerd[1902]: time="2026-03-12T23:49:29.238775086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:29.239241 containerd[1902]: time="2026-03-12T23:49:29.239209776Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 610.163566ms"
Mar 12 23:49:29.239241 containerd[1902]: time="2026-03-12T23:49:29.239240057Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 12 23:49:29.239781 containerd[1902]: time="2026-03-12T23:49:29.239686323Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 12 23:49:29.864702 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3298092392.mount: Deactivated successfully.
Mar 12 23:49:31.095844 containerd[1902]: time="2026-03-12T23:49:31.095304353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.098356 containerd[1902]: time="2026-03-12T23:49:31.098303491Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125515"
Mar 12 23:49:31.101795 containerd[1902]: time="2026-03-12T23:49:31.101749631Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.106031 containerd[1902]: time="2026-03-12T23:49:31.105983283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.106669 containerd[1902]: time="2026-03-12T23:49:31.106506392Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.866794132s"
Mar 12 23:49:31.106669 containerd[1902]: time="2026-03-12T23:49:31.106535786Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
Mar 12 23:49:33.813304 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 12 23:49:33.816967 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:33.853839 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Mar 12 23:49:33.926382 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:33.930138 (kubelet)[2828]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:49:33.958610 kubelet[2828]: E0312 23:49:33.958574 2828 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:49:33.961097 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:49:33.961198 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:49:33.961442 systemd[1]: kubelet.service: Consumed 101ms CPU time, 106.5M memory peak.
Mar 12 23:49:34.210428 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:34.210751 systemd[1]: kubelet.service: Consumed 101ms CPU time, 106.5M memory peak.
Mar 12 23:49:34.212696 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:34.233869 systemd[1]: Reload requested from client PID 2843 ('systemctl') (unit session-9.scope)...
Mar 12 23:49:34.233992 systemd[1]: Reloading...
Mar 12 23:49:34.320982 zram_generator::config[2893]: No configuration found.
Mar 12 23:49:34.480266 systemd[1]: Reloading finished in 245 ms.
Mar 12 23:49:34.534592 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 12 23:49:34.534785 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 12 23:49:34.535108 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:34.535876 systemd[1]: kubelet.service: Consumed 74ms CPU time, 95M memory peak.
Mar 12 23:49:34.539117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:34.959553 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:34.966097 (kubelet)[2957]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:49:34.998757 kubelet[2957]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 23:49:34.998757 kubelet[2957]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:49:34.999395 kubelet[2957]: I0312 23:49:34.999348 2957 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 23:49:35.905115 kubelet[2957]: I0312 23:49:35.905070 2957 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 12 23:49:35.905115 kubelet[2957]: I0312 23:49:35.905104 2957 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 23:49:35.905115 kubelet[2957]: I0312 23:49:35.905128 2957 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 12 23:49:35.905356 kubelet[2957]: I0312 23:49:35.905134 2957 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 23:49:35.905356 kubelet[2957]: I0312 23:49:35.905310 2957 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 23:49:35.932825 kubelet[2957]: E0312 23:49:35.932654 2957 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.40:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 23:49:35.932825 kubelet[2957]: I0312 23:49:35.932695 2957 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:49:35.938224 kubelet[2957]: I0312 23:49:35.938146 2957 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:49:35.941049 kubelet[2957]: I0312 23:49:35.941031 2957 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 12 23:49:35.941220 kubelet[2957]: I0312 23:49:35.941196 2957 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:49:35.941323 kubelet[2957]: I0312 23:49:35.941212 2957 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.4-n-6470b86a4c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:49:35.941323 kubelet[2957]: I0312 23:49:35.941322 2957 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 23:49:35.941426 kubelet[2957]: I0312 23:49:35.941328 2957 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 23:49:35.941426 kubelet[2957]: I0312 23:49:35.941412 2957 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 12 23:49:35.987968 kubelet[2957]: I0312 23:49:35.987934 2957 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:35.989356 kubelet[2957]: I0312 23:49:35.989336 2957 kubelet.go:475] "Attempting to sync node with API server"
Mar 12 23:49:35.989408 kubelet[2957]: I0312 23:49:35.989366 2957 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:49:35.989988 kubelet[2957]: I0312 23:49:35.989970 2957 kubelet.go:387] "Adding apiserver pod source"
Mar 12 23:49:35.990069 kubelet[2957]: I0312 23:49:35.989994 2957 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:49:35.991121 kubelet[2957]: E0312 23:49:35.991096 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.4-n-6470b86a4c&limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 23:49:35.991880 kubelet[2957]: E0312 23:49:35.991818 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 23:49:35.991940 kubelet[2957]: I0312 23:49:35.991891 2957 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:49:35.992303 kubelet[2957]: I0312 23:49:35.992274 2957 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:49:35.992303 kubelet[2957]: I0312 23:49:35.992297 2957 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 12 23:49:35.992361 kubelet[2957]: W0312 23:49:35.992324 2957 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 23:49:35.995991 kubelet[2957]: I0312 23:49:35.995911 2957 server.go:1262] "Started kubelet"
Mar 12 23:49:35.996724 kubelet[2957]: I0312 23:49:35.996633 2957 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:49:35.996724 kubelet[2957]: I0312 23:49:35.996689 2957 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 12 23:49:35.997343 kubelet[2957]: I0312 23:49:35.997166 2957 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 23:49:35.997565 kubelet[2957]: I0312 23:49:35.997459 2957 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:49:35.998598 kubelet[2957]: I0312 23:49:35.998567 2957 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:49:36.000168 kubelet[2957]: I0312 23:49:36.000092 2957 server.go:310] "Adding debug handlers to kubelet server"
Mar 12 23:49:36.002591 kubelet[2957]: I0312 23:49:36.002391 2957 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:49:36.003988 kubelet[2957]: I0312 23:49:36.003974 2957 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 12 23:49:36.004807 kubelet[2957]: E0312 23:49:36.004204 2957 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.4-n-6470b86a4c\" not found"
Mar 12 23:49:36.005619 kubelet[2957]: E0312 23:49:36.003829 2957 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.40:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.40:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.4-n-6470b86a4c.189c3cf4f157e14d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.4-n-6470b86a4c,UID:ci-4459.2.4-n-6470b86a4c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.4-n-6470b86a4c,},FirstTimestamp:2026-03-12 23:49:35.995887949 +0000 UTC m=+1.027377622,LastTimestamp:2026-03-12 23:49:35.995887949 +0000 UTC m=+1.027377622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.4-n-6470b86a4c,}"
Mar 12 23:49:36.005619 kubelet[2957]: I0312 23:49:36.005522 2957 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:49:36.006057 kubelet[2957]: I0312 23:49:36.005669 2957 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:49:36.006280 kubelet[2957]: I0312 23:49:36.005541 2957 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 23:49:36.006423 kubelet[2957]: I0312 23:49:36.005573 2957 reconciler.go:29] "Reconciler: start to sync state"
Mar 12 23:49:36.006964 kubelet[2957]: E0312 23:49:36.006848 2957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.4-n-6470b86a4c?timeout=10s\": dial tcp 10.200.20.40:6443: connect: connection refused" interval="200ms"
Mar 12 23:49:36.007118 kubelet[2957]: E0312 23:49:36.006905 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 23:49:36.007496 kubelet[2957]: E0312 23:49:36.007465 2957 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:49:36.007761 kubelet[2957]: I0312 23:49:36.007744 2957 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:49:36.019041 kubelet[2957]: I0312 23:49:36.018891 2957 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 23:49:36.019041 kubelet[2957]: I0312 23:49:36.018904 2957 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 23:49:36.019041 kubelet[2957]: I0312 23:49:36.018916 2957 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:36.036094 kubelet[2957]: I0312 23:49:36.036074 2957 policy_none.go:49] "None policy: Start"
Mar 12 23:49:36.036190 kubelet[2957]: I0312 23:49:36.036182 2957 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 12 23:49:36.036237 kubelet[2957]: I0312 23:49:36.036228 2957 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 12 23:49:36.046422 kubelet[2957]: I0312 23:49:36.046406 2957 policy_none.go:47] "Start"
Mar 12 23:49:36.047553 kubelet[2957]: I0312 23:49:36.047525 2957 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:49:36.050444 kubelet[2957]: I0312 23:49:36.050395 2957 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:49:36.050444 kubelet[2957]: I0312 23:49:36.050409 2957 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 12 23:49:36.050444 kubelet[2957]: I0312 23:49:36.050426 2957 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 12 23:49:36.050991 kubelet[2957]: E0312 23:49:36.050660 2957 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:49:36.051144 kubelet[2957]: E0312 23:49:36.051128 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 23:49:36.053485 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 12 23:49:36.063326 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 12 23:49:36.066323 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 12 23:49:36.077749 kubelet[2957]: E0312 23:49:36.077678 2957 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 23:49:36.079009 kubelet[2957]: I0312 23:49:36.078996 2957 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 23:49:36.079615 kubelet[2957]: I0312 23:49:36.079266 2957 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 23:49:36.079615 kubelet[2957]: I0312 23:49:36.079525 2957 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 23:49:36.080748 kubelet[2957]: E0312 23:49:36.080720 2957 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 23:49:36.080878 kubelet[2957]: E0312 23:49:36.080865 2957 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.4-n-6470b86a4c\" not found" Mar 12 23:49:36.181044 kubelet[2957]: I0312 23:49:36.180872 2957 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.181512 kubelet[2957]: E0312 23:49:36.181488 2957 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.40:6443/api/v1/nodes\": dial tcp 10.200.20.40:6443: connect: connection refused" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.206820 kubelet[2957]: I0312 23:49:36.206768 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-ca-certs\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.206967 kubelet[2957]: I0312 23:49:36.206793 2957 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-k8s-certs\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.207066 kubelet[2957]: I0312 23:49:36.207054 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.207480 kubelet[2957]: E0312 23:49:36.207455 2957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.4-n-6470b86a4c?timeout=10s\": dial tcp 10.200.20.40:6443: connect: connection refused" interval="400ms" Mar 12 23:49:36.383642 kubelet[2957]: I0312 23:49:36.383560 2957 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.384019 kubelet[2957]: E0312 23:49:36.383994 2957 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.40:6443/api/v1/nodes\": dial tcp 10.200.20.40:6443: connect: connection refused" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.598862 systemd[1]: Created slice kubepods-burstable-pod3baa11c9def2f2592185abad0ba92f0d.slice - libcontainer container kubepods-burstable-pod3baa11c9def2f2592185abad0ba92f0d.slice. 
Mar 12 23:49:36.604492 kubelet[2957]: E0312 23:49:36.604395 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.607913 kubelet[2957]: E0312 23:49:36.607881 2957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.4-n-6470b86a4c?timeout=10s\": dial tcp 10.200.20.40:6443: connect: connection refused" interval="800ms" Mar 12 23:49:36.608699 systemd[1]: Created slice kubepods-burstable-pode6cbb2457522a4b95b9476867dd26e57.slice - libcontainer container kubepods-burstable-pode6cbb2457522a4b95b9476867dd26e57.slice. Mar 12 23:49:36.610116 kubelet[2957]: I0312 23:49:36.609636 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.610116 kubelet[2957]: I0312 23:49:36.609663 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.610116 kubelet[2957]: I0312 23:49:36.609675 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-kubeconfig\") pod 
\"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.610116 kubelet[2957]: I0312 23:49:36.609686 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1958f6dbb48eb1e067f5682e2c22a80e-kubeconfig\") pod \"kube-scheduler-ci-4459.2.4-n-6470b86a4c\" (UID: \"1958f6dbb48eb1e067f5682e2c22a80e\") " pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.610116 kubelet[2957]: I0312 23:49:36.609697 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-ca-certs\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.610274 kubelet[2957]: I0312 23:49:36.609707 2957 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.611633 kubelet[2957]: E0312 23:49:36.611416 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.612414 containerd[1902]: time="2026-03-12T23:49:36.612376374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.4-n-6470b86a4c,Uid:3baa11c9def2f2592185abad0ba92f0d,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:36.618741 systemd[1]: Created slice 
kubepods-burstable-pod1958f6dbb48eb1e067f5682e2c22a80e.slice - libcontainer container kubepods-burstable-pod1958f6dbb48eb1e067f5682e2c22a80e.slice. Mar 12 23:49:36.620840 kubelet[2957]: E0312 23:49:36.620816 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.785685 kubelet[2957]: I0312 23:49:36.785648 2957 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.785982 kubelet[2957]: E0312 23:49:36.785957 2957 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.40:6443/api/v1/nodes\": dial tcp 10.200.20.40:6443: connect: connection refused" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:36.917652 containerd[1902]: time="2026-03-12T23:49:36.917305082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.4-n-6470b86a4c,Uid:e6cbb2457522a4b95b9476867dd26e57,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:36.926674 containerd[1902]: time="2026-03-12T23:49:36.926638733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.4-n-6470b86a4c,Uid:1958f6dbb48eb1e067f5682e2c22a80e,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:36.948906 update_engine[1881]: I20260312 23:49:36.948848 1881 update_attempter.cc:509] Updating boot flags... 
Mar 12 23:49:36.976977 kubelet[2957]: E0312 23:49:36.976070 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.4-n-6470b86a4c&limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 12 23:49:36.995796 kubelet[2957]: E0312 23:49:36.995770 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 12 23:49:37.114476 kubelet[2957]: E0312 23:49:37.114435 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 12 23:49:37.324139 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1945228019.mount: Deactivated successfully. 
Mar 12 23:49:37.345560 containerd[1902]: time="2026-03-12T23:49:37.345096737Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 23:49:37.356492 containerd[1902]: time="2026-03-12T23:49:37.356458441Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Mar 12 23:49:37.362867 containerd[1902]: time="2026-03-12T23:49:37.362829935Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 23:49:37.369315 containerd[1902]: time="2026-03-12T23:49:37.369280639Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 23:49:37.371981 containerd[1902]: time="2026-03-12T23:49:37.371959632Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 12 23:49:37.376723 containerd[1902]: time="2026-03-12T23:49:37.376686152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 23:49:37.377199 containerd[1902]: time="2026-03-12T23:49:37.377172117Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 761.023976ms" Mar 12 23:49:37.379822 containerd[1902]: 
time="2026-03-12T23:49:37.379597203Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 23:49:37.382561 containerd[1902]: time="2026-03-12T23:49:37.382541448Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 12 23:49:37.383040 kubelet[2957]: E0312 23:49:37.383006 2957 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.40:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 12 23:49:37.390769 containerd[1902]: time="2026-03-12T23:49:37.390534586Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 469.700858ms" Mar 12 23:49:37.408462 kubelet[2957]: E0312 23:49:37.408424 2957 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.4-n-6470b86a4c?timeout=10s\": dial tcp 10.200.20.40:6443: connect: connection refused" interval="1.6s" Mar 12 23:49:37.412333 containerd[1902]: time="2026-03-12T23:49:37.412129355Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 478.806692ms" Mar 12 
23:49:37.441670 containerd[1902]: time="2026-03-12T23:49:37.441632898Z" level=info msg="connecting to shim 23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3" address="unix:///run/containerd/s/23b4b203c02444d9b9efa2f8894a078df55372086837f123f14455dcf8d4d2d3" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:37.444173 containerd[1902]: time="2026-03-12T23:49:37.444144756Z" level=info msg="connecting to shim 88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e" address="unix:///run/containerd/s/65e79421b25642cb416e953886c89ddbc60c14d8eaff21c4b6e61c9c37a67982" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:37.462000 systemd[1]: Started cri-containerd-23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3.scope - libcontainer container 23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3. Mar 12 23:49:37.468706 containerd[1902]: time="2026-03-12T23:49:37.468670945Z" level=info msg="connecting to shim 7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7" address="unix:///run/containerd/s/daa087742098d8c37aee5fc6fa26f3c8c6cdeccfa4c27350cee0241640fc9230" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:37.477942 systemd[1]: Started cri-containerd-88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e.scope - libcontainer container 88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e. Mar 12 23:49:37.495953 systemd[1]: Started cri-containerd-7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7.scope - libcontainer container 7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7. 
Mar 12 23:49:37.523699 containerd[1902]: time="2026-03-12T23:49:37.523663934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.4-n-6470b86a4c,Uid:3baa11c9def2f2592185abad0ba92f0d,Namespace:kube-system,Attempt:0,} returns sandbox id \"23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3\"" Mar 12 23:49:37.537684 containerd[1902]: time="2026-03-12T23:49:37.537646310Z" level=info msg="CreateContainer within sandbox \"23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 12 23:49:37.547354 containerd[1902]: time="2026-03-12T23:49:37.547319398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.4-n-6470b86a4c,Uid:1958f6dbb48eb1e067f5682e2c22a80e,Namespace:kube-system,Attempt:0,} returns sandbox id \"7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7\"" Mar 12 23:49:37.554543 containerd[1902]: time="2026-03-12T23:49:37.554516223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.4-n-6470b86a4c,Uid:e6cbb2457522a4b95b9476867dd26e57,Namespace:kube-system,Attempt:0,} returns sandbox id \"88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e\"" Mar 12 23:49:37.556506 containerd[1902]: time="2026-03-12T23:49:37.556472746Z" level=info msg="CreateContainer within sandbox \"7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 12 23:49:37.562629 containerd[1902]: time="2026-03-12T23:49:37.562575924Z" level=info msg="CreateContainer within sandbox \"88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 12 23:49:37.566862 containerd[1902]: time="2026-03-12T23:49:37.566828199Z" level=info msg="Container 98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd: CDI devices 
from CRI Config.CDIDevices: []" Mar 12 23:49:37.588898 kubelet[2957]: I0312 23:49:37.588421 2957 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:37.589324 kubelet[2957]: E0312 23:49:37.589290 2957 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.40:6443/api/v1/nodes\": dial tcp 10.200.20.40:6443: connect: connection refused" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:37.593768 containerd[1902]: time="2026-03-12T23:49:37.593728681Z" level=info msg="Container e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:37.600828 containerd[1902]: time="2026-03-12T23:49:37.600762074Z" level=info msg="Container 91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:37.609175 containerd[1902]: time="2026-03-12T23:49:37.609138924Z" level=info msg="CreateContainer within sandbox \"23cd5feb8143ac33db85a1d191bdc90d023a13b370fde23920d3c234413371e3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd\"" Mar 12 23:49:37.609793 containerd[1902]: time="2026-03-12T23:49:37.609766463Z" level=info msg="StartContainer for \"98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd\"" Mar 12 23:49:37.610597 containerd[1902]: time="2026-03-12T23:49:37.610564208Z" level=info msg="connecting to shim 98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd" address="unix:///run/containerd/s/23b4b203c02444d9b9efa2f8894a078df55372086837f123f14455dcf8d4d2d3" protocol=ttrpc version=3 Mar 12 23:49:37.625925 containerd[1902]: time="2026-03-12T23:49:37.625881016Z" level=info msg="CreateContainer within sandbox \"7ed5e1ada70d847be788a94f326cc8af960c598ef9b24888b21cb15b83e12fc7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id 
\"e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84\"" Mar 12 23:49:37.626424 containerd[1902]: time="2026-03-12T23:49:37.626340843Z" level=info msg="StartContainer for \"e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84\"" Mar 12 23:49:37.627390 containerd[1902]: time="2026-03-12T23:49:37.627349214Z" level=info msg="connecting to shim e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84" address="unix:///run/containerd/s/daa087742098d8c37aee5fc6fa26f3c8c6cdeccfa4c27350cee0241640fc9230" protocol=ttrpc version=3 Mar 12 23:49:37.629661 containerd[1902]: time="2026-03-12T23:49:37.629351595Z" level=info msg="CreateContainer within sandbox \"88d08dd8f3a3b8bce29a0e0ab3506527886463056de48dbe96c0792ecd1d814e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11\"" Mar 12 23:49:37.629511 systemd[1]: Started cri-containerd-98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd.scope - libcontainer container 98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd. Mar 12 23:49:37.630270 containerd[1902]: time="2026-03-12T23:49:37.630213159Z" level=info msg="StartContainer for \"91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11\"" Mar 12 23:49:37.631563 containerd[1902]: time="2026-03-12T23:49:37.631503142Z" level=info msg="connecting to shim 91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11" address="unix:///run/containerd/s/65e79421b25642cb416e953886c89ddbc60c14d8eaff21c4b6e61c9c37a67982" protocol=ttrpc version=3 Mar 12 23:49:37.649726 systemd[1]: Started cri-containerd-91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11.scope - libcontainer container 91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11. 
Mar 12 23:49:37.658104 systemd[1]: Started cri-containerd-e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84.scope - libcontainer container e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84. Mar 12 23:49:37.690969 containerd[1902]: time="2026-03-12T23:49:37.690928726Z" level=info msg="StartContainer for \"98f9d632b6936ac09d6c32d61291b345f26ab18ef935c499ca0188d5b2a41fcd\" returns successfully" Mar 12 23:49:37.711553 containerd[1902]: time="2026-03-12T23:49:37.711514085Z" level=info msg="StartContainer for \"91ff9cb66c610e704e6f297a1a1495d64f31ba92829655a256484ff833022e11\" returns successfully" Mar 12 23:49:37.736317 containerd[1902]: time="2026-03-12T23:49:37.736188768Z" level=info msg="StartContainer for \"e9fadf1ba6e7c42e5eb64d41037225be4bbcd53f671e426dee1f8a26a995de84\" returns successfully" Mar 12 23:49:38.061425 kubelet[2957]: E0312 23:49:38.061266 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:38.063607 kubelet[2957]: E0312 23:49:38.063586 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:38.066911 kubelet[2957]: E0312 23:49:38.065442 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:38.993897 kubelet[2957]: I0312 23:49:38.993723 2957 apiserver.go:52] "Watching apiserver" Mar 12 23:49:39.006847 kubelet[2957]: I0312 23:49:39.006785 2957 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 23:49:39.011989 kubelet[2957]: E0312 23:49:39.011943 2957 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes 
\"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.067337 kubelet[2957]: E0312 23:49:39.067132 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.067897 kubelet[2957]: E0312 23:49:39.067884 2957 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.4-n-6470b86a4c\" not found" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.191876 kubelet[2957]: I0312 23:49:39.191529 2957 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.201769 kubelet[2957]: I0312 23:49:39.201550 2957 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.205572 kubelet[2957]: I0312 23:49:39.205460 2957 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.213627 kubelet[2957]: E0312 23:49:39.213515 2957 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.213627 kubelet[2957]: I0312 23:49:39.213552 2957 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.216076 kubelet[2957]: E0312 23:49:39.216051 2957 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.216076 kubelet[2957]: I0312 23:49:39.216070 2957 kubelet.go:3220] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:39.217872 kubelet[2957]: E0312 23:49:39.217850 2957 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.4-n-6470b86a4c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:40.067878 kubelet[2957]: I0312 23:49:40.067281 2957 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:40.076767 kubelet[2957]: I0312 23:49:40.076721 2957 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 23:49:41.416967 systemd[1]: Reload requested from client PID 3307 ('systemctl') (unit session-9.scope)... Mar 12 23:49:41.417082 systemd[1]: Reloading... Mar 12 23:49:41.511835 zram_generator::config[3354]: No configuration found. Mar 12 23:49:41.680528 systemd[1]: Reloading finished in 262 ms. Mar 12 23:49:41.706637 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:49:41.719178 systemd[1]: kubelet.service: Deactivated successfully. Mar 12 23:49:41.719370 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:49:41.719414 systemd[1]: kubelet.service: Consumed 1.228s CPU time, 122.4M memory peak. Mar 12 23:49:41.722034 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:49:42.026250 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:49:42.035056 (kubelet)[3419]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 12 23:49:42.071519 kubelet[3419]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Mar 12 23:49:42.072394 kubelet[3419]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 23:49:42.072394 kubelet[3419]: I0312 23:49:42.071908 3419 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 23:49:42.078187 kubelet[3419]: I0312 23:49:42.078165 3419 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 12 23:49:42.078282 kubelet[3419]: I0312 23:49:42.078273 3419 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 23:49:42.078848 kubelet[3419]: I0312 23:49:42.078829 3419 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 12 23:49:42.079529 kubelet[3419]: I0312 23:49:42.078922 3419 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 12 23:49:42.079529 kubelet[3419]: I0312 23:49:42.079087 3419 server.go:956] "Client rotation is on, will bootstrap in background" Mar 12 23:49:42.080304 kubelet[3419]: I0312 23:49:42.080285 3419 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 12 23:49:42.082006 kubelet[3419]: I0312 23:49:42.081970 3419 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 12 23:49:42.086278 kubelet[3419]: I0312 23:49:42.086263 3419 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 12 23:49:42.089297 kubelet[3419]: I0312 23:49:42.089277 3419 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 12 23:49:42.089551 kubelet[3419]: I0312 23:49:42.089533 3419 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 23:49:42.089820 kubelet[3419]: I0312 23:49:42.089606 3419 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.4-n-6470b86a4c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 23:49:42.089945 kubelet[3419]: I0312 23:49:42.089932 3419 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 
23:49:42.090468 kubelet[3419]: I0312 23:49:42.089999 3419 container_manager_linux.go:306] "Creating device plugin manager" Mar 12 23:49:42.090468 kubelet[3419]: I0312 23:49:42.090029 3419 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 12 23:49:42.090468 kubelet[3419]: I0312 23:49:42.090421 3419 state_mem.go:36] "Initialized new in-memory state store" Mar 12 23:49:42.090677 kubelet[3419]: I0312 23:49:42.090667 3419 kubelet.go:475] "Attempting to sync node with API server" Mar 12 23:49:42.090737 kubelet[3419]: I0312 23:49:42.090730 3419 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 23:49:42.090826 kubelet[3419]: I0312 23:49:42.090795 3419 kubelet.go:387] "Adding apiserver pod source" Mar 12 23:49:42.090899 kubelet[3419]: I0312 23:49:42.090892 3419 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 23:49:42.093350 kubelet[3419]: I0312 23:49:42.093335 3419 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 12 23:49:42.093772 kubelet[3419]: I0312 23:49:42.093755 3419 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 12 23:49:42.093902 kubelet[3419]: I0312 23:49:42.093879 3419 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 12 23:49:42.097888 kubelet[3419]: I0312 23:49:42.097872 3419 server.go:1262] "Started kubelet" Mar 12 23:49:42.099295 kubelet[3419]: I0312 23:49:42.099271 3419 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 23:49:42.113352 kubelet[3419]: I0312 23:49:42.112628 3419 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 23:49:42.113763 kubelet[3419]: I0312 23:49:42.113750 3419 server.go:310] "Adding debug handlers to 
kubelet server" Mar 12 23:49:42.115354 kubelet[3419]: I0312 23:49:42.115319 3419 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 23:49:42.115562 kubelet[3419]: I0312 23:49:42.115548 3419 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 12 23:49:42.115926 kubelet[3419]: I0312 23:49:42.115911 3419 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 23:49:42.116159 kubelet[3419]: I0312 23:49:42.116142 3419 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 12 23:49:42.117207 kubelet[3419]: I0312 23:49:42.117190 3419 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 12 23:49:42.118558 kubelet[3419]: I0312 23:49:42.118085 3419 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 12 23:49:42.118558 kubelet[3419]: E0312 23:49:42.113860 3419 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 12 23:49:42.118558 kubelet[3419]: I0312 23:49:42.118344 3419 reconciler.go:29] "Reconciler: start to sync state" Mar 12 23:49:42.120017 kubelet[3419]: I0312 23:49:42.119369 3419 factory.go:223] Registration of the systemd container factory successfully Mar 12 23:49:42.120017 kubelet[3419]: I0312 23:49:42.119445 3419 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 12 23:49:42.121102 kubelet[3419]: I0312 23:49:42.121032 3419 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Mar 12 23:49:42.122167 kubelet[3419]: I0312 23:49:42.122087 3419 factory.go:223] Registration of the containerd container factory successfully Mar 12 23:49:42.126753 kubelet[3419]: I0312 23:49:42.122184 3419 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 12 23:49:42.126883 kubelet[3419]: I0312 23:49:42.126870 3419 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 12 23:49:42.126956 kubelet[3419]: I0312 23:49:42.126948 3419 kubelet.go:2428] "Starting kubelet main sync loop" Mar 12 23:49:42.127051 kubelet[3419]: E0312 23:49:42.127030 3419 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 23:49:42.160257 kubelet[3419]: I0312 23:49:42.160234 3419 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 12 23:49:42.160833 kubelet[3419]: I0312 23:49:42.160667 3419 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 12 23:49:42.160833 kubelet[3419]: I0312 23:49:42.160697 3419 state_mem.go:36] "Initialized new in-memory state store" Mar 12 23:49:42.161017 kubelet[3419]: I0312 23:49:42.161002 3419 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 12 23:49:42.161100 kubelet[3419]: I0312 23:49:42.161079 3419 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 12 23:49:42.161161 kubelet[3419]: I0312 23:49:42.161155 3419 policy_none.go:49] "None policy: Start" Mar 12 23:49:42.161919 kubelet[3419]: I0312 23:49:42.161200 3419 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 12 23:49:42.161919 kubelet[3419]: I0312 23:49:42.161213 3419 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 12 23:49:42.161919 kubelet[3419]: I0312 23:49:42.161308 3419 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 12 23:49:42.161919 kubelet[3419]: I0312 23:49:42.161314 
3419 policy_none.go:47] "Start" Mar 12 23:49:42.165242 kubelet[3419]: E0312 23:49:42.165227 3419 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 23:49:42.165765 kubelet[3419]: I0312 23:49:42.165748 3419 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 23:49:42.166387 kubelet[3419]: I0312 23:49:42.166203 3419 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 23:49:42.166611 kubelet[3419]: I0312 23:49:42.166593 3419 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 23:49:42.170074 kubelet[3419]: E0312 23:49:42.169061 3419 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 23:49:42.227847 kubelet[3419]: I0312 23:49:42.227813 3419 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.228026 kubelet[3419]: I0312 23:49:42.228011 3419 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.228363 kubelet[3419]: I0312 23:49:42.228349 3419 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.240205 kubelet[3419]: I0312 23:49:42.240105 3419 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 23:49:42.240611 kubelet[3419]: I0312 23:49:42.240510 3419 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 23:49:42.241189 kubelet[3419]: I0312 23:49:42.241131 3419 warnings.go:110] "Warning: metadata.name: 
this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 23:49:42.241407 kubelet[3419]: E0312 23:49:42.241378 3419 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.273152 kubelet[3419]: I0312 23:49:42.273131 3419 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.293825 kubelet[3419]: I0312 23:49:42.293026 3419 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.293825 kubelet[3419]: I0312 23:49:42.293103 3419 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320086 kubelet[3419]: I0312 23:49:42.320051 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320086 kubelet[3419]: I0312 23:49:42.320084 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320229 kubelet[3419]: I0312 23:49:42.320096 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-k8s-certs\") pod 
\"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320229 kubelet[3419]: I0312 23:49:42.320115 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-ca-certs\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320229 kubelet[3419]: I0312 23:49:42.320126 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320229 kubelet[3419]: I0312 23:49:42.320136 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e6cbb2457522a4b95b9476867dd26e57-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.4-n-6470b86a4c\" (UID: \"e6cbb2457522a4b95b9476867dd26e57\") " pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320229 kubelet[3419]: I0312 23:49:42.320145 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1958f6dbb48eb1e067f5682e2c22a80e-kubeconfig\") pod \"kube-scheduler-ci-4459.2.4-n-6470b86a4c\" (UID: \"1958f6dbb48eb1e067f5682e2c22a80e\") " pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320408 kubelet[3419]: I0312 23:49:42.320154 3419 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-ca-certs\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:42.320408 kubelet[3419]: I0312 23:49:42.320162 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3baa11c9def2f2592185abad0ba92f0d-k8s-certs\") pod \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" (UID: \"3baa11c9def2f2592185abad0ba92f0d\") " pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:43.098382 kubelet[3419]: I0312 23:49:43.098131 3419 apiserver.go:52] "Watching apiserver" Mar 12 23:49:43.118317 kubelet[3419]: I0312 23:49:43.118273 3419 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 23:49:43.154849 kubelet[3419]: I0312 23:49:43.153600 3419 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:43.164480 kubelet[3419]: I0312 23:49:43.164326 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.4-n-6470b86a4c" podStartSLOduration=1.16431057 podStartE2EDuration="1.16431057s" podCreationTimestamp="2026-03-12 23:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:43.151816391 +0000 UTC m=+1.114059359" watchObservedRunningTime="2026-03-12 23:49:43.16431057 +0000 UTC m=+1.126553538" Mar 12 23:49:43.167105 kubelet[3419]: I0312 23:49:43.167076 3419 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 
23:49:43.167184 kubelet[3419]: E0312 23:49:43.167134 3419 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.4-n-6470b86a4c\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" Mar 12 23:49:43.177939 kubelet[3419]: I0312 23:49:43.177894 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.4-n-6470b86a4c" podStartSLOduration=3.177882266 podStartE2EDuration="3.177882266s" podCreationTimestamp="2026-03-12 23:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:43.167510309 +0000 UTC m=+1.129753269" watchObservedRunningTime="2026-03-12 23:49:43.177882266 +0000 UTC m=+1.140125306" Mar 12 23:49:43.177939 kubelet[3419]: I0312 23:49:43.177969 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.4-n-6470b86a4c" podStartSLOduration=1.177965109 podStartE2EDuration="1.177965109s" podCreationTimestamp="2026-03-12 23:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:43.177156124 +0000 UTC m=+1.139399092" watchObservedRunningTime="2026-03-12 23:49:43.177965109 +0000 UTC m=+1.140208093" Mar 12 23:49:47.033673 kubelet[3419]: I0312 23:49:47.033622 3419 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 12 23:49:47.035106 kubelet[3419]: I0312 23:49:47.034625 3419 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 12 23:49:47.035131 containerd[1902]: time="2026-03-12T23:49:47.034236723Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 12 23:49:47.873945 systemd[1]: Created slice kubepods-besteffort-podd36e2025_f320_4df6_b0d2_1e8bae35be4a.slice - libcontainer container kubepods-besteffort-podd36e2025_f320_4df6_b0d2_1e8bae35be4a.slice. Mar 12 23:49:47.951023 kubelet[3419]: I0312 23:49:47.950981 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d36e2025-f320-4df6-b0d2-1e8bae35be4a-lib-modules\") pod \"kube-proxy-q6spq\" (UID: \"d36e2025-f320-4df6-b0d2-1e8bae35be4a\") " pod="kube-system/kube-proxy-q6spq" Mar 12 23:49:47.951023 kubelet[3419]: I0312 23:49:47.951019 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfbv\" (UniqueName: \"kubernetes.io/projected/d36e2025-f320-4df6-b0d2-1e8bae35be4a-kube-api-access-qrfbv\") pod \"kube-proxy-q6spq\" (UID: \"d36e2025-f320-4df6-b0d2-1e8bae35be4a\") " pod="kube-system/kube-proxy-q6spq" Mar 12 23:49:47.951023 kubelet[3419]: I0312 23:49:47.951034 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d36e2025-f320-4df6-b0d2-1e8bae35be4a-kube-proxy\") pod \"kube-proxy-q6spq\" (UID: \"d36e2025-f320-4df6-b0d2-1e8bae35be4a\") " pod="kube-system/kube-proxy-q6spq" Mar 12 23:49:47.951294 kubelet[3419]: I0312 23:49:47.951044 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d36e2025-f320-4df6-b0d2-1e8bae35be4a-xtables-lock\") pod \"kube-proxy-q6spq\" (UID: \"d36e2025-f320-4df6-b0d2-1e8bae35be4a\") " pod="kube-system/kube-proxy-q6spq" Mar 12 23:49:48.188109 containerd[1902]: time="2026-03-12T23:49:48.187997284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q6spq,Uid:d36e2025-f320-4df6-b0d2-1e8bae35be4a,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:48.226396 
containerd[1902]: time="2026-03-12T23:49:48.226360559Z" level=info msg="connecting to shim b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56" address="unix:///run/containerd/s/12eab58c938980ed30af8e3d24cae3d44ef6d04f64a700baa5b67dc6e3b24a88" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:48.247022 systemd[1]: Started cri-containerd-b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56.scope - libcontainer container b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56. Mar 12 23:49:48.281392 containerd[1902]: time="2026-03-12T23:49:48.281069216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q6spq,Uid:d36e2025-f320-4df6-b0d2-1e8bae35be4a,Namespace:kube-system,Attempt:0,} returns sandbox id \"b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56\"" Mar 12 23:49:48.290466 containerd[1902]: time="2026-03-12T23:49:48.290434425Z" level=info msg="CreateContainer within sandbox \"b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 12 23:49:48.306135 systemd[1]: Created slice kubepods-besteffort-podb5e605f8_5f5e_405e_abb9_e3fc9dbcea88.slice - libcontainer container kubepods-besteffort-podb5e605f8_5f5e_405e_abb9_e3fc9dbcea88.slice. 
Mar 12 23:49:48.316777 containerd[1902]: time="2026-03-12T23:49:48.316555570Z" level=info msg="Container f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:48.337206 containerd[1902]: time="2026-03-12T23:49:48.337157146Z" level=info msg="CreateContainer within sandbox \"b7b1d131641a11e88d14fc31256bae8d2cafaff69f34140ccee9aa5a16085f56\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6\"" Mar 12 23:49:48.338016 containerd[1902]: time="2026-03-12T23:49:48.337987613Z" level=info msg="StartContainer for \"f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6\"" Mar 12 23:49:48.339487 containerd[1902]: time="2026-03-12T23:49:48.339452347Z" level=info msg="connecting to shim f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6" address="unix:///run/containerd/s/12eab58c938980ed30af8e3d24cae3d44ef6d04f64a700baa5b67dc6e3b24a88" protocol=ttrpc version=3 Mar 12 23:49:48.353309 kubelet[3419]: I0312 23:49:48.353284 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b5e605f8-5f5e-405e-abb9-e3fc9dbcea88-var-lib-calico\") pod \"tigera-operator-5588576f44-fjzhc\" (UID: \"b5e605f8-5f5e-405e-abb9-e3fc9dbcea88\") " pod="tigera-operator/tigera-operator-5588576f44-fjzhc" Mar 12 23:49:48.353675 kubelet[3419]: I0312 23:49:48.353588 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqt7\" (UniqueName: \"kubernetes.io/projected/b5e605f8-5f5e-405e-abb9-e3fc9dbcea88-kube-api-access-6cqt7\") pod \"tigera-operator-5588576f44-fjzhc\" (UID: \"b5e605f8-5f5e-405e-abb9-e3fc9dbcea88\") " pod="tigera-operator/tigera-operator-5588576f44-fjzhc" Mar 12 23:49:48.357925 systemd[1]: Started 
cri-containerd-f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6.scope - libcontainer container f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6. Mar 12 23:49:48.414111 containerd[1902]: time="2026-03-12T23:49:48.413927985Z" level=info msg="StartContainer for \"f5ed018aa62bf54dee9343ea43dbe634e4cc1ea690ed9ab8e77a37c913904cb6\" returns successfully" Mar 12 23:49:48.618691 containerd[1902]: time="2026-03-12T23:49:48.618653876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-fjzhc,Uid:b5e605f8-5f5e-405e-abb9-e3fc9dbcea88,Namespace:tigera-operator,Attempt:0,}" Mar 12 23:49:48.653327 containerd[1902]: time="2026-03-12T23:49:48.653275233Z" level=info msg="connecting to shim e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff" address="unix:///run/containerd/s/3f9779fb65a09b59f8a193bf062d94da1bb5445556599ea5971fcef588d84bae" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:48.672950 systemd[1]: Started cri-containerd-e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff.scope - libcontainer container e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff. 
Mar 12 23:49:48.703962 containerd[1902]: time="2026-03-12T23:49:48.703921176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-fjzhc,Uid:b5e605f8-5f5e-405e-abb9-e3fc9dbcea88,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff\"" Mar 12 23:49:48.705360 containerd[1902]: time="2026-03-12T23:49:48.705331291Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 12 23:49:49.176871 kubelet[3419]: I0312 23:49:49.176447 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-q6spq" podStartSLOduration=2.1764327 podStartE2EDuration="2.1764327s" podCreationTimestamp="2026-03-12 23:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:49.176300975 +0000 UTC m=+7.138543935" watchObservedRunningTime="2026-03-12 23:49:49.1764327 +0000 UTC m=+7.138675660" Mar 12 23:49:50.270386 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount124136667.mount: Deactivated successfully. 
Mar 12 23:49:50.938966 containerd[1902]: time="2026-03-12T23:49:50.938914723Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:50.941970 containerd[1902]: time="2026-03-12T23:49:50.941940554Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 12 23:49:50.945280 containerd[1902]: time="2026-03-12T23:49:50.945131728Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:50.949201 containerd[1902]: time="2026-03-12T23:49:50.949171881Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:50.949996 containerd[1902]: time="2026-03-12T23:49:50.949970731Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.24460719s" Mar 12 23:49:50.950052 containerd[1902]: time="2026-03-12T23:49:50.949997196Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 12 23:49:50.956649 containerd[1902]: time="2026-03-12T23:49:50.956606633Z" level=info msg="CreateContainer within sandbox \"e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 12 23:49:50.977737 containerd[1902]: time="2026-03-12T23:49:50.977327927Z" level=info msg="Container 
1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:50.979638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3180440598.mount: Deactivated successfully. Mar 12 23:49:50.991164 containerd[1902]: time="2026-03-12T23:49:50.991101273Z" level=info msg="CreateContainer within sandbox \"e87dcfc417bc15cdb7cf291a2b60fc1462272d58ad9834e3c8b2d366b9fe12ff\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003\"" Mar 12 23:49:50.991910 containerd[1902]: time="2026-03-12T23:49:50.991870826Z" level=info msg="StartContainer for \"1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003\"" Mar 12 23:49:50.992757 containerd[1902]: time="2026-03-12T23:49:50.992662707Z" level=info msg="connecting to shim 1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003" address="unix:///run/containerd/s/3f9779fb65a09b59f8a193bf062d94da1bb5445556599ea5971fcef588d84bae" protocol=ttrpc version=3 Mar 12 23:49:51.007931 systemd[1]: Started cri-containerd-1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003.scope - libcontainer container 1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003. 
Mar 12 23:49:51.033250 containerd[1902]: time="2026-03-12T23:49:51.033215289Z" level=info msg="StartContainer for \"1dba4cf12ec9fe108e3a2bd40858b04c2f7a047001e10be31e2a66331e0eb003\" returns successfully" Mar 12 23:49:52.556632 kubelet[3419]: I0312 23:49:52.556406 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-fjzhc" podStartSLOduration=2.310933895 podStartE2EDuration="4.556390457s" podCreationTimestamp="2026-03-12 23:49:48 +0000 UTC" firstStartedPulling="2026-03-12 23:49:48.705049503 +0000 UTC m=+6.667292463" lastFinishedPulling="2026-03-12 23:49:50.950506065 +0000 UTC m=+8.912749025" observedRunningTime="2026-03-12 23:49:51.18300241 +0000 UTC m=+9.145245370" watchObservedRunningTime="2026-03-12 23:49:52.556390457 +0000 UTC m=+10.518633417" Mar 12 23:49:56.025967 sudo[2370]: pam_unix(sudo:session): session closed for user root Mar 12 23:49:56.102372 sshd[2369]: Connection closed by 10.200.16.10 port 35902 Mar 12 23:49:56.103875 sshd-session[2366]: pam_unix(sshd:session): session closed for user core Mar 12 23:49:56.107442 systemd[1]: sshd@6-10.200.20.40:22-10.200.16.10:35902.service: Deactivated successfully. Mar 12 23:49:56.109989 systemd[1]: session-9.scope: Deactivated successfully. Mar 12 23:49:56.110372 systemd[1]: session-9.scope: Consumed 4.343s CPU time, 225.6M memory peak. Mar 12 23:49:56.114291 systemd-logind[1877]: Session 9 logged out. Waiting for processes to exit. Mar 12 23:49:56.116643 systemd-logind[1877]: Removed session 9. Mar 12 23:49:59.316276 systemd[1]: Created slice kubepods-besteffort-pod4722ba58_e819_441c_b7f3_6eaf7b46300f.slice - libcontainer container kubepods-besteffort-pod4722ba58_e819_441c_b7f3_6eaf7b46300f.slice. Mar 12 23:49:59.403465 systemd[1]: Created slice kubepods-besteffort-pod08e7bb03_fea8_42b0_854a_7e5b20991eda.slice - libcontainer container kubepods-besteffort-pod08e7bb03_fea8_42b0_854a_7e5b20991eda.slice. 
Mar 12 23:49:59.417739 kubelet[3419]: I0312 23:49:59.417692 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-xtables-lock\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.417739 kubelet[3419]: I0312 23:49:59.417728 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-bpffs\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.417739 kubelet[3419]: I0312 23:49:59.417740 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-cni-log-dir\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.418843 kubelet[3419]: I0312 23:49:59.417750 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/08e7bb03-fea8-42b0-854a-7e5b20991eda-node-certs\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.418843 kubelet[3419]: I0312 23:49:59.417760 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwhl\" (UniqueName: \"kubernetes.io/projected/08e7bb03-fea8-42b0-854a-7e5b20991eda-kube-api-access-9bwhl\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.418843 kubelet[3419]: I0312 23:49:59.417771 3419 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4722ba58-e819-441c-b7f3-6eaf7b46300f-tigera-ca-bundle\") pod \"calico-typha-779cc4db9f-96wzn\" (UID: \"4722ba58-e819-441c-b7f3-6eaf7b46300f\") " pod="calico-system/calico-typha-779cc4db9f-96wzn" Mar 12 23:49:59.418843 kubelet[3419]: I0312 23:49:59.417780 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4722ba58-e819-441c-b7f3-6eaf7b46300f-typha-certs\") pod \"calico-typha-779cc4db9f-96wzn\" (UID: \"4722ba58-e819-441c-b7f3-6eaf7b46300f\") " pod="calico-system/calico-typha-779cc4db9f-96wzn" Mar 12 23:49:59.418843 kubelet[3419]: I0312 23:49:59.417789 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-cni-net-dir\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419162 kubelet[3419]: I0312 23:49:59.417808 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-lib-modules\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419162 kubelet[3419]: I0312 23:49:59.417821 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-var-run-calico\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419162 kubelet[3419]: I0312 23:49:59.417830 3419 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-flexvol-driver-host\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419162 kubelet[3419]: I0312 23:49:59.417838 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-nodeproc\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419162 kubelet[3419]: I0312 23:49:59.417846 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-policysync\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419280 kubelet[3419]: I0312 23:49:59.417856 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ztd6\" (UniqueName: \"kubernetes.io/projected/4722ba58-e819-441c-b7f3-6eaf7b46300f-kube-api-access-5ztd6\") pod \"calico-typha-779cc4db9f-96wzn\" (UID: \"4722ba58-e819-441c-b7f3-6eaf7b46300f\") " pod="calico-system/calico-typha-779cc4db9f-96wzn" Mar 12 23:49:59.419280 kubelet[3419]: I0312 23:49:59.417866 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-cni-bin-dir\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419280 kubelet[3419]: I0312 23:49:59.417874 3419 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-sys-fs\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419280 kubelet[3419]: I0312 23:49:59.417883 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e7bb03-fea8-42b0-854a-7e5b20991eda-tigera-ca-bundle\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.419280 kubelet[3419]: I0312 23:49:59.417893 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08e7bb03-fea8-42b0-854a-7e5b20991eda-var-lib-calico\") pod \"calico-node-n7l7d\" (UID: \"08e7bb03-fea8-42b0-854a-7e5b20991eda\") " pod="calico-system/calico-node-n7l7d" Mar 12 23:49:59.513843 kubelet[3419]: E0312 23:49:59.513624 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:49:59.520301 kubelet[3419]: E0312 23:49:59.520283 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.520550 kubelet[3419]: W0312 23:49:59.520497 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.520550 kubelet[3419]: E0312 23:49:59.520523 3419 plugins.go:697] "Error dynamically probing 
plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" [Repeated entries elided: the FlexVolume probe triplet above (driver-call.go:262 "Failed to unmarshal output for command: init", driver-call.go:149 "FlexVolume: driver call failed ... executable file not found in $PATH", plugins.go:697 "Error dynamically probing plugins") recurs roughly 40 more times between 23:49:59.520 and 23:49:59.619, differing only in timestamps.] Mar 12 23:49:59.619451 kubelet[3419]: I0312 23:49:59.619140 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc362139-094a-4907-98c0-8c9e87d14519-registration-dir\") pod \"csi-node-driver-7g8rv\" (UID: \"fc362139-094a-4907-98c0-8c9e87d14519\") " pod="calico-system/csi-node-driver-7g8rv" Mar 12 23:49:59.619451 kubelet[3419]: I0312 23:49:59.619334 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fc362139-094a-4907-98c0-8c9e87d14519-varrun\") pod \"csi-node-driver-7g8rv\" (UID: \"fc362139-094a-4907-98c0-8c9e87d14519\") " pod="calico-system/csi-node-driver-7g8rv"
Error: unexpected end of JSON input" Mar 12 23:49:59.619631 kubelet[3419]: I0312 23:49:59.619477 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx877\" (UniqueName: \"kubernetes.io/projected/fc362139-094a-4907-98c0-8c9e87d14519-kube-api-access-xx877\") pod \"csi-node-driver-7g8rv\" (UID: \"fc362139-094a-4907-98c0-8c9e87d14519\") " pod="calico-system/csi-node-driver-7g8rv" Mar 12 23:49:59.619631 kubelet[3419]: E0312 23:49:59.619574 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.619631 kubelet[3419]: W0312 23:49:59.619579 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.619631 kubelet[3419]: E0312 23:49:59.619585 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.619631 kubelet[3419]: I0312 23:49:59.619593 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc362139-094a-4907-98c0-8c9e87d14519-kubelet-dir\") pod \"csi-node-driver-7g8rv\" (UID: \"fc362139-094a-4907-98c0-8c9e87d14519\") " pod="calico-system/csi-node-driver-7g8rv" Mar 12 23:49:59.619914 kubelet[3419]: E0312 23:49:59.619879 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.619914 kubelet[3419]: W0312 23:49:59.619895 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.619914 kubelet[3419]: E0312 23:49:59.619902 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.619914 kubelet[3419]: I0312 23:49:59.619913 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc362139-094a-4907-98c0-8c9e87d14519-socket-dir\") pod \"csi-node-driver-7g8rv\" (UID: \"fc362139-094a-4907-98c0-8c9e87d14519\") " pod="calico-system/csi-node-driver-7g8rv" Mar 12 23:49:59.620422 kubelet[3419]: E0312 23:49:59.620399 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.620422 kubelet[3419]: W0312 23:49:59.620414 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.620422 kubelet[3419]: E0312 23:49:59.620423 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.620623 kubelet[3419]: E0312 23:49:59.620609 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.620623 kubelet[3419]: W0312 23:49:59.620619 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.620688 kubelet[3419]: E0312 23:49:59.620627 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.620783 kubelet[3419]: E0312 23:49:59.620768 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.620783 kubelet[3419]: W0312 23:49:59.620778 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.620860 kubelet[3419]: E0312 23:49:59.620785 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.620912 kubelet[3419]: E0312 23:49:59.620896 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.620912 kubelet[3419]: W0312 23:49:59.620907 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.620953 kubelet[3419]: E0312 23:49:59.620913 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.621062 kubelet[3419]: E0312 23:49:59.621045 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621062 kubelet[3419]: W0312 23:49:59.621056 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621062 kubelet[3419]: E0312 23:49:59.621062 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.621174 kubelet[3419]: E0312 23:49:59.621160 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621174 kubelet[3419]: W0312 23:49:59.621169 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621292 kubelet[3419]: E0312 23:49:59.621174 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.621361 kubelet[3419]: E0312 23:49:59.621347 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621361 kubelet[3419]: W0312 23:49:59.621357 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621404 kubelet[3419]: E0312 23:49:59.621363 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.621488 kubelet[3419]: E0312 23:49:59.621474 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621488 kubelet[3419]: W0312 23:49:59.621482 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621561 kubelet[3419]: E0312 23:49:59.621490 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.621604 kubelet[3419]: E0312 23:49:59.621589 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621604 kubelet[3419]: W0312 23:49:59.621599 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621636 kubelet[3419]: E0312 23:49:59.621604 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.621734 kubelet[3419]: E0312 23:49:59.621720 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.621734 kubelet[3419]: W0312 23:49:59.621728 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.621734 kubelet[3419]: E0312 23:49:59.621733 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.629485 containerd[1902]: time="2026-03-12T23:49:59.629433018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-779cc4db9f-96wzn,Uid:4722ba58-e819-441c-b7f3-6eaf7b46300f,Namespace:calico-system,Attempt:0,}" Mar 12 23:49:59.671217 containerd[1902]: time="2026-03-12T23:49:59.671103230Z" level=info msg="connecting to shim 3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7" address="unix:///run/containerd/s/3c63caddbde255eae2788f5ba3435e26a00f526d2e6cbd02b3978acd038ec5ee" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:59.692771 systemd[1]: Started cri-containerd-3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7.scope - libcontainer container 3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7. Mar 12 23:49:59.718553 containerd[1902]: time="2026-03-12T23:49:59.718438539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n7l7d,Uid:08e7bb03-fea8-42b0-854a-7e5b20991eda,Namespace:calico-system,Attempt:0,}" Mar 12 23:49:59.722560 kubelet[3419]: E0312 23:49:59.722139 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.722560 kubelet[3419]: W0312 23:49:59.722191 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.722560 kubelet[3419]: E0312 23:49:59.722211 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.723582 kubelet[3419]: E0312 23:49:59.723484 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.725253 kubelet[3419]: W0312 23:49:59.725104 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.725253 kubelet[3419]: E0312 23:49:59.725127 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.725343 kubelet[3419]: E0312 23:49:59.725320 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.725343 kubelet[3419]: W0312 23:49:59.725331 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.725343 kubelet[3419]: E0312 23:49:59.725342 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.725477 kubelet[3419]: E0312 23:49:59.725460 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.725477 kubelet[3419]: W0312 23:49:59.725471 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.725685 kubelet[3419]: E0312 23:49:59.725479 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.725685 kubelet[3419]: E0312 23:49:59.725616 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.725685 kubelet[3419]: W0312 23:49:59.725622 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.725685 kubelet[3419]: E0312 23:49:59.725630 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.727019 kubelet[3419]: E0312 23:49:59.726977 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.727019 kubelet[3419]: W0312 23:49:59.726991 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.727019 kubelet[3419]: E0312 23:49:59.727003 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.727546 kubelet[3419]: E0312 23:49:59.727161 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.727546 kubelet[3419]: W0312 23:49:59.727168 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.727546 kubelet[3419]: E0312 23:49:59.727176 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.728162 kubelet[3419]: E0312 23:49:59.728147 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.728649 kubelet[3419]: W0312 23:49:59.728488 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.728649 kubelet[3419]: E0312 23:49:59.728512 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.729250 kubelet[3419]: E0312 23:49:59.729183 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.729409 kubelet[3419]: W0312 23:49:59.729309 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.729409 kubelet[3419]: E0312 23:49:59.729326 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.729667 kubelet[3419]: E0312 23:49:59.729597 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.729667 kubelet[3419]: W0312 23:49:59.729608 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.729667 kubelet[3419]: E0312 23:49:59.729619 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.731909 kubelet[3419]: E0312 23:49:59.731175 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.731909 kubelet[3419]: W0312 23:49:59.731189 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.731909 kubelet[3419]: E0312 23:49:59.731200 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.733197 kubelet[3419]: E0312 23:49:59.732635 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.733197 kubelet[3419]: W0312 23:49:59.732648 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.733197 kubelet[3419]: E0312 23:49:59.732659 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.736294 kubelet[3419]: E0312 23:49:59.735952 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.736294 kubelet[3419]: W0312 23:49:59.735975 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.736294 kubelet[3419]: E0312 23:49:59.735990 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.736457 kubelet[3419]: E0312 23:49:59.736440 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.736589 kubelet[3419]: W0312 23:49:59.736458 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.736589 kubelet[3419]: E0312 23:49:59.736473 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.737769 kubelet[3419]: E0312 23:49:59.737747 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.737769 kubelet[3419]: W0312 23:49:59.737761 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.737769 kubelet[3419]: E0312 23:49:59.737772 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.738098 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740525 kubelet[3419]: W0312 23:49:59.738110 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.738121 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.738490 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740525 kubelet[3419]: W0312 23:49:59.738500 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.738511 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.738989 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740525 kubelet[3419]: W0312 23:49:59.738999 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.739010 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.740525 kubelet[3419]: E0312 23:49:59.739349 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740972 kubelet[3419]: W0312 23:49:59.739360 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.739371 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.739705 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740972 kubelet[3419]: W0312 23:49:59.739716 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.739727 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.740245 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740972 kubelet[3419]: W0312 23:49:59.740266 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.740278 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.740972 kubelet[3419]: E0312 23:49:59.740613 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.740972 kubelet[3419]: W0312 23:49:59.740624 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.740635 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.740987 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.742149 kubelet[3419]: W0312 23:49:59.741000 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.741012 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.741392 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.742149 kubelet[3419]: W0312 23:49:59.741403 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.741421 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.741666 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:59.742149 kubelet[3419]: W0312 23:49:59.741676 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:59.742149 kubelet[3419]: E0312 23:49:59.741686 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 12 23:49:59.756522 kubelet[3419]: E0312 23:49:59.756502 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:49:59.756522 kubelet[3419]: W0312 23:49:59.756517 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:49:59.757169 kubelet[3419]: E0312 23:49:59.756531 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:49:59.761365 containerd[1902]: time="2026-03-12T23:49:59.761328560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-779cc4db9f-96wzn,Uid:4722ba58-e819-441c-b7f3-6eaf7b46300f,Namespace:calico-system,Attempt:0,} returns sandbox id \"3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7\""
Mar 12 23:49:59.763679 containerd[1902]: time="2026-03-12T23:49:59.763431187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 12 23:49:59.784260 containerd[1902]: time="2026-03-12T23:49:59.784236172Z" level=info msg="connecting to shim 323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b" address="unix:///run/containerd/s/c8332044d7c9be62d09d1c906712fc4807f5f408d00f6f5c04ffadfa1f0bf9ab" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:59.799932 systemd[1]: Started cri-containerd-323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b.scope - libcontainer container 323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b.
Mar 12 23:49:59.823366 containerd[1902]: time="2026-03-12T23:49:59.823223030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n7l7d,Uid:08e7bb03-fea8-42b0-854a-7e5b20991eda,Namespace:calico-system,Attempt:0,} returns sandbox id \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\""
Mar 12 23:50:01.102608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3878177352.mount: Deactivated successfully.
Mar 12 23:50:01.127255 kubelet[3419]: E0312 23:50:01.127200 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519"
Mar 12 23:50:02.011991 containerd[1902]: time="2026-03-12T23:50:02.011395339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:02.014350 containerd[1902]: time="2026-03-12T23:50:02.014316967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174"
Mar 12 23:50:02.017702 containerd[1902]: time="2026-03-12T23:50:02.017679380Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:02.021788 containerd[1902]: time="2026-03-12T23:50:02.021748534Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:02.022221 containerd[1902]: time="2026-03-12T23:50:02.022185199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.258725211s"
Mar 12 23:50:02.022221 containerd[1902]: time="2026-03-12T23:50:02.022213624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\""
Mar 12 23:50:02.024016 containerd[1902]: time="2026-03-12T23:50:02.023650377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 12 23:50:02.042609 containerd[1902]: time="2026-03-12T23:50:02.042566575Z" level=info msg="CreateContainer within sandbox \"3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 12 23:50:02.064999 containerd[1902]: time="2026-03-12T23:50:02.064952111Z" level=info msg="Container c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:02.084904 containerd[1902]: time="2026-03-12T23:50:02.084833483Z" level=info msg="CreateContainer within sandbox \"3707e50f686851956b17e8216f4ccbc123e08e0986781f34b63c64d0bb956cb7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328\""
Mar 12 23:50:02.085812 containerd[1902]: time="2026-03-12T23:50:02.085730055Z" level=info msg="StartContainer for \"c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328\""
Mar 12 23:50:02.087080 containerd[1902]: time="2026-03-12T23:50:02.087054459Z" level=info msg="connecting to shim c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328" address="unix:///run/containerd/s/3c63caddbde255eae2788f5ba3435e26a00f526d2e6cbd02b3978acd038ec5ee" protocol=ttrpc version=3
Mar 12 23:50:02.109965 systemd[1]: Started cri-containerd-c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328.scope - libcontainer container c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328.
Mar 12 23:50:02.151717 containerd[1902]: time="2026-03-12T23:50:02.151619620Z" level=info msg="StartContainer for \"c69659b69072225d08c4228f318decf83b01923533c619ada2fb0da67fba0328\" returns successfully"
Mar 12 23:50:02.231524 kubelet[3419]: E0312 23:50:02.231223 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:50:02.232875 kubelet[3419]: W0312 23:50:02.231617 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:50:02.232875 kubelet[3419]: E0312 23:50:02.231654 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:50:02.232875 kubelet[3419]: E0312 23:50:02.231996 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:50:02.232875 kubelet[3419]: W0312 23:50:02.232007 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:50:02.232875 kubelet[3419]: E0312 23:50:02.232045 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 12 23:50:02.262829 kubelet[3419]: E0312 23:50:02.262759 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:50:02.262829 kubelet[3419]: W0312 23:50:02.262772 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:50:02.262829 kubelet[3419]: E0312 23:50:02.262779 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:50:03.127478 kubelet[3419]: E0312 23:50:03.127426 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519"
Mar 12 23:50:03.199981 kubelet[3419]: I0312 23:50:03.199950 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:50:03.245483 kubelet[3419]: E0312 23:50:03.245347 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:50:03.245483 kubelet[3419]: W0312 23:50:03.245375 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:50:03.245483 kubelet[3419]: E0312 23:50:03.245396 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 12 23:50:03.264045 kubelet[3419]: E0312 23:50:03.264016 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.264045 kubelet[3419]: W0312 23:50:03.264038 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.264290 kubelet[3419]: E0312 23:50:03.264054 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.264290 kubelet[3419]: E0312 23:50:03.264198 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.264290 kubelet[3419]: W0312 23:50:03.264206 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.264290 kubelet[3419]: E0312 23:50:03.264213 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.264556 kubelet[3419]: E0312 23:50:03.264524 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.264556 kubelet[3419]: W0312 23:50:03.264538 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.264720 kubelet[3419]: E0312 23:50:03.264628 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.264892 kubelet[3419]: E0312 23:50:03.264840 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.264892 kubelet[3419]: W0312 23:50:03.264850 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.264892 kubelet[3419]: E0312 23:50:03.264859 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.265209 kubelet[3419]: E0312 23:50:03.265120 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.265209 kubelet[3419]: W0312 23:50:03.265131 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.265209 kubelet[3419]: E0312 23:50:03.265140 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.265451 kubelet[3419]: E0312 23:50:03.265440 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.265516 kubelet[3419]: W0312 23:50:03.265506 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.265578 kubelet[3419]: E0312 23:50:03.265569 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.265815 kubelet[3419]: E0312 23:50:03.265786 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.265815 kubelet[3419]: W0312 23:50:03.265809 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.265815 kubelet[3419]: E0312 23:50:03.265818 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.266104 kubelet[3419]: E0312 23:50:03.265969 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266104 kubelet[3419]: W0312 23:50:03.265976 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266104 kubelet[3419]: E0312 23:50:03.265983 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.266104 kubelet[3419]: E0312 23:50:03.266069 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266104 kubelet[3419]: W0312 23:50:03.266073 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266104 kubelet[3419]: E0312 23:50:03.266078 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.266235 kubelet[3419]: E0312 23:50:03.266171 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266235 kubelet[3419]: W0312 23:50:03.266176 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266235 kubelet[3419]: E0312 23:50:03.266181 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.266282 kubelet[3419]: E0312 23:50:03.266275 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266282 kubelet[3419]: W0312 23:50:03.266280 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266315 kubelet[3419]: E0312 23:50:03.266286 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.266592 kubelet[3419]: E0312 23:50:03.266506 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266592 kubelet[3419]: W0312 23:50:03.266519 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266592 kubelet[3419]: E0312 23:50:03.266530 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.266900 kubelet[3419]: E0312 23:50:03.266832 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.266900 kubelet[3419]: W0312 23:50:03.266843 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.266900 kubelet[3419]: E0312 23:50:03.266852 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.267182 kubelet[3419]: E0312 23:50:03.267122 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.267182 kubelet[3419]: W0312 23:50:03.267136 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.267182 kubelet[3419]: E0312 23:50:03.267145 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.267452 kubelet[3419]: E0312 23:50:03.267377 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.267452 kubelet[3419]: W0312 23:50:03.267388 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.267452 kubelet[3419]: E0312 23:50:03.267398 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.267723 kubelet[3419]: E0312 23:50:03.267666 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.267723 kubelet[3419]: W0312 23:50:03.267677 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.267723 kubelet[3419]: E0312 23:50:03.267687 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.268025 kubelet[3419]: E0312 23:50:03.267961 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.268025 kubelet[3419]: W0312 23:50:03.267973 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.268025 kubelet[3419]: E0312 23:50:03.267983 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:50:03.268540 kubelet[3419]: E0312 23:50:03.268491 3419 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:50:03.268540 kubelet[3419]: W0312 23:50:03.268504 3419 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:50:03.268540 kubelet[3419]: E0312 23:50:03.268514 3419 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:50:03.414850 containerd[1902]: time="2026-03-12T23:50:03.414169622Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:03.418486 containerd[1902]: time="2026-03-12T23:50:03.418431759Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 12 23:50:03.421998 containerd[1902]: time="2026-03-12T23:50:03.421970372Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:03.426532 containerd[1902]: time="2026-03-12T23:50:03.426127536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:03.426532 containerd[1902]: time="2026-03-12T23:50:03.426414348Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.402736721s" Mar 12 23:50:03.426532 containerd[1902]: time="2026-03-12T23:50:03.426442981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 12 23:50:03.434206 containerd[1902]: time="2026-03-12T23:50:03.434159023Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 12 23:50:03.450835 containerd[1902]: time="2026-03-12T23:50:03.450258797Z" level=info msg="Container 7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:03.469355 containerd[1902]: time="2026-03-12T23:50:03.469306208Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4\"" Mar 12 23:50:03.470036 containerd[1902]: time="2026-03-12T23:50:03.469957162Z" level=info msg="StartContainer for \"7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4\"" Mar 12 23:50:03.471141 containerd[1902]: time="2026-03-12T23:50:03.471104168Z" level=info msg="connecting to shim 7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4" address="unix:///run/containerd/s/c8332044d7c9be62d09d1c906712fc4807f5f408d00f6f5c04ffadfa1f0bf9ab" protocol=ttrpc version=3 Mar 12 23:50:03.491965 systemd[1]: Started cri-containerd-7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4.scope - libcontainer container 7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4. Mar 12 23:50:03.557941 containerd[1902]: time="2026-03-12T23:50:03.557730602Z" level=info msg="StartContainer for \"7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4\" returns successfully" Mar 12 23:50:03.566346 systemd[1]: cri-containerd-7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4.scope: Deactivated successfully. 
Mar 12 23:50:03.569513 containerd[1902]: time="2026-03-12T23:50:03.569469762Z" level=info msg="received container exit event container_id:\"7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4\" id:\"7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4\" pid:4132 exited_at:{seconds:1773359403 nanos:569001887}" Mar 12 23:50:03.597304 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7384e6b5f6c040ab3ff20faab651b79599dc851c6a9856c2f1447ba8f0bf5df4-rootfs.mount: Deactivated successfully. Mar 12 23:50:04.219266 kubelet[3419]: I0312 23:50:04.219106 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-779cc4db9f-96wzn" podStartSLOduration=2.958775851 podStartE2EDuration="5.219088197s" podCreationTimestamp="2026-03-12 23:49:59 +0000 UTC" firstStartedPulling="2026-03-12 23:49:59.763123839 +0000 UTC m=+17.725366807" lastFinishedPulling="2026-03-12 23:50:02.023436193 +0000 UTC m=+19.985679153" observedRunningTime="2026-03-12 23:50:02.240273271 +0000 UTC m=+20.202516231" watchObservedRunningTime="2026-03-12 23:50:04.219088197 +0000 UTC m=+22.181331165" Mar 12 23:50:05.128130 kubelet[3419]: E0312 23:50:05.127927 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:05.208007 containerd[1902]: time="2026-03-12T23:50:05.207948610Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 12 23:50:07.127360 kubelet[3419]: E0312 23:50:07.127267 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:09.127982 kubelet[3419]: E0312 23:50:09.127314 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:10.865005 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount711457248.mount: Deactivated successfully. Mar 12 23:50:11.127486 kubelet[3419]: E0312 23:50:11.127349 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:11.504860 containerd[1902]: time="2026-03-12T23:50:11.504239446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:11.509091 containerd[1902]: time="2026-03-12T23:50:11.509059419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 12 23:50:11.512585 containerd[1902]: time="2026-03-12T23:50:11.512538017Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:11.516543 containerd[1902]: time="2026-03-12T23:50:11.516500527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:11.517166 containerd[1902]: 
time="2026-03-12T23:50:11.516873925Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.308882921s" Mar 12 23:50:11.517166 containerd[1902]: time="2026-03-12T23:50:11.516906118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 12 23:50:11.524895 containerd[1902]: time="2026-03-12T23:50:11.524866673Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 12 23:50:11.549374 containerd[1902]: time="2026-03-12T23:50:11.548524568Z" level=info msg="Container 14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:11.566525 containerd[1902]: time="2026-03-12T23:50:11.566482604Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024\"" Mar 12 23:50:11.567620 containerd[1902]: time="2026-03-12T23:50:11.567420184Z" level=info msg="StartContainer for \"14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024\"" Mar 12 23:50:11.570000 containerd[1902]: time="2026-03-12T23:50:11.569975370Z" level=info msg="connecting to shim 14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024" address="unix:///run/containerd/s/c8332044d7c9be62d09d1c906712fc4807f5f408d00f6f5c04ffadfa1f0bf9ab" protocol=ttrpc version=3 Mar 12 23:50:11.589971 systemd[1]: Started 
cri-containerd-14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024.scope - libcontainer container 14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024. Mar 12 23:50:11.647148 containerd[1902]: time="2026-03-12T23:50:11.646951007Z" level=info msg="StartContainer for \"14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024\" returns successfully" Mar 12 23:50:11.673588 systemd[1]: cri-containerd-14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024.scope: Deactivated successfully. Mar 12 23:50:11.674171 containerd[1902]: time="2026-03-12T23:50:11.674082547Z" level=info msg="received container exit event container_id:\"14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024\" id:\"14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024\" pid:4191 exited_at:{seconds:1773359411 nanos:673287253}" Mar 12 23:50:11.693005 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-14af05a965b297a7ec7f58a5f373158dce522b1eda2f20dd195967503c566024-rootfs.mount: Deactivated successfully. 
Mar 12 23:50:13.127678 kubelet[3419]: E0312 23:50:13.127625 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:13.230270 containerd[1902]: time="2026-03-12T23:50:13.230152180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 12 23:50:15.127389 kubelet[3419]: E0312 23:50:15.127204 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519" Mar 12 23:50:16.735312 kubelet[3419]: I0312 23:50:16.735266 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:50:16.911665 containerd[1902]: time="2026-03-12T23:50:16.911594322Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:16.915253 containerd[1902]: time="2026-03-12T23:50:16.915090521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 12 23:50:16.918526 containerd[1902]: time="2026-03-12T23:50:16.918491628Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:16.922907 containerd[1902]: time="2026-03-12T23:50:16.922753384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 
23:50:16.923237 containerd[1902]: time="2026-03-12T23:50:16.923212402Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.692999812s" Mar 12 23:50:16.923237 containerd[1902]: time="2026-03-12T23:50:16.923237459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 12 23:50:16.931754 containerd[1902]: time="2026-03-12T23:50:16.931715617Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 12 23:50:16.955853 containerd[1902]: time="2026-03-12T23:50:16.955125375Z" level=info msg="Container fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:16.978534 containerd[1902]: time="2026-03-12T23:50:16.978483387Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01\"" Mar 12 23:50:16.979844 containerd[1902]: time="2026-03-12T23:50:16.979452552Z" level=info msg="StartContainer for \"fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01\"" Mar 12 23:50:16.980729 containerd[1902]: time="2026-03-12T23:50:16.980696712Z" level=info msg="connecting to shim fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01" address="unix:///run/containerd/s/c8332044d7c9be62d09d1c906712fc4807f5f408d00f6f5c04ffadfa1f0bf9ab" protocol=ttrpc version=3 Mar 12 23:50:17.003989 
systemd[1]: Started cri-containerd-fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01.scope - libcontainer container fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01.
Mar 12 23:50:17.086358 containerd[1902]: time="2026-03-12T23:50:17.086295843Z" level=info msg="StartContainer for \"fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01\" returns successfully"
Mar 12 23:50:17.128426 kubelet[3419]: E0312 23:50:17.127325 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519"
Mar 12 23:50:18.423828 containerd[1902]: time="2026-03-12T23:50:18.423773129Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 12 23:50:18.426919 systemd[1]: cri-containerd-fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01.scope: Deactivated successfully.
Mar 12 23:50:18.427160 systemd[1]: cri-containerd-fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01.scope: Consumed 376ms CPU time, 191.2M memory peak, 171.3M written to disk.
Mar 12 23:50:18.429177 containerd[1902]: time="2026-03-12T23:50:18.429074605Z" level=info msg="received container exit event container_id:\"fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01\" id:\"fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01\" pid:4250 exited_at:{seconds:1773359418 nanos:428408339}"
Mar 12 23:50:18.445769 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fb97d3d0673baa91c7f8a8605dc56cf883ef981c3d3ed18e382a003cdd79af01-rootfs.mount: Deactivated successfully.
Mar 12 23:50:18.501179 kubelet[3419]: I0312 23:50:18.501145 3419 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 12 23:50:19.347110 systemd[1]: Created slice kubepods-burstable-pod436fdece_c6ec_472d_9cd2_fa384b83e17b.slice - libcontainer container kubepods-burstable-pod436fdece_c6ec_472d_9cd2_fa384b83e17b.slice.
Mar 12 23:50:19.357867 systemd[1]: Created slice kubepods-besteffort-pod7deee45c_5c09_47fa_842b_662682ec010b.slice - libcontainer container kubepods-besteffort-pod7deee45c_5c09_47fa_842b_662682ec010b.slice.
Mar 12 23:50:19.366945 systemd[1]: Created slice kubepods-besteffort-pod3120a7db_f697_4e98_8359_ab8a71eeeec4.slice - libcontainer container kubepods-besteffort-pod3120a7db_f697_4e98_8359_ab8a71eeeec4.slice.
Mar 12 23:50:19.373751 kubelet[3419]: I0312 23:50:19.373023 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/436fdece-c6ec-472d-9cd2-fa384b83e17b-config-volume\") pod \"coredns-66bc5c9577-w2ldq\" (UID: \"436fdece-c6ec-472d-9cd2-fa384b83e17b\") " pod="kube-system/coredns-66bc5c9577-w2ldq"
Mar 12 23:50:19.373751 kubelet[3419]: I0312 23:50:19.373062 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8btt2\" (UniqueName: \"kubernetes.io/projected/436fdece-c6ec-472d-9cd2-fa384b83e17b-kube-api-access-8btt2\") pod \"coredns-66bc5c9577-w2ldq\" (UID: \"436fdece-c6ec-472d-9cd2-fa384b83e17b\") " pod="kube-system/coredns-66bc5c9577-w2ldq"
Mar 12 23:50:19.377898 systemd[1]: Created slice kubepods-besteffort-podfc362139_094a_4907_98c0_8c9e87d14519.slice - libcontainer container kubepods-besteffort-podfc362139_094a_4907_98c0_8c9e87d14519.slice.
Mar 12 23:50:19.385527 systemd[1]: Created slice kubepods-burstable-pod8511de64_19de_4dbf_bfdc_44a7333cd73c.slice - libcontainer container kubepods-burstable-pod8511de64_19de_4dbf_bfdc_44a7333cd73c.slice.
Mar 12 23:50:19.394921 containerd[1902]: time="2026-03-12T23:50:19.394625833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g8rv,Uid:fc362139-094a-4907-98c0-8c9e87d14519,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.399347 systemd[1]: Created slice kubepods-besteffort-pod64aac116_a096_4d74_bfd5_22c32d47e0c7.slice - libcontainer container kubepods-besteffort-pod64aac116_a096_4d74_bfd5_22c32d47e0c7.slice.
Mar 12 23:50:19.412550 systemd[1]: Created slice kubepods-besteffort-pod12f9d181_efd0_4c73_821a_8f2f07ff2e48.slice - libcontainer container kubepods-besteffort-pod12f9d181_efd0_4c73_821a_8f2f07ff2e48.slice.
Mar 12 23:50:19.420121 systemd[1]: Created slice kubepods-besteffort-poda782722a_cf65_4007_829d_5a3c3581e01a.slice - libcontainer container kubepods-besteffort-poda782722a_cf65_4007_829d_5a3c3581e01a.slice.
Mar 12 23:50:19.471463 containerd[1902]: time="2026-03-12T23:50:19.471412630Z" level=error msg="Failed to destroy network for sandbox \"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.473447 kubelet[3419]: I0312 23:50:19.473392 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7deee45c-5c09-47fa-842b-662682ec010b-calico-apiserver-certs\") pod \"calico-apiserver-7d4f588999-2qj8l\" (UID: \"7deee45c-5c09-47fa-842b-662682ec010b\") " pod="calico-system/calico-apiserver-7d4f588999-2qj8l"
Mar 12 23:50:19.473652 kubelet[3419]: I0312 23:50:19.473638 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rckx\" (UniqueName: \"kubernetes.io/projected/7deee45c-5c09-47fa-842b-662682ec010b-kube-api-access-6rckx\") pod \"calico-apiserver-7d4f588999-2qj8l\" (UID: \"7deee45c-5c09-47fa-842b-662682ec010b\") " pod="calico-system/calico-apiserver-7d4f588999-2qj8l"
Mar 12 23:50:19.474091 systemd[1]: run-netns-cni\x2da104ef78\x2db7de\x2dee51\x2d3528\x2db15e36a238c7.mount: Deactivated successfully.
Mar 12 23:50:19.474980 kubelet[3419]: I0312 23:50:19.474341 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-ca-bundle\") pod \"whisker-767b8f59c5-nlj7p\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.474980 kubelet[3419]: I0312 23:50:19.474382 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfsph\" (UniqueName: \"kubernetes.io/projected/a782722a-cf65-4007-829d-5a3c3581e01a-kube-api-access-qfsph\") pod \"whisker-767b8f59c5-nlj7p\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.474980 kubelet[3419]: I0312 23:50:19.474404 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12f9d181-efd0-4c73-821a-8f2f07ff2e48-config\") pod \"goldmane-cccfbd5cf-2h6dr\" (UID: \"12f9d181-efd0-4c73-821a-8f2f07ff2e48\") " pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.474980 kubelet[3419]: I0312 23:50:19.474424 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8nz4\" (UniqueName: \"kubernetes.io/projected/12f9d181-efd0-4c73-821a-8f2f07ff2e48-kube-api-access-w8nz4\") pod \"goldmane-cccfbd5cf-2h6dr\" (UID: \"12f9d181-efd0-4c73-821a-8f2f07ff2e48\") " pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.474980 kubelet[3419]: I0312 23:50:19.474437 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8511de64-19de-4dbf-bfdc-44a7333cd73c-config-volume\") pod \"coredns-66bc5c9577-2s2cx\" (UID: \"8511de64-19de-4dbf-bfdc-44a7333cd73c\") " pod="kube-system/coredns-66bc5c9577-2s2cx"
Mar 12 23:50:19.475083 kubelet[3419]: I0312 23:50:19.474462 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-nginx-config\") pod \"whisker-767b8f59c5-nlj7p\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.475083 kubelet[3419]: I0312 23:50:19.474474 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-backend-key-pair\") pod \"whisker-767b8f59c5-nlj7p\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.475083 kubelet[3419]: I0312 23:50:19.474485 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3120a7db-f697-4e98-8359-ab8a71eeeec4-calico-apiserver-certs\") pod \"calico-apiserver-7d4f588999-9q596\" (UID: \"3120a7db-f697-4e98-8359-ab8a71eeeec4\") " pod="calico-system/calico-apiserver-7d4f588999-9q596"
Mar 12 23:50:19.475083 kubelet[3419]: I0312 23:50:19.474496 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12f9d181-efd0-4c73-821a-8f2f07ff2e48-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-2h6dr\" (UID: \"12f9d181-efd0-4c73-821a-8f2f07ff2e48\") " pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.475083 kubelet[3419]: I0312 23:50:19.474505 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wpt\" (UniqueName: \"kubernetes.io/projected/64aac116-a096-4d74-bfd5-22c32d47e0c7-kube-api-access-n6wpt\") pod \"calico-kube-controllers-59c8ddc645-5c5s6\" (UID: \"64aac116-a096-4d74-bfd5-22c32d47e0c7\") " pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6"
Mar 12 23:50:19.475177 kubelet[3419]: I0312 23:50:19.474518 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/12f9d181-efd0-4c73-821a-8f2f07ff2e48-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-2h6dr\" (UID: \"12f9d181-efd0-4c73-821a-8f2f07ff2e48\") " pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.475177 kubelet[3419]: I0312 23:50:19.474529 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64aac116-a096-4d74-bfd5-22c32d47e0c7-tigera-ca-bundle\") pod \"calico-kube-controllers-59c8ddc645-5c5s6\" (UID: \"64aac116-a096-4d74-bfd5-22c32d47e0c7\") " pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6"
Mar 12 23:50:19.475177 kubelet[3419]: I0312 23:50:19.474542 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvwg\" (UniqueName: \"kubernetes.io/projected/3120a7db-f697-4e98-8359-ab8a71eeeec4-kube-api-access-qkvwg\") pod \"calico-apiserver-7d4f588999-9q596\" (UID: \"3120a7db-f697-4e98-8359-ab8a71eeeec4\") " pod="calico-system/calico-apiserver-7d4f588999-9q596"
Mar 12 23:50:19.475177 kubelet[3419]: I0312 23:50:19.474551 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8p7t\" (UniqueName: \"kubernetes.io/projected/8511de64-19de-4dbf-bfdc-44a7333cd73c-kube-api-access-p8p7t\") pod \"coredns-66bc5c9577-2s2cx\" (UID: \"8511de64-19de-4dbf-bfdc-44a7333cd73c\") " pod="kube-system/coredns-66bc5c9577-2s2cx"
Mar 12 23:50:19.476483 containerd[1902]: time="2026-03-12T23:50:19.475882828Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g8rv,Uid:fc362139-094a-4907-98c0-8c9e87d14519,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.480324 kubelet[3419]: E0312 23:50:19.480273 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.480494 kubelet[3419]: E0312 23:50:19.480464 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7g8rv"
Mar 12 23:50:19.480587 kubelet[3419]: E0312 23:50:19.480572 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7g8rv"
Mar 12 23:50:19.480722 kubelet[3419]: E0312 23:50:19.480693 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7g8rv_calico-system(fc362139-094a-4907-98c0-8c9e87d14519)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7g8rv_calico-system(fc362139-094a-4907-98c0-8c9e87d14519)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa67e67465b8cd1a5cd8458220f64ca9b6d9e15aabb07ecd62233e5ce959dfec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7g8rv" podUID="fc362139-094a-4907-98c0-8c9e87d14519"
Mar 12 23:50:19.658707 containerd[1902]: time="2026-03-12T23:50:19.658582417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w2ldq,Uid:436fdece-c6ec-472d-9cd2-fa384b83e17b,Namespace:kube-system,Attempt:0,}"
Mar 12 23:50:19.670284 containerd[1902]: time="2026-03-12T23:50:19.670249135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-2qj8l,Uid:7deee45c-5c09-47fa-842b-662682ec010b,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.681071 containerd[1902]: time="2026-03-12T23:50:19.680886880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-9q596,Uid:3120a7db-f697-4e98-8359-ab8a71eeeec4,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.707187 containerd[1902]: time="2026-03-12T23:50:19.707111710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2s2cx,Uid:8511de64-19de-4dbf-bfdc-44a7333cd73c,Namespace:kube-system,Attempt:0,}"
Mar 12 23:50:19.710632 containerd[1902]: time="2026-03-12T23:50:19.710592148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8ddc645-5c5s6,Uid:64aac116-a096-4d74-bfd5-22c32d47e0c7,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.726780 containerd[1902]: time="2026-03-12T23:50:19.726732725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-2h6dr,Uid:12f9d181-efd0-4c73-821a-8f2f07ff2e48,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.731639 containerd[1902]: time="2026-03-12T23:50:19.731603685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-767b8f59c5-nlj7p,Uid:a782722a-cf65-4007-829d-5a3c3581e01a,Namespace:calico-system,Attempt:0,}"
Mar 12 23:50:19.766820 containerd[1902]: time="2026-03-12T23:50:19.766412082Z" level=error msg="Failed to destroy network for sandbox \"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.770034 containerd[1902]: time="2026-03-12T23:50:19.769993476Z" level=error msg="Failed to destroy network for sandbox \"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.772914 containerd[1902]: time="2026-03-12T23:50:19.770115368Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w2ldq,Uid:436fdece-c6ec-472d-9cd2-fa384b83e17b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.773291 kubelet[3419]: E0312 23:50:19.773253 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.773584 kubelet[3419]: E0312 23:50:19.773307 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-w2ldq"
Mar 12 23:50:19.773584 kubelet[3419]: E0312 23:50:19.773324 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-w2ldq"
Mar 12 23:50:19.773584 kubelet[3419]: E0312 23:50:19.773368 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-w2ldq_kube-system(436fdece-c6ec-472d-9cd2-fa384b83e17b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-w2ldq_kube-system(436fdece-c6ec-472d-9cd2-fa384b83e17b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"464c641389ef5f242bf4ab61e0cdb03f412b8946a27b202af98cfade4ef5cc8a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-w2ldq" podUID="436fdece-c6ec-472d-9cd2-fa384b83e17b"
Mar 12 23:50:19.776772 containerd[1902]: time="2026-03-12T23:50:19.776343562Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-2qj8l,Uid:7deee45c-5c09-47fa-842b-662682ec010b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.776859 kubelet[3419]: E0312 23:50:19.776684 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.776859 kubelet[3419]: E0312 23:50:19.776726 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7d4f588999-2qj8l"
Mar 12 23:50:19.776859 kubelet[3419]: E0312 23:50:19.776739 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7d4f588999-2qj8l"
Mar 12 23:50:19.776942 kubelet[3419]: E0312 23:50:19.776783 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d4f588999-2qj8l_calico-system(7deee45c-5c09-47fa-842b-662682ec010b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d4f588999-2qj8l_calico-system(7deee45c-5c09-47fa-842b-662682ec010b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"969b0b8274f84b621e5057f92b58f5388cff9c9f3f9ff838a6df4e56c190a139\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7d4f588999-2qj8l" podUID="7deee45c-5c09-47fa-842b-662682ec010b"
Mar 12 23:50:19.796304 containerd[1902]: time="2026-03-12T23:50:19.795620940Z" level=error msg="Failed to destroy network for sandbox \"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.801951 containerd[1902]: time="2026-03-12T23:50:19.801907320Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-9q596,Uid:3120a7db-f697-4e98-8359-ab8a71eeeec4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.802354 kubelet[3419]: E0312 23:50:19.802326 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.802753 kubelet[3419]: E0312 23:50:19.802644 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7d4f588999-9q596"
Mar 12 23:50:19.802753 kubelet[3419]: E0312 23:50:19.802668 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7d4f588999-9q596"
Mar 12 23:50:19.802753 kubelet[3419]: E0312 23:50:19.802717 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d4f588999-9q596_calico-system(3120a7db-f697-4e98-8359-ab8a71eeeec4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d4f588999-9q596_calico-system(3120a7db-f697-4e98-8359-ab8a71eeeec4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01e81a6712d64fc006b1fa617d0c651b6912aef67b72b5f6e1314636384139e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7d4f588999-9q596" podUID="3120a7db-f697-4e98-8359-ab8a71eeeec4"
Mar 12 23:50:19.820428 containerd[1902]: time="2026-03-12T23:50:19.820369068Z" level=error msg="Failed to destroy network for sandbox \"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.824648 containerd[1902]: time="2026-03-12T23:50:19.824531715Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2s2cx,Uid:8511de64-19de-4dbf-bfdc-44a7333cd73c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.825467 kubelet[3419]: E0312 23:50:19.824946 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.825467 kubelet[3419]: E0312 23:50:19.825002 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-2s2cx"
Mar 12 23:50:19.825467 kubelet[3419]: E0312 23:50:19.825017 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-2s2cx"
Mar 12 23:50:19.825593 kubelet[3419]: E0312 23:50:19.825062 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-2s2cx_kube-system(8511de64-19de-4dbf-bfdc-44a7333cd73c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-2s2cx_kube-system(8511de64-19de-4dbf-bfdc-44a7333cd73c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0999fd4fff5d861eaf20f8377c83599998ddef5f46b5eda80624b8d48debdb06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-2s2cx" podUID="8511de64-19de-4dbf-bfdc-44a7333cd73c"
Mar 12 23:50:19.839965 containerd[1902]: time="2026-03-12T23:50:19.839920193Z" level=error msg="Failed to destroy network for sandbox \"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.844452 containerd[1902]: time="2026-03-12T23:50:19.844406571Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8ddc645-5c5s6,Uid:64aac116-a096-4d74-bfd5-22c32d47e0c7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.844968 kubelet[3419]: E0312 23:50:19.844907 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.846091 kubelet[3419]: E0312 23:50:19.845164 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6"
Mar 12 23:50:19.846091 kubelet[3419]: E0312 23:50:19.845187 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6"
Mar 12 23:50:19.846091 kubelet[3419]: E0312 23:50:19.845239 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59c8ddc645-5c5s6_calico-system(64aac116-a096-4d74-bfd5-22c32d47e0c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59c8ddc645-5c5s6_calico-system(64aac116-a096-4d74-bfd5-22c32d47e0c7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69402321bcd883e0493f717392303a1d96df9c07b3084655605a062cd935bb50\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6" podUID="64aac116-a096-4d74-bfd5-22c32d47e0c7"
Mar 12 23:50:19.849632 containerd[1902]: time="2026-03-12T23:50:19.849598319Z" level=error msg="Failed to destroy network for sandbox \"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.850980 containerd[1902]: time="2026-03-12T23:50:19.850948648Z" level=error msg="Failed to destroy network for sandbox \"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.854080 containerd[1902]: time="2026-03-12T23:50:19.854046736Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-767b8f59c5-nlj7p,Uid:a782722a-cf65-4007-829d-5a3c3581e01a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.854506 kubelet[3419]: E0312 23:50:19.854462 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.854576 kubelet[3419]: E0312 23:50:19.854517 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.854576 kubelet[3419]: E0312 23:50:19.854532 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-767b8f59c5-nlj7p"
Mar 12 23:50:19.854619 kubelet[3419]: E0312 23:50:19.854581 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-767b8f59c5-nlj7p_calico-system(a782722a-cf65-4007-829d-5a3c3581e01a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-767b8f59c5-nlj7p_calico-system(a782722a-cf65-4007-829d-5a3c3581e01a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ba2fdf3c42292bef754884ef7f668363181c86963761aa857ed8a4f5173917f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-767b8f59c5-nlj7p" podUID="a782722a-cf65-4007-829d-5a3c3581e01a"
Mar 12 23:50:19.857218 containerd[1902]: time="2026-03-12T23:50:19.857181466Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-2h6dr,Uid:12f9d181-efd0-4c73-821a-8f2f07ff2e48,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.857496 kubelet[3419]: E0312 23:50:19.857469 3419 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:50:19.857687 kubelet[3419]: E0312 23:50:19.857589 3419 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.857687 kubelet[3419]: E0312 23:50:19.857607 3419 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-2h6dr"
Mar 12 23:50:19.857687 kubelet[3419]: E0312 23:50:19.857655 3419 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-2h6dr_calico-system(12f9d181-efd0-4c73-821a-8f2f07ff2e48)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-2h6dr_calico-system(12f9d181-efd0-4c73-821a-8f2f07ff2e48)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2a78d989d1be279df8129a263b8d9a94d6226c0fa01cf22ba55d767147eda90\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-2h6dr" podUID="12f9d181-efd0-4c73-821a-8f2f07ff2e48"
Mar 12 23:50:20.261585 containerd[1902]: time="2026-03-12T23:50:20.261536143Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 12 23:50:20.280593 containerd[1902]: time="2026-03-12T23:50:20.280553232Z" level=info msg="Container 5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:20.296716 containerd[1902]: time="2026-03-12T23:50:20.296666288Z" level=info msg="CreateContainer within sandbox \"323736d96a9004b0f36e8590cb860212e10678db2c36bbb6154c1fb1570e0e8b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0\""
Mar 12 23:50:20.297477 containerd[1902]: time="2026-03-12T23:50:20.297452116Z" level=info msg="StartContainer for \"5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0\""
Mar 12 23:50:20.298817 containerd[1902]: time="2026-03-12T23:50:20.298719394Z" level=info msg="connecting to shim 5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0" address="unix:///run/containerd/s/c8332044d7c9be62d09d1c906712fc4807f5f408d00f6f5c04ffadfa1f0bf9ab" protocol=ttrpc version=3
Mar 12 23:50:20.315972 systemd[1]: Started cri-containerd-5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0.scope - libcontainer container 5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0.
Mar 12 23:50:20.383103 containerd[1902]: time="2026-03-12T23:50:20.383052665Z" level=info msg="StartContainer for \"5dc7ef189dd1fa5a1184fe8cd5e1b6e160a73802d351f63dcdd223eea42009a0\" returns successfully"
Mar 12 23:50:20.581239 kubelet[3419]: I0312 23:50:20.581195 3419 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-nginx-config\") pod \"a782722a-cf65-4007-829d-5a3c3581e01a\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") "
Mar 12 23:50:20.582049 kubelet[3419]: I0312 23:50:20.581262 3419 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-backend-key-pair\") pod \"a782722a-cf65-4007-829d-5a3c3581e01a\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") "
Mar 12 23:50:20.582049 kubelet[3419]: I0312 23:50:20.581284 3419 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qfsph\" (UniqueName: \"kubernetes.io/projected/a782722a-cf65-4007-829d-5a3c3581e01a-kube-api-access-qfsph\") pod \"a782722a-cf65-4007-829d-5a3c3581e01a\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " Mar 12 23:50:20.582049 kubelet[3419]: I0312 23:50:20.581556 3419 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-ca-bundle\") pod \"a782722a-cf65-4007-829d-5a3c3581e01a\" (UID: \"a782722a-cf65-4007-829d-5a3c3581e01a\") " Mar 12 23:50:20.583018 kubelet[3419]: I0312 23:50:20.582985 3419 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "a782722a-cf65-4007-829d-5a3c3581e01a" (UID: "a782722a-cf65-4007-829d-5a3c3581e01a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:50:20.583824 kubelet[3419]: I0312 23:50:20.583626 3419 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "a782722a-cf65-4007-829d-5a3c3581e01a" (UID: "a782722a-cf65-4007-829d-5a3c3581e01a"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:50:20.586794 systemd[1]: var-lib-kubelet-pods-a782722a\x2dcf65\x2d4007\x2d829d\x2d5a3c3581e01a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 12 23:50:20.590589 systemd[1]: var-lib-kubelet-pods-a782722a\x2dcf65\x2d4007\x2d829d\x2d5a3c3581e01a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqfsph.mount: Deactivated successfully. 
Mar 12 23:50:20.591015 kubelet[3419]: I0312 23:50:20.590987 3419 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a782722a-cf65-4007-829d-5a3c3581e01a-kube-api-access-qfsph" (OuterVolumeSpecName: "kube-api-access-qfsph") pod "a782722a-cf65-4007-829d-5a3c3581e01a" (UID: "a782722a-cf65-4007-829d-5a3c3581e01a"). InnerVolumeSpecName "kube-api-access-qfsph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 23:50:20.591364 kubelet[3419]: I0312 23:50:20.591343 3419 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a782722a-cf65-4007-829d-5a3c3581e01a" (UID: "a782722a-cf65-4007-829d-5a3c3581e01a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 23:50:20.682973 kubelet[3419]: I0312 23:50:20.682790 3419 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-backend-key-pair\") on node \"ci-4459.2.4-n-6470b86a4c\" DevicePath \"\"" Mar 12 23:50:20.682973 kubelet[3419]: I0312 23:50:20.682963 3419 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfsph\" (UniqueName: \"kubernetes.io/projected/a782722a-cf65-4007-829d-5a3c3581e01a-kube-api-access-qfsph\") on node \"ci-4459.2.4-n-6470b86a4c\" DevicePath \"\"" Mar 12 23:50:20.682973 kubelet[3419]: I0312 23:50:20.682972 3419 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-whisker-ca-bundle\") on node \"ci-4459.2.4-n-6470b86a4c\" DevicePath \"\"" Mar 12 23:50:20.682973 kubelet[3419]: I0312 23:50:20.682980 3419 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: 
\"kubernetes.io/configmap/a782722a-cf65-4007-829d-5a3c3581e01a-nginx-config\") on node \"ci-4459.2.4-n-6470b86a4c\" DevicePath \"\"" Mar 12 23:50:21.259434 systemd[1]: Removed slice kubepods-besteffort-poda782722a_cf65_4007_829d_5a3c3581e01a.slice - libcontainer container kubepods-besteffort-poda782722a_cf65_4007_829d_5a3c3581e01a.slice. Mar 12 23:50:21.276022 kubelet[3419]: I0312 23:50:21.275736 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n7l7d" podStartSLOduration=5.17613817 podStartE2EDuration="22.275709044s" podCreationTimestamp="2026-03-12 23:49:59 +0000 UTC" firstStartedPulling="2026-03-12 23:49:59.824445335 +0000 UTC m=+17.786688295" lastFinishedPulling="2026-03-12 23:50:16.924016209 +0000 UTC m=+34.886259169" observedRunningTime="2026-03-12 23:50:21.27456733 +0000 UTC m=+39.236810330" watchObservedRunningTime="2026-03-12 23:50:21.275709044 +0000 UTC m=+39.237952004" Mar 12 23:50:21.352238 systemd[1]: Created slice kubepods-besteffort-podb25b03b8_7215_4365_af8b_8944ab8c2ae6.slice - libcontainer container kubepods-besteffort-podb25b03b8_7215_4365_af8b_8944ab8c2ae6.slice. 
Mar 12 23:50:21.387274 kubelet[3419]: I0312 23:50:21.387222 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twwtl\" (UniqueName: \"kubernetes.io/projected/b25b03b8-7215-4365-af8b-8944ab8c2ae6-kube-api-access-twwtl\") pod \"whisker-77899559bd-94xh8\" (UID: \"b25b03b8-7215-4365-af8b-8944ab8c2ae6\") " pod="calico-system/whisker-77899559bd-94xh8" Mar 12 23:50:21.387274 kubelet[3419]: I0312 23:50:21.387267 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b25b03b8-7215-4365-af8b-8944ab8c2ae6-nginx-config\") pod \"whisker-77899559bd-94xh8\" (UID: \"b25b03b8-7215-4365-af8b-8944ab8c2ae6\") " pod="calico-system/whisker-77899559bd-94xh8" Mar 12 23:50:21.387274 kubelet[3419]: I0312 23:50:21.387277 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b25b03b8-7215-4365-af8b-8944ab8c2ae6-whisker-ca-bundle\") pod \"whisker-77899559bd-94xh8\" (UID: \"b25b03b8-7215-4365-af8b-8944ab8c2ae6\") " pod="calico-system/whisker-77899559bd-94xh8" Mar 12 23:50:21.387522 kubelet[3419]: I0312 23:50:21.387290 3419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b25b03b8-7215-4365-af8b-8944ab8c2ae6-whisker-backend-key-pair\") pod \"whisker-77899559bd-94xh8\" (UID: \"b25b03b8-7215-4365-af8b-8944ab8c2ae6\") " pod="calico-system/whisker-77899559bd-94xh8" Mar 12 23:50:21.662559 containerd[1902]: time="2026-03-12T23:50:21.662239556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77899559bd-94xh8,Uid:b25b03b8-7215-4365-af8b-8944ab8c2ae6,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:21.842201 systemd-networkd[1479]: cali7db7597d748: Link UP Mar 12 23:50:21.842323 systemd-networkd[1479]: 
cali7db7597d748: Gained carrier Mar 12 23:50:21.859887 containerd[1902]: 2026-03-12 23:50:21.685 [ERROR][4560] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:50:21.859887 containerd[1902]: 2026-03-12 23:50:21.706 [INFO][4560] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0 whisker-77899559bd- calico-system b25b03b8-7215-4365-af8b-8944ab8c2ae6 925 0 2026-03-12 23:50:21 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:77899559bd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c whisker-77899559bd-94xh8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7db7597d748 [] [] }} ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-" Mar 12 23:50:21.859887 containerd[1902]: 2026-03-12 23:50:21.706 [INFO][4560] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.859887 containerd[1902]: 2026-03-12 23:50:21.745 [INFO][4581] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" HandleID="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Workload="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.860078 
containerd[1902]: 2026-03-12 23:50:21.758 [INFO][4581] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" HandleID="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Workload="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"whisker-77899559bd-94xh8", "timestamp":"2026-03-12 23:50:21.745260363 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030b080)} Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.758 [INFO][4581] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.758 [INFO][4581] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.758 [INFO][4581] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.760 [INFO][4581] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.772 [INFO][4581] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.778 [INFO][4581] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.781 [INFO][4581] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860078 containerd[1902]: 2026-03-12 23:50:21.786 [INFO][4581] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.786 [INFO][4581] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.787 [INFO][4581] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0 Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.795 [INFO][4581] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.807 [INFO][4581] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.193/26] block=192.168.98.192/26 handle="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.807 [INFO][4581] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.193/26] handle="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.807 [INFO][4581] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:21.860215 containerd[1902]: 2026-03-12 23:50:21.808 [INFO][4581] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.193/26] IPv6=[] ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" HandleID="k8s-pod-network.0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Workload="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.860309 containerd[1902]: 2026-03-12 23:50:21.811 [INFO][4560] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0", GenerateName:"whisker-77899559bd-", Namespace:"calico-system", SelfLink:"", UID:"b25b03b8-7215-4365-af8b-8944ab8c2ae6", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 50, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77899559bd", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"whisker-77899559bd-94xh8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7db7597d748", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:21.860309 containerd[1902]: 2026-03-12 23:50:21.811 [INFO][4560] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.193/32] ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.860357 containerd[1902]: 2026-03-12 23:50:21.811 [INFO][4560] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7db7597d748 ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.860357 containerd[1902]: 2026-03-12 23:50:21.841 [INFO][4560] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.860387 containerd[1902]: 2026-03-12 23:50:21.842 [INFO][4560] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0", GenerateName:"whisker-77899559bd-", Namespace:"calico-system", SelfLink:"", UID:"b25b03b8-7215-4365-af8b-8944ab8c2ae6", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 50, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77899559bd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0", Pod:"whisker-77899559bd-94xh8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7db7597d748", MAC:"0a:da:37:90:af:64", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:21.860417 containerd[1902]: 2026-03-12 23:50:21.855 [INFO][4560] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" Namespace="calico-system" Pod="whisker-77899559bd-94xh8" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-whisker--77899559bd--94xh8-eth0" Mar 12 23:50:21.911827 containerd[1902]: time="2026-03-12T23:50:21.911775778Z" level=info msg="connecting to shim 0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0" address="unix:///run/containerd/s/9bedf715d275d04db538017808c8db77826a820fc238b49eb1db245e8682aa8d" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:21.949035 systemd[1]: Started cri-containerd-0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0.scope - libcontainer container 0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0. Mar 12 23:50:22.004264 containerd[1902]: time="2026-03-12T23:50:22.004221726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77899559bd-94xh8,Uid:b25b03b8-7215-4365-af8b-8944ab8c2ae6,Namespace:calico-system,Attempt:0,} returns sandbox id \"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0\"" Mar 12 23:50:22.009308 containerd[1902]: time="2026-03-12T23:50:22.009271741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 12 23:50:22.131167 kubelet[3419]: I0312 23:50:22.131103 3419 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a782722a-cf65-4007-829d-5a3c3581e01a" path="/var/lib/kubelet/pods/a782722a-cf65-4007-829d-5a3c3581e01a/volumes" Mar 12 23:50:22.518195 systemd-networkd[1479]: vxlan.calico: Link UP Mar 12 23:50:22.518202 systemd-networkd[1479]: vxlan.calico: Gained carrier Mar 12 23:50:23.422499 containerd[1902]: time="2026-03-12T23:50:23.422390466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:23.425291 containerd[1902]: time="2026-03-12T23:50:23.425255418Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 12 23:50:23.428886 containerd[1902]: time="2026-03-12T23:50:23.428848796Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:23.434682 containerd[1902]: time="2026-03-12T23:50:23.434317122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:23.435291 containerd[1902]: time="2026-03-12T23:50:23.435270276Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.425819545s" Mar 12 23:50:23.435434 containerd[1902]: time="2026-03-12T23:50:23.435395209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 12 23:50:23.444214 containerd[1902]: time="2026-03-12T23:50:23.444184839Z" level=info msg="CreateContainer within sandbox \"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 12 23:50:23.464867 containerd[1902]: time="2026-03-12T23:50:23.464826099Z" level=info msg="Container 3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:23.483651 containerd[1902]: time="2026-03-12T23:50:23.483580466Z" level=info msg="CreateContainer within sandbox \"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0\" for 
&ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e\"" Mar 12 23:50:23.485168 containerd[1902]: time="2026-03-12T23:50:23.485128546Z" level=info msg="StartContainer for \"3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e\"" Mar 12 23:50:23.487185 containerd[1902]: time="2026-03-12T23:50:23.487114890Z" level=info msg="connecting to shim 3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e" address="unix:///run/containerd/s/9bedf715d275d04db538017808c8db77826a820fc238b49eb1db245e8682aa8d" protocol=ttrpc version=3 Mar 12 23:50:23.512974 systemd[1]: Started cri-containerd-3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e.scope - libcontainer container 3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e. Mar 12 23:50:23.547439 containerd[1902]: time="2026-03-12T23:50:23.547396185Z" level=info msg="StartContainer for \"3cecbc3879eb1c7e7ae06b2fa11f261a7e6346498ca4c8d9c782013879c6b51e\" returns successfully" Mar 12 23:50:23.549286 containerd[1902]: time="2026-03-12T23:50:23.549002571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 12 23:50:23.588936 systemd-networkd[1479]: vxlan.calico: Gained IPv6LL Mar 12 23:50:23.654468 systemd-networkd[1479]: cali7db7597d748: Gained IPv6LL Mar 12 23:50:24.191234 kubelet[3419]: I0312 23:50:24.191186 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:50:25.478951 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4288243373.mount: Deactivated successfully. 
Mar 12 23:50:25.540501 containerd[1902]: time="2026-03-12T23:50:25.540446019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:25.544514 containerd[1902]: time="2026-03-12T23:50:25.544469965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 12 23:50:25.547842 containerd[1902]: time="2026-03-12T23:50:25.547802995Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:25.552687 containerd[1902]: time="2026-03-12T23:50:25.552622541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:25.552990 containerd[1902]: time="2026-03-12T23:50:25.552875456Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.003839627s" Mar 12 23:50:25.552990 containerd[1902]: time="2026-03-12T23:50:25.552902297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 12 23:50:25.565109 containerd[1902]: time="2026-03-12T23:50:25.565077459Z" level=info msg="CreateContainer within sandbox \"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 12 23:50:25.587646 
containerd[1902]: time="2026-03-12T23:50:25.587601702Z" level=info msg="Container 01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:25.593292 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3216894391.mount: Deactivated successfully. Mar 12 23:50:25.610314 containerd[1902]: time="2026-03-12T23:50:25.610263798Z" level=info msg="CreateContainer within sandbox \"0cb4a13929d5f0eb8ed3ef9f2708f1ab56fad4b596ababaa2220ab2c38fc8eb0\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725\"" Mar 12 23:50:25.612362 containerd[1902]: time="2026-03-12T23:50:25.611384147Z" level=info msg="StartContainer for \"01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725\"" Mar 12 23:50:25.613314 containerd[1902]: time="2026-03-12T23:50:25.613291456Z" level=info msg="connecting to shim 01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725" address="unix:///run/containerd/s/9bedf715d275d04db538017808c8db77826a820fc238b49eb1db245e8682aa8d" protocol=ttrpc version=3 Mar 12 23:50:25.642277 systemd[1]: Started cri-containerd-01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725.scope - libcontainer container 01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725. 
Mar 12 23:50:25.706885 containerd[1902]: time="2026-03-12T23:50:25.706822094Z" level=info msg="StartContainer for \"01eca326771dc710331f2c4a90b553e4999224cb77702ebaeb4a0bbd7d9b1725\" returns successfully" Mar 12 23:50:30.138086 containerd[1902]: time="2026-03-12T23:50:30.138044075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2s2cx,Uid:8511de64-19de-4dbf-bfdc-44a7333cd73c,Namespace:kube-system,Attempt:0,}" Mar 12 23:50:30.235145 systemd-networkd[1479]: calia97626e0e4d: Link UP Mar 12 23:50:30.236035 systemd-networkd[1479]: calia97626e0e4d: Gained carrier Mar 12 23:50:30.250930 containerd[1902]: 2026-03-12 23:50:30.174 [INFO][4991] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0 coredns-66bc5c9577- kube-system 8511de64-19de-4dbf-bfdc-44a7333cd73c 869 0 2026-03-12 23:49:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c coredns-66bc5c9577-2s2cx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia97626e0e4d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-" Mar 12 23:50:30.250930 containerd[1902]: 2026-03-12 23:50:30.174 [INFO][4991] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.250930 containerd[1902]: 
2026-03-12 23:50:30.194 [INFO][5004] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" HandleID="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.200 [INFO][5004] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" HandleID="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"coredns-66bc5c9577-2s2cx", "timestamp":"2026-03-12 23:50:30.194390889 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003a8f20)} Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.200 [INFO][5004] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.200 [INFO][5004] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.200 [INFO][5004] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.202 [INFO][5004] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.206 [INFO][5004] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.211 [INFO][5004] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.212 [INFO][5004] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251143 containerd[1902]: 2026-03-12 23:50:30.214 [INFO][5004] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.214 [INFO][5004] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.215 [INFO][5004] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.224 [INFO][5004] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.229 [INFO][5004] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.194/26] block=192.168.98.192/26 handle="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.229 [INFO][5004] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.194/26] handle="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.229 [INFO][5004] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:30.251296 containerd[1902]: 2026-03-12 23:50:30.229 [INFO][5004] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.194/26] IPv6=[] ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" HandleID="k8s-pod-network.be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.251392 containerd[1902]: 2026-03-12 23:50:30.231 [INFO][4991] cni-plugin/k8s.go 418: Populated endpoint ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8511de64-19de-4dbf-bfdc-44a7333cd73c", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"coredns-66bc5c9577-2s2cx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia97626e0e4d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:30.251392 containerd[1902]: 2026-03-12 23:50:30.232 [INFO][4991] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.194/32] ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.251392 containerd[1902]: 2026-03-12 23:50:30.232 [INFO][4991] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia97626e0e4d 
ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.251392 containerd[1902]: 2026-03-12 23:50:30.236 [INFO][4991] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.251392 containerd[1902]: 2026-03-12 23:50:30.237 [INFO][4991] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8511de64-19de-4dbf-bfdc-44a7333cd73c", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c", 
Pod:"coredns-66bc5c9577-2s2cx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia97626e0e4d", MAC:"9e:f8:c5:0a:9b:87", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:30.251511 containerd[1902]: 2026-03-12 23:50:30.248 [INFO][4991] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" Namespace="kube-system" Pod="coredns-66bc5c9577-2s2cx" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--2s2cx-eth0" Mar 12 23:50:30.256289 kubelet[3419]: I0312 23:50:30.256216 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-77899559bd-94xh8" podStartSLOduration=5.709048904 podStartE2EDuration="9.256198272s" podCreationTimestamp="2026-03-12 23:50:21 +0000 UTC" firstStartedPulling="2026-03-12 23:50:22.009021508 +0000 UTC m=+39.971264476" lastFinishedPulling="2026-03-12 23:50:25.556170884 +0000 UTC m=+43.518413844" observedRunningTime="2026-03-12 23:50:26.28912618 
+0000 UTC m=+44.251369140" watchObservedRunningTime="2026-03-12 23:50:30.256198272 +0000 UTC m=+48.218441232" Mar 12 23:50:30.312763 containerd[1902]: time="2026-03-12T23:50:30.312714453Z" level=info msg="connecting to shim be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c" address="unix:///run/containerd/s/844b3573efc09c37452584e1dd0dce332cb6ff16cacdf6163bd6d6d44856ae21" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:30.343287 systemd[1]: Started cri-containerd-be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c.scope - libcontainer container be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c. Mar 12 23:50:30.382493 containerd[1902]: time="2026-03-12T23:50:30.382450958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2s2cx,Uid:8511de64-19de-4dbf-bfdc-44a7333cd73c,Namespace:kube-system,Attempt:0,} returns sandbox id \"be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c\"" Mar 12 23:50:30.393185 containerd[1902]: time="2026-03-12T23:50:30.392635427Z" level=info msg="CreateContainer within sandbox \"be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:50:30.416032 containerd[1902]: time="2026-03-12T23:50:30.415938464Z" level=info msg="Container eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:30.419523 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3330200236.mount: Deactivated successfully. 
Mar 12 23:50:30.430234 containerd[1902]: time="2026-03-12T23:50:30.430189766Z" level=info msg="CreateContainer within sandbox \"be111a96d3386265abd52ebe109fe3f118f2cbf89be309f77ed479bf30910d1c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3\"" Mar 12 23:50:30.431066 containerd[1902]: time="2026-03-12T23:50:30.431041384Z" level=info msg="StartContainer for \"eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3\"" Mar 12 23:50:30.432004 containerd[1902]: time="2026-03-12T23:50:30.431941204Z" level=info msg="connecting to shim eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3" address="unix:///run/containerd/s/844b3573efc09c37452584e1dd0dce332cb6ff16cacdf6163bd6d6d44856ae21" protocol=ttrpc version=3 Mar 12 23:50:30.448973 systemd[1]: Started cri-containerd-eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3.scope - libcontainer container eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3. 
Mar 12 23:50:30.479403 containerd[1902]: time="2026-03-12T23:50:30.479360023Z" level=info msg="StartContainer for \"eaa40ddf3b3aa736d0e2b53cd74b71f5252ca0aa62f34c4dce1a369a63d0e3a3\" returns successfully" Mar 12 23:50:31.313768 kubelet[3419]: I0312 23:50:31.313685 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-2s2cx" podStartSLOduration=43.313402646 podStartE2EDuration="43.313402646s" podCreationTimestamp="2026-03-12 23:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:50:31.298952888 +0000 UTC m=+49.261195848" watchObservedRunningTime="2026-03-12 23:50:31.313402646 +0000 UTC m=+49.275645606" Mar 12 23:50:31.781310 systemd-networkd[1479]: calia97626e0e4d: Gained IPv6LL Mar 12 23:50:32.137208 containerd[1902]: time="2026-03-12T23:50:32.136904082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-9q596,Uid:3120a7db-f697-4e98-8359-ab8a71eeeec4,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:32.142396 containerd[1902]: time="2026-03-12T23:50:32.142327921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g8rv,Uid:fc362139-094a-4907-98c0-8c9e87d14519,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:32.149198 containerd[1902]: time="2026-03-12T23:50:32.149155040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-2qj8l,Uid:7deee45c-5c09-47fa-842b-662682ec010b,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:32.319450 systemd-networkd[1479]: cali60f2b80b0cf: Link UP Mar 12 23:50:32.321179 systemd-networkd[1479]: cali60f2b80b0cf: Gained carrier Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.203 [INFO][5119] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0 
calico-apiserver-7d4f588999- calico-system 3120a7db-f697-4e98-8359-ab8a71eeeec4 868 0 2026-03-12 23:49:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d4f588999 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c calico-apiserver-7d4f588999-9q596 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali60f2b80b0cf [] [] }} ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.203 [INFO][5119] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.250 [INFO][5153] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" HandleID="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.265 [INFO][5153] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" HandleID="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x40002fbe80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"calico-apiserver-7d4f588999-9q596", "timestamp":"2026-03-12 23:50:32.250072352 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030fa20)} Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.265 [INFO][5153] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.265 [INFO][5153] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.265 [INFO][5153] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.268 [INFO][5153] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.274 [INFO][5153] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.281 [INFO][5153] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.284 [INFO][5153] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.288 [INFO][5153] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.288 [INFO][5153] ipam/ipam.go 
1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.290 [INFO][5153] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256 Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.299 [INFO][5153] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5153] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.98.195/26] block=192.168.98.192/26 handle="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5153] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.195/26] handle="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5153] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 23:50:32.340867 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5153] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.195/26] IPv6=[] ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" HandleID="k8s-pod-network.60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.316 [INFO][5119] cni-plugin/k8s.go 418: Populated endpoint ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0", GenerateName:"calico-apiserver-7d4f588999-", Namespace:"calico-system", SelfLink:"", UID:"3120a7db-f697-4e98-8359-ab8a71eeeec4", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d4f588999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"calico-apiserver-7d4f588999-9q596", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali60f2b80b0cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.316 [INFO][5119] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.195/32] ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.316 [INFO][5119] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60f2b80b0cf ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.320 [INFO][5119] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.320 [INFO][5119] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0", GenerateName:"calico-apiserver-7d4f588999-", Namespace:"calico-system", SelfLink:"", UID:"3120a7db-f697-4e98-8359-ab8a71eeeec4", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d4f588999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256", Pod:"calico-apiserver-7d4f588999-9q596", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali60f2b80b0cf", MAC:"7a:70:8d:a6:ab:35", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.341301 containerd[1902]: 2026-03-12 23:50:32.338 [INFO][5119] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-9q596" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--9q596-eth0" Mar 12 23:50:32.390040 containerd[1902]: time="2026-03-12T23:50:32.389911842Z" level=info msg="connecting 
to shim 60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256" address="unix:///run/containerd/s/f08a9668905e5b29c2a1150bcbcea6e49ad84ff2146f2df04705d3e4e6baa0a6" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:32.423348 systemd-networkd[1479]: calicbb3c4e468d: Link UP Mar 12 23:50:32.424434 systemd-networkd[1479]: calicbb3c4e468d: Gained carrier Mar 12 23:50:32.431297 systemd[1]: Started cri-containerd-60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256.scope - libcontainer container 60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256. Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.224 [INFO][5140] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0 csi-node-driver- calico-system fc362139-094a-4907-98c0-8c9e87d14519 721 0 2026-03-12 23:49:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c csi-node-driver-7g8rv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicbb3c4e468d [] [] }} ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.224 [INFO][5140] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 
23:50:32.257 [INFO][5163] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" HandleID="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Workload="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.267 [INFO][5163] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" HandleID="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Workload="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f9250), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"csi-node-driver-7g8rv", "timestamp":"2026-03-12 23:50:32.257222012 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003771e0)} Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.268 [INFO][5163] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5163] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.313 [INFO][5163] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.368 [INFO][5163] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.375 [INFO][5163] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.384 [INFO][5163] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.387 [INFO][5163] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.390 [INFO][5163] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.391 [INFO][5163] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.393 [INFO][5163] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955 Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.397 [INFO][5163] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.406 [INFO][5163] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.196/26] block=192.168.98.192/26 handle="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.406 [INFO][5163] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.196/26] handle="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.407 [INFO][5163] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:32.449178 containerd[1902]: 2026-03-12 23:50:32.407 [INFO][5163] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.196/26] IPv6=[] ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" HandleID="k8s-pod-network.2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Workload="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.450528 containerd[1902]: 2026-03-12 23:50:32.410 [INFO][5140] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fc362139-094a-4907-98c0-8c9e87d14519", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"csi-node-driver-7g8rv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicbb3c4e468d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.450528 containerd[1902]: 2026-03-12 23:50:32.410 [INFO][5140] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.196/32] ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.450528 containerd[1902]: 2026-03-12 23:50:32.410 [INFO][5140] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicbb3c4e468d ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.450528 containerd[1902]: 2026-03-12 23:50:32.425 [INFO][5140] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.450528 
containerd[1902]: 2026-03-12 23:50:32.427 [INFO][5140] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fc362139-094a-4907-98c0-8c9e87d14519", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955", Pod:"csi-node-driver-7g8rv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicbb3c4e468d", MAC:"8a:21:d9:03:2d:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.450528 containerd[1902]: 
2026-03-12 23:50:32.443 [INFO][5140] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" Namespace="calico-system" Pod="csi-node-driver-7g8rv" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-csi--node--driver--7g8rv-eth0" Mar 12 23:50:32.488071 containerd[1902]: time="2026-03-12T23:50:32.488025131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-9q596,Uid:3120a7db-f697-4e98-8359-ab8a71eeeec4,Namespace:calico-system,Attempt:0,} returns sandbox id \"60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256\"" Mar 12 23:50:32.493171 containerd[1902]: time="2026-03-12T23:50:32.493110893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 23:50:32.511401 containerd[1902]: time="2026-03-12T23:50:32.511241389Z" level=info msg="connecting to shim 2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955" address="unix:///run/containerd/s/195d980eab186186371f4a830ba5df9801a7edbd5b1be565d0dade6625c51085" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:32.546132 systemd-networkd[1479]: cali3c8cf1de97a: Link UP Mar 12 23:50:32.548356 systemd-networkd[1479]: cali3c8cf1de97a: Gained carrier Mar 12 23:50:32.552988 systemd[1]: Started cri-containerd-2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955.scope - libcontainer container 2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955. 
Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.230 [INFO][5130] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0 calico-apiserver-7d4f588999- calico-system 7deee45c-5c09-47fa-842b-662682ec010b 867 0 2026-03-12 23:49:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d4f588999 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c calico-apiserver-7d4f588999-2qj8l eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali3c8cf1de97a [] [] }} ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.230 [INFO][5130] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.272 [INFO][5168] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" HandleID="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.287 [INFO][5168] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" HandleID="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002e5a10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"calico-apiserver-7d4f588999-2qj8l", "timestamp":"2026-03-12 23:50:32.272432584 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004331e0)} Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.288 [INFO][5168] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.407 [INFO][5168] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.407 [INFO][5168] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.470 [INFO][5168] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.479 [INFO][5168] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.487 [INFO][5168] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.491 [INFO][5168] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.497 [INFO][5168] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.498 [INFO][5168] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.502 [INFO][5168] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.512 [INFO][5168] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.528 [INFO][5168] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.197/26] block=192.168.98.192/26 handle="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.529 [INFO][5168] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.197/26] handle="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.529 [INFO][5168] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:32.567013 containerd[1902]: 2026-03-12 23:50:32.529 [INFO][5168] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.197/26] IPv6=[] ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" HandleID="k8s-pod-network.b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.537 [INFO][5130] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0", GenerateName:"calico-apiserver-7d4f588999-", Namespace:"calico-system", SelfLink:"", UID:"7deee45c-5c09-47fa-842b-662682ec010b", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7d4f588999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"calico-apiserver-7d4f588999-2qj8l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3c8cf1de97a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.537 [INFO][5130] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.197/32] ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.537 [INFO][5130] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3c8cf1de97a ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.546 [INFO][5130] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" 
WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.549 [INFO][5130] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0", GenerateName:"calico-apiserver-7d4f588999-", Namespace:"calico-system", SelfLink:"", UID:"7deee45c-5c09-47fa-842b-662682ec010b", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d4f588999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b", Pod:"calico-apiserver-7d4f588999-2qj8l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3c8cf1de97a", MAC:"ba:da:ea:82:59:e6", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:32.567453 containerd[1902]: 2026-03-12 23:50:32.563 [INFO][5130] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" Namespace="calico-system" Pod="calico-apiserver-7d4f588999-2qj8l" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--apiserver--7d4f588999--2qj8l-eth0" Mar 12 23:50:32.600428 containerd[1902]: time="2026-03-12T23:50:32.600375177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g8rv,Uid:fc362139-094a-4907-98c0-8c9e87d14519,Namespace:calico-system,Attempt:0,} returns sandbox id \"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955\"" Mar 12 23:50:32.616998 containerd[1902]: time="2026-03-12T23:50:32.616948676Z" level=info msg="connecting to shim b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b" address="unix:///run/containerd/s/ad6c965df07b78b3436da51d41b9b82db8ca127b635f43c8273ce41993aeb986" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:32.635972 systemd[1]: Started cri-containerd-b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b.scope - libcontainer container b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b. 
Mar 12 23:50:32.680622 containerd[1902]: time="2026-03-12T23:50:32.680581347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d4f588999-2qj8l,Uid:7deee45c-5c09-47fa-842b-662682ec010b,Namespace:calico-system,Attempt:0,} returns sandbox id \"b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b\"" Mar 12 23:50:33.133216 containerd[1902]: time="2026-03-12T23:50:33.133161794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8ddc645-5c5s6,Uid:64aac116-a096-4d74-bfd5-22c32d47e0c7,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:33.237842 systemd-networkd[1479]: calib12cdbac9be: Link UP Mar 12 23:50:33.238994 systemd-networkd[1479]: calib12cdbac9be: Gained carrier Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.170 [INFO][5393] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0 calico-kube-controllers-59c8ddc645- calico-system 64aac116-a096-4d74-bfd5-22c32d47e0c7 872 0 2026-03-12 23:49:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59c8ddc645 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c calico-kube-controllers-59c8ddc645-5c5s6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib12cdbac9be [] [] }} ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.170 [INFO][5393] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.190 [INFO][5404] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" HandleID="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.196 [INFO][5404] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" HandleID="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"calico-kube-controllers-59c8ddc645-5c5s6", "timestamp":"2026-03-12 23:50:33.190564242 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030d080)} Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.196 [INFO][5404] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.196 [INFO][5404] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.196 [INFO][5404] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.199 [INFO][5404] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.204 [INFO][5404] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.209 [INFO][5404] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.211 [INFO][5404] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.213 [INFO][5404] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.213 [INFO][5404] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.215 [INFO][5404] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.221 [INFO][5404] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.230 [INFO][5404] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.198/26] block=192.168.98.192/26 handle="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.230 [INFO][5404] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.198/26] handle="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.230 [INFO][5404] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:33.256954 containerd[1902]: 2026-03-12 23:50:33.230 [INFO][5404] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.198/26] IPv6=[] ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" HandleID="k8s-pod-network.ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Workload="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.233 [INFO][5393] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0", GenerateName:"calico-kube-controllers-59c8ddc645-", Namespace:"calico-system", SelfLink:"", UID:"64aac116-a096-4d74-bfd5-22c32d47e0c7", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59c8ddc645", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"calico-kube-controllers-59c8ddc645-5c5s6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib12cdbac9be", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.233 [INFO][5393] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.198/32] ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.233 [INFO][5393] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib12cdbac9be ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.239 [INFO][5393] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.239 [INFO][5393] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0", GenerateName:"calico-kube-controllers-59c8ddc645-", Namespace:"calico-system", SelfLink:"", UID:"64aac116-a096-4d74-bfd5-22c32d47e0c7", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59c8ddc645", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf", Pod:"calico-kube-controllers-59c8ddc645-5c5s6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib12cdbac9be", MAC:"22:01:f9:e9:9b:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:33.258543 containerd[1902]: 2026-03-12 23:50:33.253 [INFO][5393] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" Namespace="calico-system" Pod="calico-kube-controllers-59c8ddc645-5c5s6" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-calico--kube--controllers--59c8ddc645--5c5s6-eth0" Mar 12 23:50:33.306338 containerd[1902]: time="2026-03-12T23:50:33.306294894Z" level=info msg="connecting to shim ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf" address="unix:///run/containerd/s/6d0ca4b859f6fc5b337845cfa153850ae9520fda74be68476c893bda92db1fa9" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:33.325961 systemd[1]: Started cri-containerd-ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf.scope - libcontainer container ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf. 
Mar 12 23:50:33.367467 containerd[1902]: time="2026-03-12T23:50:33.367409490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8ddc645-5c5s6,Uid:64aac116-a096-4d74-bfd5-22c32d47e0c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf\"" Mar 12 23:50:33.829003 systemd-networkd[1479]: calicbb3c4e468d: Gained IPv6LL Mar 12 23:50:34.085936 systemd-networkd[1479]: cali3c8cf1de97a: Gained IPv6LL Mar 12 23:50:34.135262 containerd[1902]: time="2026-03-12T23:50:34.135223641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-2h6dr,Uid:12f9d181-efd0-4c73-821a-8f2f07ff2e48,Namespace:calico-system,Attempt:0,}" Mar 12 23:50:34.148937 systemd-networkd[1479]: cali60f2b80b0cf: Gained IPv6LL Mar 12 23:50:34.243220 systemd-networkd[1479]: cali5671e4ac2a4: Link UP Mar 12 23:50:34.244375 systemd-networkd[1479]: cali5671e4ac2a4: Gained carrier Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.176 [INFO][5486] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0 goldmane-cccfbd5cf- calico-system 12f9d181-efd0-4c73-821a-8f2f07ff2e48 870 0 2026-03-12 23:49:58 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c goldmane-cccfbd5cf-2h6dr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5671e4ac2a4 [] [] }} ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.176 [INFO][5486] cni-plugin/k8s.go 
74: Extracted identifiers for CmdAddK8s ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.196 [INFO][5502] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" HandleID="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Workload="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.203 [INFO][5502] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" HandleID="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Workload="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273270), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"goldmane-cccfbd5cf-2h6dr", "timestamp":"2026-03-12 23:50:34.196673178 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e3080)} Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.203 [INFO][5502] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.203 [INFO][5502] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.203 [INFO][5502] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.205 [INFO][5502] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.210 [INFO][5502] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.215 [INFO][5502] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.217 [INFO][5502] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.220 [INFO][5502] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.220 [INFO][5502] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.221 [INFO][5502] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.228 [INFO][5502] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.236 [INFO][5502] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.98.199/26] block=192.168.98.192/26 handle="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.236 [INFO][5502] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.199/26] handle="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.237 [INFO][5502] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:50:34.264225 containerd[1902]: 2026-03-12 23:50:34.237 [INFO][5502] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.199/26] IPv6=[] ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" HandleID="k8s-pod-network.3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Workload="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.239 [INFO][5486] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"12f9d181-efd0-4c73-821a-8f2f07ff2e48", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"goldmane-cccfbd5cf-2h6dr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5671e4ac2a4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.239 [INFO][5486] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.199/32] ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.239 [INFO][5486] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5671e4ac2a4 ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.244 [INFO][5486] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.244 [INFO][5486] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"12f9d181-efd0-4c73-821a-8f2f07ff2e48", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e", Pod:"goldmane-cccfbd5cf-2h6dr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5671e4ac2a4", MAC:"4a:fe:9c:cf:f2:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:34.266598 containerd[1902]: 2026-03-12 23:50:34.259 [INFO][5486] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" Namespace="calico-system" Pod="goldmane-cccfbd5cf-2h6dr" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-goldmane--cccfbd5cf--2h6dr-eth0" Mar 12 23:50:34.309174 containerd[1902]: time="2026-03-12T23:50:34.309128196Z" level=info msg="connecting to shim 3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e" address="unix:///run/containerd/s/299bc4bc8e14e7abeacf9bec875123dd52c9eff8218cb70efde5771fbba1553e" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:34.337958 systemd[1]: Started cri-containerd-3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e.scope - libcontainer container 3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e. Mar 12 23:50:34.383688 containerd[1902]: time="2026-03-12T23:50:34.383616339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-2h6dr,Uid:12f9d181-efd0-4c73-821a-8f2f07ff2e48,Namespace:calico-system,Attempt:0,} returns sandbox id \"3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e\"" Mar 12 23:50:34.469007 systemd-networkd[1479]: calib12cdbac9be: Gained IPv6LL Mar 12 23:50:35.193276 containerd[1902]: time="2026-03-12T23:50:35.193226279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w2ldq,Uid:436fdece-c6ec-472d-9cd2-fa384b83e17b,Namespace:kube-system,Attempt:0,}" Mar 12 23:50:35.342435 systemd-networkd[1479]: calidc1328660f8: Link UP Mar 12 23:50:35.346015 systemd-networkd[1479]: calidc1328660f8: Gained carrier Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.243 [INFO][5587] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0 coredns-66bc5c9577- kube-system 436fdece-c6ec-472d-9cd2-fa384b83e17b 866 0 2026-03-12 23:49:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.4-n-6470b86a4c coredns-66bc5c9577-w2ldq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calidc1328660f8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.244 [INFO][5587] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.273 [INFO][5601] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" HandleID="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.283 [INFO][5601] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" HandleID="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed510), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.4-n-6470b86a4c", "pod":"coredns-66bc5c9577-w2ldq", "timestamp":"2026-03-12 23:50:35.273546541 +0000 UTC"}, Hostname:"ci-4459.2.4-n-6470b86a4c", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000364f20)} Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.283 [INFO][5601] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.283 [INFO][5601] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.284 [INFO][5601] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.4-n-6470b86a4c' Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.287 [INFO][5601] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.293 [INFO][5601] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.301 [INFO][5601] ipam/ipam.go 526: Trying affinity for 192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.304 [INFO][5601] ipam/ipam.go 160: Attempting to load block cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.310 [INFO][5601] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.310 [INFO][5601] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.313 [INFO][5601] ipam/ipam.go 
1806: Creating new handle: k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83 Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.320 [INFO][5601] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.332 [INFO][5601] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.98.200/26] block=192.168.98.192/26 handle="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.332 [INFO][5601] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.98.200/26] handle="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" host="ci-4459.2.4-n-6470b86a4c" Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.332 [INFO][5601] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 23:50:35.368783 containerd[1902]: 2026-03-12 23:50:35.332 [INFO][5601] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.98.200/26] IPv6=[] ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" HandleID="k8s-pod-network.a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Workload="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.369602 containerd[1902]: 2026-03-12 23:50:35.336 [INFO][5587] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"436fdece-c6ec-472d-9cd2-fa384b83e17b", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"", Pod:"coredns-66bc5c9577-w2ldq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calidc1328660f8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:35.369602 containerd[1902]: 2026-03-12 23:50:35.337 [INFO][5587] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.200/32] ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.369602 containerd[1902]: 2026-03-12 23:50:35.337 [INFO][5587] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc1328660f8 ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.369602 containerd[1902]: 2026-03-12 23:50:35.347 [INFO][5587] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.369602 
containerd[1902]: 2026-03-12 23:50:35.349 [INFO][5587] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"436fdece-c6ec-472d-9cd2-fa384b83e17b", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.4-n-6470b86a4c", ContainerID:"a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83", Pod:"coredns-66bc5c9577-w2ldq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidc1328660f8", MAC:"76:3d:9a:e1:2f:ef", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, 
HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:50:35.369719 containerd[1902]: 2026-03-12 23:50:35.365 [INFO][5587] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" Namespace="kube-system" Pod="coredns-66bc5c9577-w2ldq" WorkloadEndpoint="ci--4459.2.4--n--6470b86a4c-k8s-coredns--66bc5c9577--w2ldq-eth0" Mar 12 23:50:35.419628 containerd[1902]: time="2026-03-12T23:50:35.419574373Z" level=info msg="connecting to shim a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83" address="unix:///run/containerd/s/a2632d7824b81aa5543b17aa166927af6640ca16af680a0dc347c6f123d680ac" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:50:35.451187 systemd[1]: Started cri-containerd-a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83.scope - libcontainer container a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83. 
Mar 12 23:50:35.503080 containerd[1902]: time="2026-03-12T23:50:35.503002614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w2ldq,Uid:436fdece-c6ec-472d-9cd2-fa384b83e17b,Namespace:kube-system,Attempt:0,} returns sandbox id \"a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83\"" Mar 12 23:50:35.519713 containerd[1902]: time="2026-03-12T23:50:35.519451891Z" level=info msg="CreateContainer within sandbox \"a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:50:35.557112 containerd[1902]: time="2026-03-12T23:50:35.557065049Z" level=info msg="Container e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:35.557482 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1537967215.mount: Deactivated successfully. Mar 12 23:50:35.591094 containerd[1902]: time="2026-03-12T23:50:35.591053062Z" level=info msg="CreateContainer within sandbox \"a96dd3886d25598e0e3ba7d0670752f5a4c20a529bdb4e1ecc612db0ccdb5c83\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458\"" Mar 12 23:50:35.591975 containerd[1902]: time="2026-03-12T23:50:35.591945253Z" level=info msg="StartContainer for \"e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458\"" Mar 12 23:50:35.592418 containerd[1902]: time="2026-03-12T23:50:35.592373372Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:35.594518 containerd[1902]: time="2026-03-12T23:50:35.594490206Z" level=info msg="connecting to shim e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458" address="unix:///run/containerd/s/a2632d7824b81aa5543b17aa166927af6640ca16af680a0dc347c6f123d680ac" protocol=ttrpc version=3 Mar 12 23:50:35.595552 
containerd[1902]: time="2026-03-12T23:50:35.595515722Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 12 23:50:35.600160 containerd[1902]: time="2026-03-12T23:50:35.600124827Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:35.606332 containerd[1902]: time="2026-03-12T23:50:35.606204567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:50:35.606732 containerd[1902]: time="2026-03-12T23:50:35.606689752Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.113546802s" Mar 12 23:50:35.606732 containerd[1902]: time="2026-03-12T23:50:35.606720665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 12 23:50:35.608882 containerd[1902]: time="2026-03-12T23:50:35.608253791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 23:50:35.614962 systemd[1]: Started cri-containerd-e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458.scope - libcontainer container e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458. 
Mar 12 23:50:35.616502 containerd[1902]: time="2026-03-12T23:50:35.616473766Z" level=info msg="CreateContainer within sandbox \"60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 23:50:35.635465 containerd[1902]: time="2026-03-12T23:50:35.635421884Z" level=info msg="Container a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:50:35.648354 containerd[1902]: time="2026-03-12T23:50:35.648316295Z" level=info msg="StartContainer for \"e864c2e3377fc7faab1c95721e1fb0e23d9abed9bb2e5168294fd7586e430458\" returns successfully" Mar 12 23:50:35.654966 containerd[1902]: time="2026-03-12T23:50:35.654918710Z" level=info msg="CreateContainer within sandbox \"60cdb45758960a452a73ec1b81f2c2f92ec1bf84709e22610c1a0beb1e35b256\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573\"" Mar 12 23:50:35.655606 containerd[1902]: time="2026-03-12T23:50:35.655557628Z" level=info msg="StartContainer for \"a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573\"" Mar 12 23:50:35.657319 containerd[1902]: time="2026-03-12T23:50:35.657289745Z" level=info msg="connecting to shim a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573" address="unix:///run/containerd/s/f08a9668905e5b29c2a1150bcbcea6e49ad84ff2146f2df04705d3e4e6baa0a6" protocol=ttrpc version=3 Mar 12 23:50:35.679029 systemd[1]: Started cri-containerd-a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573.scope - libcontainer container a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573. 
Mar 12 23:50:35.721428 containerd[1902]: time="2026-03-12T23:50:35.721318679Z" level=info msg="StartContainer for \"a424dea884a54f8f84811cbdfb4347c260202a57a57dde2d0c8def16fdbb2573\" returns successfully"
Mar 12 23:50:35.877052 systemd-networkd[1479]: cali5671e4ac2a4: Gained IPv6LL
Mar 12 23:50:36.323603 kubelet[3419]: I0312 23:50:36.323450 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7d4f588999-9q596" podStartSLOduration=35.208143062 podStartE2EDuration="38.323433822s" podCreationTimestamp="2026-03-12 23:49:58 +0000 UTC" firstStartedPulling="2026-03-12 23:50:32.492780776 +0000 UTC m=+50.455023736" lastFinishedPulling="2026-03-12 23:50:35.608071528 +0000 UTC m=+53.570314496" observedRunningTime="2026-03-12 23:50:36.323118515 +0000 UTC m=+54.285361531" watchObservedRunningTime="2026-03-12 23:50:36.323433822 +0000 UTC m=+54.285676782"
Mar 12 23:50:36.346816 kubelet[3419]: I0312 23:50:36.346166 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-w2ldq" podStartSLOduration=48.346148072 podStartE2EDuration="48.346148072s" podCreationTimestamp="2026-03-12 23:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:50:36.345446727 +0000 UTC m=+54.307689695" watchObservedRunningTime="2026-03-12 23:50:36.346148072 +0000 UTC m=+54.308391032"
Mar 12 23:50:37.029130 systemd-networkd[1479]: calidc1328660f8: Gained IPv6LL
Mar 12 23:50:37.147206 containerd[1902]: time="2026-03-12T23:50:37.147153815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:37.150730 containerd[1902]: time="2026-03-12T23:50:37.150689939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497"
Mar 12 23:50:37.154323 containerd[1902]: time="2026-03-12T23:50:37.154285209Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:37.159025 containerd[1902]: time="2026-03-12T23:50:37.158991333Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:37.159875 containerd[1902]: time="2026-03-12T23:50:37.159848507Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.551568996s"
Mar 12 23:50:37.159906 containerd[1902]: time="2026-03-12T23:50:37.159879772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\""
Mar 12 23:50:37.161083 containerd[1902]: time="2026-03-12T23:50:37.160784508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 12 23:50:37.171196 containerd[1902]: time="2026-03-12T23:50:37.171161223Z" level=info msg="CreateContainer within sandbox \"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 12 23:50:37.198508 containerd[1902]: time="2026-03-12T23:50:37.198473265Z" level=info msg="Container 2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:37.219606 containerd[1902]: time="2026-03-12T23:50:37.219472327Z" level=info msg="CreateContainer within sandbox \"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da\""
Mar 12 23:50:37.220306 containerd[1902]: time="2026-03-12T23:50:37.220281756Z" level=info msg="StartContainer for \"2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da\""
Mar 12 23:50:37.225124 containerd[1902]: time="2026-03-12T23:50:37.225015993Z" level=info msg="connecting to shim 2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da" address="unix:///run/containerd/s/195d980eab186186371f4a830ba5df9801a7edbd5b1be565d0dade6625c51085" protocol=ttrpc version=3
Mar 12 23:50:37.252025 systemd[1]: Started cri-containerd-2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da.scope - libcontainer container 2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da.
Mar 12 23:50:37.319027 kubelet[3419]: I0312 23:50:37.318986 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:50:37.320775 containerd[1902]: time="2026-03-12T23:50:37.320738571Z" level=info msg="StartContainer for \"2361409771784fb600269d2e96d387fa441dd0bc94d2b50e7ca578228ffb37da\" returns successfully"
Mar 12 23:50:37.526885 containerd[1902]: time="2026-03-12T23:50:37.526780229Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:37.530184 containerd[1902]: time="2026-03-12T23:50:37.530147347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 12 23:50:37.531241 containerd[1902]: time="2026-03-12T23:50:37.531209336Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 370.389371ms"
Mar 12 23:50:37.531291 containerd[1902]: time="2026-03-12T23:50:37.531253106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 12 23:50:37.532776 containerd[1902]: time="2026-03-12T23:50:37.532447579Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 12 23:50:37.540357 containerd[1902]: time="2026-03-12T23:50:37.540325151Z" level=info msg="CreateContainer within sandbox \"b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 12 23:50:37.565540 containerd[1902]: time="2026-03-12T23:50:37.564940579Z" level=info msg="Container 827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:37.586899 containerd[1902]: time="2026-03-12T23:50:37.586757654Z" level=info msg="CreateContainer within sandbox \"b9c2f001c36c86b1135ee0312cc019ccf1e7c1c67f2b8967c291ed58d449656b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff\""
Mar 12 23:50:37.588220 containerd[1902]: time="2026-03-12T23:50:37.588191328Z" level=info msg="StartContainer for \"827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff\""
Mar 12 23:50:37.590092 containerd[1902]: time="2026-03-12T23:50:37.590062529Z" level=info msg="connecting to shim 827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff" address="unix:///run/containerd/s/ad6c965df07b78b3436da51d41b9b82db8ca127b635f43c8273ce41993aeb986" protocol=ttrpc version=3
Mar 12 23:50:37.622000 systemd[1]: Started cri-containerd-827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff.scope - libcontainer container 827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff.
Mar 12 23:50:37.687382 containerd[1902]: time="2026-03-12T23:50:37.687338098Z" level=info msg="StartContainer for \"827d41bae13cec04753a1fb7e6d57c4257ae2c2fc940f9bfbf739950231829ff\" returns successfully"
Mar 12 23:50:38.341820 kubelet[3419]: I0312 23:50:38.341564 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7d4f588999-2qj8l" podStartSLOduration=35.492342956 podStartE2EDuration="40.341546749s" podCreationTimestamp="2026-03-12 23:49:58 +0000 UTC" firstStartedPulling="2026-03-12 23:50:32.682771842 +0000 UTC m=+50.645014802" lastFinishedPulling="2026-03-12 23:50:37.531975635 +0000 UTC m=+55.494218595" observedRunningTime="2026-03-12 23:50:38.341414369 +0000 UTC m=+56.303657377" watchObservedRunningTime="2026-03-12 23:50:38.341546749 +0000 UTC m=+56.303789709"
Mar 12 23:50:39.327569 kubelet[3419]: I0312 23:50:39.327528 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:50:40.137561 containerd[1902]: time="2026-03-12T23:50:40.137522008Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:40.140523 containerd[1902]: time="2026-03-12T23:50:40.140494176Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955"
Mar 12 23:50:40.143902 containerd[1902]: time="2026-03-12T23:50:40.143854013Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:40.147977 containerd[1902]: time="2026-03-12T23:50:40.147930508Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:40.148351 containerd[1902]: time="2026-03-12T23:50:40.148324322Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 2.615849589s"
Mar 12 23:50:40.148399 containerd[1902]: time="2026-03-12T23:50:40.148354291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\""
Mar 12 23:50:40.151382 containerd[1902]: time="2026-03-12T23:50:40.151353348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\""
Mar 12 23:50:40.176948 containerd[1902]: time="2026-03-12T23:50:40.176905569Z" level=info msg="CreateContainer within sandbox \"ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 12 23:50:40.197879 containerd[1902]: time="2026-03-12T23:50:40.197067322Z" level=info msg="Container 38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:40.215504 containerd[1902]: time="2026-03-12T23:50:40.215459036Z" level=info msg="CreateContainer within sandbox \"ba28bc3bc7f00f477a10e48234c0cf9258bd30eef727e1c38b4fb88e1d874bdf\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf\""
Mar 12 23:50:40.216860 containerd[1902]: time="2026-03-12T23:50:40.216051105Z" level=info msg="StartContainer for \"38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf\""
Mar 12 23:50:40.217014 containerd[1902]: time="2026-03-12T23:50:40.216994778Z" level=info msg="connecting to shim 38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf" address="unix:///run/containerd/s/6d0ca4b859f6fc5b337845cfa153850ae9520fda74be68476c893bda92db1fa9" protocol=ttrpc version=3
Mar 12 23:50:40.237438 systemd[1]: Started cri-containerd-38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf.scope - libcontainer container 38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf.
Mar 12 23:50:40.693831 containerd[1902]: time="2026-03-12T23:50:40.693765059Z" level=info msg="StartContainer for \"38aed7144e2d5a8cccb3b8dd0a6efb86de4493e2f49da48c67846616ef15edbf\" returns successfully"
Mar 12 23:50:41.754741 kubelet[3419]: I0312 23:50:41.753610 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59c8ddc645-5c5s6" podStartSLOduration=35.972262384 podStartE2EDuration="42.75359147s" podCreationTimestamp="2026-03-12 23:49:59 +0000 UTC" firstStartedPulling="2026-03-12 23:50:33.369161663 +0000 UTC m=+51.331404623" lastFinishedPulling="2026-03-12 23:50:40.150490741 +0000 UTC m=+58.112733709" observedRunningTime="2026-03-12 23:50:41.735711094 +0000 UTC m=+59.697954086" watchObservedRunningTime="2026-03-12 23:50:41.75359147 +0000 UTC m=+59.715834430"
Mar 12 23:50:42.436378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1926765764.mount: Deactivated successfully.
Mar 12 23:50:44.039931 containerd[1902]: time="2026-03-12T23:50:44.039877301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:44.043096 containerd[1902]: time="2026-03-12T23:50:44.043059884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980"
Mar 12 23:50:44.046143 containerd[1902]: time="2026-03-12T23:50:44.046109430Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:44.050957 containerd[1902]: time="2026-03-12T23:50:44.050904006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:44.051559 containerd[1902]: time="2026-03-12T23:50:44.051173512Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 3.899787892s"
Mar 12 23:50:44.051559 containerd[1902]: time="2026-03-12T23:50:44.051202898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\""
Mar 12 23:50:44.052656 containerd[1902]: time="2026-03-12T23:50:44.052481725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 12 23:50:44.058687 containerd[1902]: time="2026-03-12T23:50:44.058655748Z" level=info msg="CreateContainer within sandbox \"3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 12 23:50:44.081934 containerd[1902]: time="2026-03-12T23:50:44.080382040Z" level=info msg="Container 7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:44.113682 containerd[1902]: time="2026-03-12T23:50:44.113629554Z" level=info msg="CreateContainer within sandbox \"3644593926024a9e897d9514d351e14e9d5fe0957cce4ac403c321b648fc683e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249\""
Mar 12 23:50:44.116337 containerd[1902]: time="2026-03-12T23:50:44.115511557Z" level=info msg="StartContainer for \"7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249\""
Mar 12 23:50:44.116511 containerd[1902]: time="2026-03-12T23:50:44.116484716Z" level=info msg="connecting to shim 7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249" address="unix:///run/containerd/s/299bc4bc8e14e7abeacf9bec875123dd52c9eff8218cb70efde5771fbba1553e" protocol=ttrpc version=3
Mar 12 23:50:44.153957 systemd[1]: Started cri-containerd-7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249.scope - libcontainer container 7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249.
Mar 12 23:50:44.200241 containerd[1902]: time="2026-03-12T23:50:44.200158758Z" level=info msg="StartContainer for \"7c37fed3cd263589e3b8e1bb6efbb8cd337d924c7d60b279a99d6eda9f56c249\" returns successfully"
Mar 12 23:50:45.787414 kubelet[3419]: I0312 23:50:45.787339 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-2h6dr" podStartSLOduration=38.120638792 podStartE2EDuration="47.787320875s" podCreationTimestamp="2026-03-12 23:49:58 +0000 UTC" firstStartedPulling="2026-03-12 23:50:34.385238027 +0000 UTC m=+52.347480987" lastFinishedPulling="2026-03-12 23:50:44.051920102 +0000 UTC m=+62.014163070" observedRunningTime="2026-03-12 23:50:44.744476077 +0000 UTC m=+62.706719037" watchObservedRunningTime="2026-03-12 23:50:45.787320875 +0000 UTC m=+63.749563843"
Mar 12 23:50:46.281041 containerd[1902]: time="2026-03-12T23:50:46.280980311Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:46.284162 containerd[1902]: time="2026-03-12T23:50:46.284016329Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291"
Mar 12 23:50:46.287485 containerd[1902]: time="2026-03-12T23:50:46.287455914Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:46.291704 containerd[1902]: time="2026-03-12T23:50:46.291650786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:50:46.292101 containerd[1902]: time="2026-03-12T23:50:46.292019025Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 2.239504971s"
Mar 12 23:50:46.292101 containerd[1902]: time="2026-03-12T23:50:46.292051394Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\""
Mar 12 23:50:46.300263 containerd[1902]: time="2026-03-12T23:50:46.300226657Z" level=info msg="CreateContainer within sandbox \"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 12 23:50:46.322136 containerd[1902]: time="2026-03-12T23:50:46.322089699Z" level=info msg="Container bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:50:46.344161 containerd[1902]: time="2026-03-12T23:50:46.344111668Z" level=info msg="CreateContainer within sandbox \"2eef8fff215cceb732777ec2f32503cad59f164413b18189bf80ab3cc12e6955\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8\""
Mar 12 23:50:46.345213 containerd[1902]: time="2026-03-12T23:50:46.345183031Z" level=info msg="StartContainer for \"bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8\""
Mar 12 23:50:46.347114 containerd[1902]: time="2026-03-12T23:50:46.347071242Z" level=info msg="connecting to shim bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8" address="unix:///run/containerd/s/195d980eab186186371f4a830ba5df9801a7edbd5b1be565d0dade6625c51085" protocol=ttrpc version=3
Mar 12 23:50:46.373001 systemd[1]: Started cri-containerd-bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8.scope - libcontainer container bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8.
Mar 12 23:50:46.435468 containerd[1902]: time="2026-03-12T23:50:46.435396326Z" level=info msg="StartContainer for \"bde66b769ff2c62848802312a8017a8f245d569219cd224d003aa9406e9ad8d8\" returns successfully"
Mar 12 23:50:46.731003 kubelet[3419]: I0312 23:50:46.730944 3419 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7g8rv" podStartSLOduration=34.040501304 podStartE2EDuration="47.730835132s" podCreationTimestamp="2026-03-12 23:49:59 +0000 UTC" firstStartedPulling="2026-03-12 23:50:32.602539127 +0000 UTC m=+50.564782087" lastFinishedPulling="2026-03-12 23:50:46.292872955 +0000 UTC m=+64.255115915" observedRunningTime="2026-03-12 23:50:46.729545665 +0000 UTC m=+64.691788625" watchObservedRunningTime="2026-03-12 23:50:46.730835132 +0000 UTC m=+64.693078092"
Mar 12 23:50:47.209035 kubelet[3419]: I0312 23:50:47.208994 3419 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 12 23:50:47.212367 kubelet[3419]: I0312 23:50:47.212339 3419 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 12 23:50:53.975031 kubelet[3419]: I0312 23:50:53.974872 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:50:54.032679 systemd[1]: Started sshd@7-10.200.20.40:22-10.200.16.10:52774.service - OpenSSH per-connection server daemon (10.200.16.10:52774).
Mar 12 23:50:54.459330 sshd[6082]: Accepted publickey for core from 10.200.16.10 port 52774 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:50:54.461263 sshd-session[6082]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:50:54.465721 systemd-logind[1877]: New session 10 of user core.
Mar 12 23:50:54.471958 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 12 23:50:54.774003 sshd[6109]: Connection closed by 10.200.16.10 port 52774
Mar 12 23:50:54.777003 sshd-session[6082]: pam_unix(sshd:session): session closed for user core
Mar 12 23:50:54.780512 systemd[1]: sshd@7-10.200.20.40:22-10.200.16.10:52774.service: Deactivated successfully.
Mar 12 23:50:54.783400 systemd[1]: session-10.scope: Deactivated successfully.
Mar 12 23:50:54.786663 systemd-logind[1877]: Session 10 logged out. Waiting for processes to exit.
Mar 12 23:50:54.789310 systemd-logind[1877]: Removed session 10.
Mar 12 23:50:59.880980 systemd[1]: Started sshd@8-10.200.20.40:22-10.200.16.10:52782.service - OpenSSH per-connection server daemon (10.200.16.10:52782).
Mar 12 23:51:00.299503 sshd[6129]: Accepted publickey for core from 10.200.16.10 port 52782 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:00.330321 sshd-session[6129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:00.334859 systemd-logind[1877]: New session 11 of user core.
Mar 12 23:51:00.339939 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 12 23:51:00.606886 sshd[6132]: Connection closed by 10.200.16.10 port 52782
Mar 12 23:51:00.607500 sshd-session[6129]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:00.610919 systemd[1]: sshd@8-10.200.20.40:22-10.200.16.10:52782.service: Deactivated successfully.
Mar 12 23:51:00.613113 systemd[1]: session-11.scope: Deactivated successfully.
Mar 12 23:51:00.613831 systemd-logind[1877]: Session 11 logged out. Waiting for processes to exit.
Mar 12 23:51:00.614875 systemd-logind[1877]: Removed session 11.
Mar 12 23:51:05.696068 systemd[1]: Started sshd@9-10.200.20.40:22-10.200.16.10:33394.service - OpenSSH per-connection server daemon (10.200.16.10:33394).
Mar 12 23:51:06.121398 sshd[6166]: Accepted publickey for core from 10.200.16.10 port 33394 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:06.122535 sshd-session[6166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:06.126704 systemd-logind[1877]: New session 12 of user core.
Mar 12 23:51:06.134982 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 12 23:51:06.409121 sshd[6169]: Connection closed by 10.200.16.10 port 33394
Mar 12 23:51:06.408945 sshd-session[6166]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:06.413538 systemd[1]: sshd@9-10.200.20.40:22-10.200.16.10:33394.service: Deactivated successfully.
Mar 12 23:51:06.416393 systemd[1]: session-12.scope: Deactivated successfully.
Mar 12 23:51:06.418887 systemd-logind[1877]: Session 12 logged out. Waiting for processes to exit.
Mar 12 23:51:06.420528 systemd-logind[1877]: Removed session 12.
Mar 12 23:51:10.250129 kubelet[3419]: I0312 23:51:10.250068 3419 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:51:11.501202 systemd[1]: Started sshd@10-10.200.20.40:22-10.200.16.10:39510.service - OpenSSH per-connection server daemon (10.200.16.10:39510).
Mar 12 23:51:11.927593 sshd[6183]: Accepted publickey for core from 10.200.16.10 port 39510 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:11.928730 sshd-session[6183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:11.932937 systemd-logind[1877]: New session 13 of user core.
Mar 12 23:51:11.947005 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 12 23:51:12.216441 sshd[6216]: Connection closed by 10.200.16.10 port 39510
Mar 12 23:51:12.215884 sshd-session[6183]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:12.219795 systemd[1]: sshd@10-10.200.20.40:22-10.200.16.10:39510.service: Deactivated successfully.
Mar 12 23:51:12.221403 systemd[1]: session-13.scope: Deactivated successfully.
Mar 12 23:51:12.222603 systemd-logind[1877]: Session 13 logged out. Waiting for processes to exit.
Mar 12 23:51:12.223930 systemd-logind[1877]: Removed session 13.
Mar 12 23:51:12.310052 systemd[1]: Started sshd@11-10.200.20.40:22-10.200.16.10:39524.service - OpenSSH per-connection server daemon (10.200.16.10:39524).
Mar 12 23:51:12.730038 sshd[6232]: Accepted publickey for core from 10.200.16.10 port 39524 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:12.731219 sshd-session[6232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:12.735444 systemd-logind[1877]: New session 14 of user core.
Mar 12 23:51:12.740963 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 12 23:51:13.057930 sshd[6235]: Connection closed by 10.200.16.10 port 39524
Mar 12 23:51:13.059002 sshd-session[6232]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:13.062901 systemd-logind[1877]: Session 14 logged out. Waiting for processes to exit.
Mar 12 23:51:13.063332 systemd[1]: sshd@11-10.200.20.40:22-10.200.16.10:39524.service: Deactivated successfully.
Mar 12 23:51:13.065383 systemd[1]: session-14.scope: Deactivated successfully.
Mar 12 23:51:13.067428 systemd-logind[1877]: Removed session 14.
Mar 12 23:51:13.138613 systemd[1]: Started sshd@12-10.200.20.40:22-10.200.16.10:39526.service - OpenSSH per-connection server daemon (10.200.16.10:39526).
Mar 12 23:51:13.536359 sshd[6253]: Accepted publickey for core from 10.200.16.10 port 39526 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:13.537544 sshd-session[6253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:13.543450 systemd-logind[1877]: New session 15 of user core.
Mar 12 23:51:13.549938 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 12 23:51:13.808120 sshd[6256]: Connection closed by 10.200.16.10 port 39526
Mar 12 23:51:13.808657 sshd-session[6253]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:13.812647 systemd-logind[1877]: Session 15 logged out. Waiting for processes to exit.
Mar 12 23:51:13.813087 systemd[1]: sshd@12-10.200.20.40:22-10.200.16.10:39526.service: Deactivated successfully.
Mar 12 23:51:13.815783 systemd[1]: session-15.scope: Deactivated successfully.
Mar 12 23:51:13.817758 systemd-logind[1877]: Removed session 15.
Mar 12 23:51:18.904097 systemd[1]: Started sshd@13-10.200.20.40:22-10.200.16.10:39532.service - OpenSSH per-connection server daemon (10.200.16.10:39532).
Mar 12 23:51:19.322005 sshd[6297]: Accepted publickey for core from 10.200.16.10 port 39532 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:19.323125 sshd-session[6297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:19.327036 systemd-logind[1877]: New session 16 of user core.
Mar 12 23:51:19.333964 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 12 23:51:19.600875 sshd[6300]: Connection closed by 10.200.16.10 port 39532
Mar 12 23:51:19.601423 sshd-session[6297]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:19.605090 systemd-logind[1877]: Session 16 logged out. Waiting for processes to exit.
Mar 12 23:51:19.606201 systemd[1]: sshd@13-10.200.20.40:22-10.200.16.10:39532.service: Deactivated successfully.
Mar 12 23:51:19.608736 systemd[1]: session-16.scope: Deactivated successfully.
Mar 12 23:51:19.611673 systemd-logind[1877]: Removed session 16.
Mar 12 23:51:19.691085 systemd[1]: Started sshd@14-10.200.20.40:22-10.200.16.10:39548.service - OpenSSH per-connection server daemon (10.200.16.10:39548).
Mar 12 23:51:20.111507 sshd[6312]: Accepted publickey for core from 10.200.16.10 port 39548 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:20.112684 sshd-session[6312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:20.116717 systemd-logind[1877]: New session 17 of user core.
Mar 12 23:51:20.129070 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 12 23:51:20.561384 sshd[6315]: Connection closed by 10.200.16.10 port 39548
Mar 12 23:51:20.560713 sshd-session[6312]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:20.564788 systemd[1]: sshd@14-10.200.20.40:22-10.200.16.10:39548.service: Deactivated successfully.
Mar 12 23:51:20.567648 systemd[1]: session-17.scope: Deactivated successfully.
Mar 12 23:51:20.568698 systemd-logind[1877]: Session 17 logged out. Waiting for processes to exit.
Mar 12 23:51:20.570603 systemd-logind[1877]: Removed session 17.
Mar 12 23:51:20.649139 systemd[1]: Started sshd@15-10.200.20.40:22-10.200.16.10:45408.service - OpenSSH per-connection server daemon (10.200.16.10:45408).
Mar 12 23:51:21.078736 sshd[6324]: Accepted publickey for core from 10.200.16.10 port 45408 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:21.079941 sshd-session[6324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:21.084543 systemd-logind[1877]: New session 18 of user core.
Mar 12 23:51:21.095969 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 12 23:51:21.701188 sshd[6327]: Connection closed by 10.200.16.10 port 45408
Mar 12 23:51:21.701556 sshd-session[6324]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:21.705412 systemd[1]: sshd@15-10.200.20.40:22-10.200.16.10:45408.service: Deactivated successfully.
Mar 12 23:51:21.707013 systemd[1]: session-18.scope: Deactivated successfully.
Mar 12 23:51:21.707733 systemd-logind[1877]: Session 18 logged out. Waiting for processes to exit.
Mar 12 23:51:21.709654 systemd-logind[1877]: Removed session 18.
Mar 12 23:51:21.788898 systemd[1]: Started sshd@16-10.200.20.40:22-10.200.16.10:45412.service - OpenSSH per-connection server daemon (10.200.16.10:45412).
Mar 12 23:51:22.210104 sshd[6350]: Accepted publickey for core from 10.200.16.10 port 45412 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:22.211933 sshd-session[6350]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:22.215826 systemd-logind[1877]: New session 19 of user core.
Mar 12 23:51:22.226990 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 12 23:51:22.598227 sshd[6353]: Connection closed by 10.200.16.10 port 45412
Mar 12 23:51:22.598861 sshd-session[6350]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:22.602587 systemd[1]: sshd@16-10.200.20.40:22-10.200.16.10:45412.service: Deactivated successfully.
Mar 12 23:51:22.605653 systemd[1]: session-19.scope: Deactivated successfully.
Mar 12 23:51:22.607430 systemd-logind[1877]: Session 19 logged out. Waiting for processes to exit.
Mar 12 23:51:22.609296 systemd-logind[1877]: Removed session 19.
Mar 12 23:51:22.693249 systemd[1]: Started sshd@17-10.200.20.40:22-10.200.16.10:45424.service - OpenSSH per-connection server daemon (10.200.16.10:45424).
Mar 12 23:51:23.110403 sshd[6365]: Accepted publickey for core from 10.200.16.10 port 45424 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:23.111537 sshd-session[6365]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:23.115445 systemd-logind[1877]: New session 20 of user core.
Mar 12 23:51:23.121965 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 12 23:51:23.385993 sshd[6391]: Connection closed by 10.200.16.10 port 45424
Mar 12 23:51:23.386558 sshd-session[6365]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:23.390388 systemd[1]: sshd@17-10.200.20.40:22-10.200.16.10:45424.service: Deactivated successfully.
Mar 12 23:51:23.394435 systemd[1]: session-20.scope: Deactivated successfully.
Mar 12 23:51:23.395237 systemd-logind[1877]: Session 20 logged out. Waiting for processes to exit.
Mar 12 23:51:23.397180 systemd-logind[1877]: Removed session 20.
Mar 12 23:51:28.478938 systemd[1]: Started sshd@18-10.200.20.40:22-10.200.16.10:45430.service - OpenSSH per-connection server daemon (10.200.16.10:45430).
Mar 12 23:51:28.898872 sshd[6450]: Accepted publickey for core from 10.200.16.10 port 45430 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:28.899822 sshd-session[6450]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:28.903652 systemd-logind[1877]: New session 21 of user core.
Mar 12 23:51:28.909962 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 12 23:51:29.186898 sshd[6453]: Connection closed by 10.200.16.10 port 45430
Mar 12 23:51:29.187685 sshd-session[6450]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:29.190616 systemd[1]: sshd@18-10.200.20.40:22-10.200.16.10:45430.service: Deactivated successfully.
Mar 12 23:51:29.193153 systemd[1]: session-21.scope: Deactivated successfully.
Mar 12 23:51:29.195922 systemd-logind[1877]: Session 21 logged out. Waiting for processes to exit.
Mar 12 23:51:29.197286 systemd-logind[1877]: Removed session 21.
Mar 12 23:51:34.268704 systemd[1]: Started sshd@19-10.200.20.40:22-10.200.16.10:49544.service - OpenSSH per-connection server daemon (10.200.16.10:49544).
Mar 12 23:51:34.668097 sshd[6466]: Accepted publickey for core from 10.200.16.10 port 49544 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:34.669289 sshd-session[6466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:34.673485 systemd-logind[1877]: New session 22 of user core.
Mar 12 23:51:34.681975 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 12 23:51:34.936100 sshd[6469]: Connection closed by 10.200.16.10 port 49544
Mar 12 23:51:34.938261 sshd-session[6466]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:34.941959 systemd[1]: sshd@19-10.200.20.40:22-10.200.16.10:49544.service: Deactivated successfully.
Mar 12 23:51:34.944514 systemd[1]: session-22.scope: Deactivated successfully.
Mar 12 23:51:34.945619 systemd-logind[1877]: Session 22 logged out. Waiting for processes to exit.
Mar 12 23:51:34.947080 systemd-logind[1877]: Removed session 22.
Mar 12 23:51:40.034600 systemd[1]: Started sshd@20-10.200.20.40:22-10.200.16.10:47658.service - OpenSSH per-connection server daemon (10.200.16.10:47658).
Mar 12 23:51:40.448865 sshd[6482]: Accepted publickey for core from 10.200.16.10 port 47658 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:40.449985 sshd-session[6482]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:40.454031 systemd-logind[1877]: New session 23 of user core.
Mar 12 23:51:40.461970 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 12 23:51:40.749007 sshd[6485]: Connection closed by 10.200.16.10 port 47658
Mar 12 23:51:40.749791 sshd-session[6482]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:40.754040 systemd[1]: sshd@20-10.200.20.40:22-10.200.16.10:47658.service: Deactivated successfully.
Mar 12 23:51:40.756240 systemd[1]: session-23.scope: Deactivated successfully.
Mar 12 23:51:40.757559 systemd-logind[1877]: Session 23 logged out. Waiting for processes to exit.
Mar 12 23:51:40.759322 systemd-logind[1877]: Removed session 23.
Mar 12 23:51:45.841337 systemd[1]: Started sshd@21-10.200.20.40:22-10.200.16.10:47666.service - OpenSSH per-connection server daemon (10.200.16.10:47666).
Mar 12 23:51:46.255994 sshd[6553]: Accepted publickey for core from 10.200.16.10 port 47666 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:46.257188 sshd-session[6553]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:46.262992 systemd-logind[1877]: New session 24 of user core.
Mar 12 23:51:46.269082 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 12 23:51:46.540049 sshd[6556]: Connection closed by 10.200.16.10 port 47666
Mar 12 23:51:46.539440 sshd-session[6553]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:46.542765 systemd-logind[1877]: Session 24 logged out. Waiting for processes to exit.
Mar 12 23:51:46.542936 systemd[1]: sshd@21-10.200.20.40:22-10.200.16.10:47666.service: Deactivated successfully.
Mar 12 23:51:46.544897 systemd[1]: session-24.scope: Deactivated successfully.
Mar 12 23:51:46.547388 systemd-logind[1877]: Removed session 24.
Mar 12 23:51:51.629149 systemd[1]: Started sshd@22-10.200.20.40:22-10.200.16.10:35386.service - OpenSSH per-connection server daemon (10.200.16.10:35386).
Mar 12 23:51:52.052939 sshd[6570]: Accepted publickey for core from 10.200.16.10 port 35386 ssh2: RSA SHA256:6aU++dO24JR26imLZPVleiSKFVQ+cc7yURo8Zcft0hY
Mar 12 23:51:52.053990 sshd-session[6570]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:51:52.057380 systemd-logind[1877]: New session 25 of user core.
Mar 12 23:51:52.065943 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 12 23:51:52.330968 sshd[6573]: Connection closed by 10.200.16.10 port 35386
Mar 12 23:51:52.330454 sshd-session[6570]: pam_unix(sshd:session): session closed for user core
Mar 12 23:51:52.333434 systemd-logind[1877]: Session 25 logged out. Waiting for processes to exit.
Mar 12 23:51:52.333740 systemd[1]: sshd@22-10.200.20.40:22-10.200.16.10:35386.service: Deactivated successfully.
Mar 12 23:51:52.335454 systemd[1]: session-25.scope: Deactivated successfully.
Mar 12 23:51:52.337386 systemd-logind[1877]: Removed session 25.