Mar 7 00:46:28.080218 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Mar 7 00:46:28.080237 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Mar 6 22:32:57 -00 2026
Mar 7 00:46:28.080243 kernel: KASLR enabled
Mar 7 00:46:28.080247 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 7 00:46:28.080251 kernel: printk: legacy bootconsole [pl11] enabled
Mar 7 00:46:28.080256 kernel: efi: EFI v2.7 by EDK II
Mar 7 00:46:28.080261 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89c018 RNG=0x3f979998 MEMRESERVE=0x3db83598
Mar 7 00:46:28.080266 kernel: random: crng init done
Mar 7 00:46:28.080269 kernel: secureboot: Secure boot disabled
Mar 7 00:46:28.080273 kernel: ACPI: Early table checksum verification disabled
Mar 7 00:46:28.080278 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Mar 7 00:46:28.080282 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080286 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080290 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 7 00:46:28.080296 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080301 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080305 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080309 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080314 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080319 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080323 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 7 00:46:28.080327 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 7 00:46:28.080331 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 7 00:46:28.080335 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 7 00:46:28.080340 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 7 00:46:28.080344 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Mar 7 00:46:28.080348 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Mar 7 00:46:28.080352 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 7 00:46:28.080356 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 7 00:46:28.080360 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 7 00:46:28.080366 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 7 00:46:28.080370 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 7 00:46:28.080374 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 7 00:46:28.080378 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 7 00:46:28.080382 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 7 00:46:28.080386 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 7 00:46:28.080390 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Mar 7 00:46:28.080395 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Mar 7 00:46:28.080399 kernel: Zone ranges:
Mar 7 00:46:28.080403 kernel:   DMA      [mem 0x0000000000000000-0x00000000ffffffff]
Mar 7 00:46:28.080410 kernel:   DMA32    empty
Mar 7 00:46:28.080415 kernel:   Normal   [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 00:46:28.080419 kernel:   Device   empty
Mar 7 00:46:28.080423 kernel: Movable zone start for each node
Mar 7 00:46:28.080428 kernel: Early memory node ranges
Mar 7 00:46:28.080432 kernel:   node   0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 7 00:46:28.080437 kernel:   node   0: [mem 0x0000000000824000-0x000000003f38ffff]
Mar 7 00:46:28.080442 kernel:   node   0: [mem 0x000000003f390000-0x000000003f93ffff]
Mar 7 00:46:28.080446 kernel:   node   0: [mem 0x000000003f940000-0x000000003f9effff]
Mar 7 00:46:28.080451 kernel:   node   0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Mar 7 00:46:28.080455 kernel:   node   0: [mem 0x000000003fdf0000-0x000000003fffffff]
Mar 7 00:46:28.080459 kernel:   node   0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 7 00:46:28.080464 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 7 00:46:28.080468 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 7 00:46:28.080473 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Mar 7 00:46:28.080477 kernel: psci: probing for conduit method from ACPI.
Mar 7 00:46:28.080481 kernel: psci: PSCIv1.3 detected in firmware.
Mar 7 00:46:28.080486 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 7 00:46:28.080491 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 7 00:46:28.080495 kernel: psci: SMC Calling Convention v1.4
Mar 7 00:46:28.080500 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 7 00:46:28.080504 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 7 00:46:28.080508 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 7 00:46:28.080513 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 7 00:46:28.080517 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 7 00:46:28.080522 kernel: Detected PIPT I-cache on CPU0
Mar 7 00:46:28.080526 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Mar 7 00:46:28.080531 kernel: CPU features: detected: GIC system register CPU interface
Mar 7 00:46:28.080535 kernel: CPU features: detected: Spectre-v4
Mar 7 00:46:28.080539 kernel: CPU features: detected: Spectre-BHB
Mar 7 00:46:28.080545 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 7 00:46:28.080549 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 7 00:46:28.080553 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Mar 7 00:46:28.080558 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 7 00:46:28.080562 kernel: alternatives: applying boot alternatives
Mar 7 00:46:28.080567 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9c226afb416af9ef4d18a1b0d3e269f0ccb0a864e96b716716d400068481d58c
Mar 7 00:46:28.080572 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 00:46:28.080577 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 00:46:28.080581 kernel: Fallback order for Node 0: 0
Mar 7 00:46:28.080585 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Mar 7 00:46:28.080590 kernel: Policy zone: Normal
Mar 7 00:46:28.080595 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 00:46:28.080599 kernel: software IO TLB: area num 2.
Mar 7 00:46:28.080604 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Mar 7 00:46:28.080608 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 00:46:28.080612 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 00:46:28.080618 kernel: rcu: RCU event tracing is enabled.
Mar 7 00:46:28.080622 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 00:46:28.080627 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 00:46:28.080631 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 00:46:28.080635 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 00:46:28.080640 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 00:46:28.080646 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:46:28.080650 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:46:28.080654 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 7 00:46:28.080659 kernel: GICv3: 960 SPIs implemented
Mar 7 00:46:28.080663 kernel: GICv3: 0 Extended SPIs implemented
Mar 7 00:46:28.080668 kernel: Root IRQ handler: gic_handle_irq
Mar 7 00:46:28.080672 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 7 00:46:28.080676 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Mar 7 00:46:28.080681 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 7 00:46:28.080685 kernel: ITS: No ITS available, not enabling LPIs
Mar 7 00:46:28.080690 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 00:46:28.080695 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Mar 7 00:46:28.080700 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 7 00:46:28.080704 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Mar 7 00:46:28.080709 kernel: Console: colour dummy device 80x25
Mar 7 00:46:28.080713 kernel: printk: legacy console [tty1] enabled
Mar 7 00:46:28.080718 kernel: ACPI: Core revision 20240827
Mar 7 00:46:28.080723 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Mar 7 00:46:28.080727 kernel: pid_max: default: 32768 minimum: 301
Mar 7 00:46:28.080732 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 7 00:46:28.080736 kernel: landlock: Up and running.
Mar 7 00:46:28.080742 kernel: SELinux: Initializing.
Mar 7 00:46:28.080746 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:46:28.080751 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:46:28.080756 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Mar 7 00:46:28.080760 kernel: Hyper-V: Host Build 10.0.26102.1212-1-0
Mar 7 00:46:28.080769 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 7 00:46:28.080775 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 00:46:28.080780 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 00:46:28.080784 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 7 00:46:28.080789 kernel: Remapping and enabling EFI services.
Mar 7 00:46:28.080794 kernel: smp: Bringing up secondary CPUs ...
Mar 7 00:46:28.080799 kernel: Detected PIPT I-cache on CPU1
Mar 7 00:46:28.080804 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 7 00:46:28.080809 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Mar 7 00:46:28.080814 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 00:46:28.080819 kernel: SMP: Total of 2 processors activated.
Mar 7 00:46:28.080823 kernel: CPU: All CPU(s) started at EL1
Mar 7 00:46:28.080829 kernel: CPU features: detected: 32-bit EL0 Support
Mar 7 00:46:28.080834 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 7 00:46:28.080839 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 7 00:46:28.080844 kernel: CPU features: detected: Common not Private translations
Mar 7 00:46:28.080848 kernel: CPU features: detected: CRC32 instructions
Mar 7 00:46:28.080853 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Mar 7 00:46:28.080858 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 7 00:46:28.080863 kernel: CPU features: detected: LSE atomic instructions
Mar 7 00:46:28.080868 kernel: CPU features: detected: Privileged Access Never
Mar 7 00:46:28.080873 kernel: CPU features: detected: Speculation barrier (SB)
Mar 7 00:46:28.080879 kernel: CPU features: detected: TLB range maintenance instructions
Mar 7 00:46:28.080883 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 7 00:46:28.080888 kernel: CPU features: detected: Scalable Vector Extension
Mar 7 00:46:28.080893 kernel: alternatives: applying system-wide alternatives
Mar 7 00:46:28.080898 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 7 00:46:28.080903 kernel: SVE: maximum available vector length 16 bytes per vector
Mar 7 00:46:28.080907 kernel: SVE: default vector length 16 bytes per vector
Mar 7 00:46:28.080912 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Mar 7 00:46:28.080920 kernel: devtmpfs: initialized
Mar 7 00:46:28.080925 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 00:46:28.080930 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 00:46:28.080935 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 7 00:46:28.080940 kernel: 0 pages in range for non-PLT usage
Mar 7 00:46:28.080944 kernel: 508400 pages in range for PLT usage
Mar 7 00:46:28.080949 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 00:46:28.080954 kernel: SMBIOS 3.1.0 present.
Mar 7 00:46:28.080959 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Mar 7 00:46:28.080964 kernel: DMI: Memory slots populated: 2/2
Mar 7 00:46:28.080969 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 00:46:28.080974 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 7 00:46:28.080979 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 7 00:46:28.080984 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 7 00:46:28.080988 kernel: audit: initializing netlink subsys (disabled)
Mar 7 00:46:28.080993 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Mar 7 00:46:28.080998 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 00:46:28.081004 kernel: cpuidle: using governor menu
Mar 7 00:46:28.081009 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 7 00:46:28.081014 kernel: ASID allocator initialised with 32768 entries
Mar 7 00:46:28.081019 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 00:46:28.081023 kernel: Serial: AMBA PL011 UART driver
Mar 7 00:46:28.081028 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 00:46:28.081033 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 00:46:28.081038 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 7 00:46:28.081042 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 7 00:46:28.081049 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 00:46:28.081053 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 00:46:28.081058 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 7 00:46:28.081063 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 7 00:46:28.081068 kernel: ACPI: Added _OSI(Module Device)
Mar 7 00:46:28.081072 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 00:46:28.081077 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 00:46:28.081082 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 00:46:28.081098 kernel: ACPI: Interpreter enabled
Mar 7 00:46:28.081105 kernel: ACPI: Using GIC for interrupt routing
Mar 7 00:46:28.081110 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 7 00:46:28.081115 kernel: printk: legacy console [ttyAMA0] enabled
Mar 7 00:46:28.081119 kernel: printk: legacy bootconsole [pl11] disabled
Mar 7 00:46:28.081124 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 7 00:46:28.081129 kernel: ACPI: CPU0 has been hot-added
Mar 7 00:46:28.081134 kernel: ACPI: CPU1 has been hot-added
Mar 7 00:46:28.081139 kernel: iommu: Default domain type: Translated
Mar 7 00:46:28.081143 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 7 00:46:28.081148 kernel: efivars: Registered efivars operations
Mar 7 00:46:28.081154 kernel: vgaarb: loaded
Mar 7 00:46:28.081159 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 7 00:46:28.081163 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 00:46:28.081168 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 00:46:28.081173 kernel: pnp: PnP ACPI init
Mar 7 00:46:28.081177 kernel: pnp: PnP ACPI: found 0 devices
Mar 7 00:46:28.081182 kernel: NET: Registered PF_INET protocol family
Mar 7 00:46:28.081187 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 00:46:28.081192 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 00:46:28.081197 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 00:46:28.081202 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 00:46:28.081207 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 00:46:28.081212 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 00:46:28.081217 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:46:28.081221 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:46:28.081226 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 00:46:28.081231 kernel: PCI: CLS 0 bytes, default 64
Mar 7 00:46:28.081236 kernel: kvm [1]: HYP mode not available
Mar 7 00:46:28.081241 kernel: Initialise system trusted keyrings
Mar 7 00:46:28.081246 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 7 00:46:28.081251 kernel: Key type asymmetric registered
Mar 7 00:46:28.081255 kernel: Asymmetric key parser 'x509' registered
Mar 7 00:46:28.081260 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Mar 7 00:46:28.081265 kernel: io scheduler mq-deadline registered
Mar 7 00:46:28.081270 kernel: io scheduler kyber registered
Mar 7 00:46:28.081274 kernel: io scheduler bfq registered
Mar 7 00:46:28.081279 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 7 00:46:28.081285 kernel: thunder_xcv, ver 1.0
Mar 7 00:46:28.081289 kernel: thunder_bgx, ver 1.0
Mar 7 00:46:28.081294 kernel: nicpf, ver 1.0
Mar 7 00:46:28.081299 kernel: nicvf, ver 1.0
Mar 7 00:46:28.081421 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 7 00:46:28.081471 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T00:46:27 UTC (1772844387)
Mar 7 00:46:28.081477 kernel: efifb: probing for efifb
Mar 7 00:46:28.081483 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 7 00:46:28.081488 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 7 00:46:28.081493 kernel: efifb: scrolling: redraw
Mar 7 00:46:28.081498 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 7 00:46:28.081503 kernel: Console: switching to colour frame buffer device 128x48
Mar 7 00:46:28.081507 kernel: fb0: EFI VGA frame buffer device
Mar 7 00:46:28.081512 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 7 00:46:28.081517 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 7 00:46:28.081522 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 7 00:46:28.081528 kernel: NET: Registered PF_INET6 protocol family
Mar 7 00:46:28.081532 kernel: watchdog: NMI not fully supported
Mar 7 00:46:28.081537 kernel: watchdog: Hard watchdog permanently disabled
Mar 7 00:46:28.081542 kernel: Segment Routing with IPv6
Mar 7 00:46:28.081547 kernel: In-situ OAM (IOAM) with IPv6
Mar 7 00:46:28.081552 kernel: NET: Registered PF_PACKET protocol family
Mar 7 00:46:28.081557 kernel: Key type dns_resolver registered
Mar 7 00:46:28.081561 kernel: registered taskstats version 1
Mar 7 00:46:28.081566 kernel: Loading compiled-in X.509 certificates
Mar 7 00:46:28.081571 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 7eb2f80205b35f103c9dbaa59957e2e5fe845c0f'
Mar 7 00:46:28.081577 kernel: Demotion targets for Node 0: null
Mar 7 00:46:28.081581 kernel: Key type .fscrypt registered
Mar 7 00:46:28.081586 kernel: Key type fscrypt-provisioning registered
Mar 7 00:46:28.081591 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 7 00:46:28.081595 kernel: ima: Allocated hash algorithm: sha1
Mar 7 00:46:28.081600 kernel: ima: No architecture policies found
Mar 7 00:46:28.081605 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 7 00:46:28.081610 kernel: clk: Disabling unused clocks
Mar 7 00:46:28.081614 kernel: PM: genpd: Disabling unused power domains
Mar 7 00:46:28.081620 kernel: Warning: unable to open an initial console.
Mar 7 00:46:28.081625 kernel: Freeing unused kernel memory: 39552K
Mar 7 00:46:28.081630 kernel: Run /init as init process
Mar 7 00:46:28.081635 kernel:   with arguments:
Mar 7 00:46:28.081639 kernel:     /init
Mar 7 00:46:28.081644 kernel:   with environment:
Mar 7 00:46:28.081649 kernel:     HOME=/
Mar 7 00:46:28.081653 kernel:     TERM=linux
Mar 7 00:46:28.081659 systemd[1]: Successfully made /usr/ read-only.
Mar 7 00:46:28.081667 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 7 00:46:28.081673 systemd[1]: Detected virtualization microsoft.
Mar 7 00:46:28.081678 systemd[1]: Detected architecture arm64.
Mar 7 00:46:28.081683 systemd[1]: Running in initrd.
Mar 7 00:46:28.081688 systemd[1]: No hostname configured, using default hostname.
Mar 7 00:46:28.081694 systemd[1]: Hostname set to .
Mar 7 00:46:28.081699 systemd[1]: Initializing machine ID from random generator.
Mar 7 00:46:28.081705 systemd[1]: Queued start job for default target initrd.target.
Mar 7 00:46:28.081710 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:46:28.081715 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:46:28.081721 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 7 00:46:28.081726 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:46:28.081731 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 7 00:46:28.081737 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 7 00:46:28.081744 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 7 00:46:28.081749 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 7 00:46:28.081755 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:46:28.081760 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:46:28.081765 systemd[1]: Reached target paths.target - Path Units.
Mar 7 00:46:28.081770 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:46:28.081775 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:46:28.081781 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 00:46:28.081787 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:46:28.081792 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:46:28.081797 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 7 00:46:28.081803 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 7 00:46:28.081808 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:46:28.081813 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:46:28.081818 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:46:28.081823 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 00:46:28.081829 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 7 00:46:28.081835 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:46:28.081840 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 7 00:46:28.081845 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 7 00:46:28.081851 systemd[1]: Starting systemd-fsck-usr.service...
Mar 7 00:46:28.081856 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:46:28.081861 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:46:28.081879 systemd-journald[225]: Collecting audit messages is disabled.
Mar 7 00:46:28.081895 systemd-journald[225]: Journal started
Mar 7 00:46:28.081909 systemd-journald[225]: Runtime Journal (/run/log/journal/e6cde6df7959489eb93f33a88f9c1795) is 8M, max 78.3M, 70.3M free.
Mar 7 00:46:28.085127 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:46:28.090776 systemd-modules-load[227]: Inserted module 'overlay'
Mar 7 00:46:28.116170 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 7 00:46:28.116226 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 00:46:28.117120 kernel: Bridge firewalling registered
Mar 7 00:46:28.119027 systemd-modules-load[227]: Inserted module 'br_netfilter'
Mar 7 00:46:28.128496 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 7 00:46:28.138429 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:46:28.143748 systemd[1]: Finished systemd-fsck-usr.service.
Mar 7 00:46:28.152389 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:46:28.160175 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:46:28.172365 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:46:28.187557 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 00:46:28.198220 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 00:46:28.218232 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 00:46:28.234321 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:46:28.245256 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:46:28.248406 systemd-tmpfiles[250]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 7 00:46:28.251966 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:46:28.261447 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:46:28.275173 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 7 00:46:28.300999 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 00:46:28.314908 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 00:46:28.334800 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:46:28.347730 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9c226afb416af9ef4d18a1b0d3e269f0ccb0a864e96b716716d400068481d58c
Mar 7 00:46:28.381529 systemd-resolved[262]: Positive Trust Anchors:
Mar 7 00:46:28.381546 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 00:46:28.381565 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 00:46:28.383883 systemd-resolved[262]: Defaulting to hostname 'linux'.
Mar 7 00:46:28.384802 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 00:46:28.390959 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:46:28.501109 kernel: SCSI subsystem initialized
Mar 7 00:46:28.507101 kernel: Loading iSCSI transport class v2.0-870.
Mar 7 00:46:28.515170 kernel: iscsi: registered transport (tcp)
Mar 7 00:46:28.528963 kernel: iscsi: registered transport (qla4xxx)
Mar 7 00:46:28.528979 kernel: QLogic iSCSI HBA Driver
Mar 7 00:46:28.547788 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 00:46:28.569682 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 00:46:28.576706 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 7 00:46:28.642122 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:46:28.651988 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 7 00:46:28.710109 kernel: raid6: neonx8 gen() 18563 MB/s
Mar 7 00:46:28.729095 kernel: raid6: neonx4 gen() 18550 MB/s
Mar 7 00:46:28.748094 kernel: raid6: neonx2 gen() 17099 MB/s
Mar 7 00:46:28.769095 kernel: raid6: neonx1 gen() 15073 MB/s
Mar 7 00:46:28.788095 kernel: raid6: int64x8 gen() 10552 MB/s
Mar 7 00:46:28.808096 kernel: raid6: int64x4 gen() 10611 MB/s
Mar 7 00:46:28.827182 kernel: raid6: int64x2 gen() 9001 MB/s
Mar 7 00:46:28.849130 kernel: raid6: int64x1 gen() 7006 MB/s
Mar 7 00:46:28.849141 kernel: raid6: using algorithm neonx8 gen() 18563 MB/s
Mar 7 00:46:28.872861 kernel: raid6: .... xor() 14896 MB/s, rmw enabled
Mar 7 00:46:28.872870 kernel: raid6: using neon recovery algorithm
Mar 7 00:46:28.881093 kernel: xor: measuring software checksum speed
Mar 7 00:46:28.881103 kernel: 8regs : 28641 MB/sec
Mar 7 00:46:28.883564 kernel: 32regs : 28832 MB/sec
Mar 7 00:46:28.886386 kernel: arm64_neon : 37549 MB/sec
Mar 7 00:46:28.889489 kernel: xor: using function: arm64_neon (37549 MB/sec)
Mar 7 00:46:28.928115 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 7 00:46:28.933328 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:46:28.942626 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:46:28.969602 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Mar 7 00:46:28.973991 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:46:28.988396 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 00:46:29.023888 dracut-pre-trigger[496]: rd.md=0: removing MD RAID activation
Mar 7 00:46:29.048781 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:46:29.055337 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:46:29.106263 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:46:29.113267 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 7 00:46:29.188599 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 00:46:29.197776 kernel: hv_vmbus: Vmbus version:5.3 Mar 7 00:46:29.192957 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:46:29.203117 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:46:29.212803 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:46:29.225742 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 7 00:46:29.244255 kernel: hv_vmbus: registering driver hv_storvsc Mar 7 00:46:29.244275 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 7 00:46:29.244290 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 7 00:46:29.238865 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 00:46:29.262238 kernel: scsi host1: storvsc_host_t Mar 7 00:46:29.262378 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 7 00:46:29.262394 kernel: scsi host0: storvsc_host_t Mar 7 00:46:29.238940 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:46:29.301189 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Mar 7 00:46:29.301212 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 7 00:46:29.301393 kernel: hv_vmbus: registering driver hv_netvsc Mar 7 00:46:29.301400 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Mar 7 00:46:29.301487 kernel: hv_vmbus: registering driver hid_hyperv Mar 7 00:46:29.272493 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 7 00:46:29.319595 kernel: PTP clock support registered Mar 7 00:46:29.319616 kernel: hv_utils: Registering HyperV Utility Driver Mar 7 00:46:29.323102 kernel: hv_vmbus: registering driver hv_utils Mar 7 00:46:29.329256 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Mar 7 00:46:29.329302 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 7 00:46:29.339599 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 7 00:46:29.339780 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 7 00:46:29.031703 kernel: hv_utils: Heartbeat IC version 3.0 Mar 7 00:46:29.043337 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 7 00:46:29.043469 kernel: hv_utils: Shutdown IC version 3.2 Mar 7 00:46:29.043475 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Mar 7 00:46:29.043541 kernel: hv_utils: TimeSync IC version 4.0 Mar 7 00:46:29.043547 systemd-journald[225]: Time jumped backwards, rotating. Mar 7 00:46:29.043575 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 7 00:46:29.043635 kernel: hv_netvsc 0022487b-fe3b-0022-487b-fe3b0022487b eth0: VF slot 1 added Mar 7 00:46:29.043709 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 7 00:46:29.033589 systemd-resolved[262]: Clock change detected. Flushing caches. Mar 7 00:46:29.067740 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 7 00:46:29.067898 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 7 00:46:29.052861 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 7 00:46:29.081200 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:46:29.085219 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 7 00:46:29.093219 kernel: hv_vmbus: registering driver hv_pci Mar 7 00:46:29.100876 kernel: hv_pci ec9bfe7f-8645-48b5-baf0-daaf7ba28642: PCI VMBus probing: Using version 0x10004 Mar 7 00:46:29.101051 kernel: hv_pci ec9bfe7f-8645-48b5-baf0-daaf7ba28642: PCI host bridge to bus 8645:00 Mar 7 00:46:29.111073 kernel: pci_bus 8645:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 7 00:46:29.111343 kernel: pci_bus 8645:00: No busn resource found for root bus, will use [bus 00-ff] Mar 7 00:46:29.123266 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#292 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 7 00:46:29.129331 kernel: pci 8645:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Mar 7 00:46:29.138199 kernel: pci 8645:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 7 00:46:29.144311 kernel: pci 8645:00:02.0: enabling Extended Tags Mar 7 00:46:29.162708 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#305 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 7 00:46:29.162928 kernel: pci 8645:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 8645:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Mar 7 00:46:29.175256 kernel: pci_bus 8645:00: busn_res: [bus 00-ff] end is updated to 00 Mar 7 00:46:29.175441 kernel: pci 8645:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Mar 7 00:46:29.236483 kernel: mlx5_core 8645:00:02.0: enabling device (0000 -> 0002) Mar 7 00:46:29.244451 kernel: mlx5_core 8645:00:02.0: PTM is not supported by PCIe Mar 7 00:46:29.244566 kernel: mlx5_core 8645:00:02.0: firmware version: 16.30.5026 Mar 7 00:46:29.425798 kernel: hv_netvsc 0022487b-fe3b-0022-487b-fe3b0022487b eth0: VF registering: eth1 Mar 7 00:46:29.426006 kernel: mlx5_core 8645:00:02.0 eth1: joined to eth0 Mar 7 00:46:29.432290 
kernel: mlx5_core 8645:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 7 00:46:29.440254 kernel: mlx5_core 8645:00:02.0 enP34373s1: renamed from eth1 Mar 7 00:46:29.579569 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 7 00:46:29.629295 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 7 00:46:29.690700 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 7 00:46:29.702769 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Mar 7 00:46:29.709397 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 7 00:46:29.719796 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 7 00:46:29.731728 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 7 00:46:29.741657 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 7 00:46:29.753213 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 7 00:46:29.767353 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 7 00:46:29.777330 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 7 00:46:29.800001 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 7 00:46:29.820208 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:46:30.840235 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:46:30.841251 disk-uuid[673]: The operation has completed successfully. Mar 7 00:46:30.913427 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 7 00:46:30.917464 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 7 00:46:30.941988 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Mar 7 00:46:30.966791 sh[833]: Success Mar 7 00:46:31.001247 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 7 00:46:31.001308 kernel: device-mapper: uevent: version 1.0.3 Mar 7 00:46:31.006553 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 7 00:46:31.016214 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 7 00:46:31.375105 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 7 00:46:31.389704 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 7 00:46:31.399012 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 7 00:46:31.425259 kernel: BTRFS: device fsid 376b0ad0-b1fc-4099-8019-6f1f3d92d570 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (851) Mar 7 00:46:31.435539 kernel: BTRFS info (device dm-0): first mount of filesystem 376b0ad0-b1fc-4099-8019-6f1f3d92d570 Mar 7 00:46:31.435576 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:46:31.891400 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 7 00:46:31.891476 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 7 00:46:31.922279 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 7 00:46:31.927014 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 7 00:46:31.934734 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 7 00:46:31.935470 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 7 00:46:31.962277 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 7 00:46:31.991306 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (874) Mar 7 00:46:32.001965 kernel: BTRFS info (device sda6): first mount of filesystem a2920a34-fe1c-42ba-814e-fd8c35911ce4 Mar 7 00:46:32.002017 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:46:32.026077 kernel: BTRFS info (device sda6): turning on async discard Mar 7 00:46:32.026139 kernel: BTRFS info (device sda6): enabling free space tree Mar 7 00:46:32.037271 kernel: BTRFS info (device sda6): last unmount of filesystem a2920a34-fe1c-42ba-814e-fd8c35911ce4 Mar 7 00:46:32.038306 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 7 00:46:32.043700 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 7 00:46:32.097963 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 7 00:46:32.110248 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 00:46:32.145456 systemd-networkd[1020]: lo: Link UP Mar 7 00:46:32.145469 systemd-networkd[1020]: lo: Gained carrier Mar 7 00:46:32.146217 systemd-networkd[1020]: Enumeration completed Mar 7 00:46:32.146372 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 7 00:46:32.153536 systemd-networkd[1020]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:46:32.153539 systemd-networkd[1020]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 00:46:32.153982 systemd[1]: Reached target network.target - Network. 
Mar 7 00:46:32.226209 kernel: mlx5_core 8645:00:02.0 enP34373s1: Link up Mar 7 00:46:32.261215 kernel: hv_netvsc 0022487b-fe3b-0022-487b-fe3b0022487b eth0: Data path switched to VF: enP34373s1 Mar 7 00:46:32.261575 systemd-networkd[1020]: enP34373s1: Link UP Mar 7 00:46:32.261636 systemd-networkd[1020]: eth0: Link UP Mar 7 00:46:32.261727 systemd-networkd[1020]: eth0: Gained carrier Mar 7 00:46:32.261740 systemd-networkd[1020]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:46:32.266342 systemd-networkd[1020]: enP34373s1: Gained carrier Mar 7 00:46:32.284689 systemd-networkd[1020]: eth0: DHCPv4 address 10.200.20.26/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 7 00:46:33.046409 ignition[950]: Ignition 2.22.0 Mar 7 00:46:33.048992 ignition[950]: Stage: fetch-offline Mar 7 00:46:33.049111 ignition[950]: no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:33.054219 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 7 00:46:33.049117 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:33.062952 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 7 00:46:33.049207 ignition[950]: parsed url from cmdline: "" Mar 7 00:46:33.049209 ignition[950]: no config URL provided Mar 7 00:46:33.049212 ignition[950]: reading system config file "/usr/lib/ignition/user.ign" Mar 7 00:46:33.049217 ignition[950]: no config at "/usr/lib/ignition/user.ign" Mar 7 00:46:33.049221 ignition[950]: failed to fetch config: resource requires networking Mar 7 00:46:33.049410 ignition[950]: Ignition finished successfully Mar 7 00:46:33.102287 ignition[1031]: Ignition 2.22.0 Mar 7 00:46:33.102297 ignition[1031]: Stage: fetch Mar 7 00:46:33.102507 ignition[1031]: no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:33.102515 ignition[1031]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:33.102577 ignition[1031]: parsed url from cmdline: "" Mar 7 00:46:33.102579 ignition[1031]: no config URL provided Mar 7 00:46:33.102583 ignition[1031]: reading system config file "/usr/lib/ignition/user.ign" Mar 7 00:46:33.102590 ignition[1031]: no config at "/usr/lib/ignition/user.ign" Mar 7 00:46:33.102605 ignition[1031]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 7 00:46:33.228438 ignition[1031]: GET result: OK Mar 7 00:46:33.228513 ignition[1031]: config has been read from IMDS userdata Mar 7 00:46:33.231799 unknown[1031]: fetched base config from "system" Mar 7 00:46:33.228536 ignition[1031]: parsing config with SHA512: ab671195dcb7fe0e2fa433e8f2fcf55189a6df30b1c0578f758c361b217e38f62da8b51419a709e838470a44277714dab12b4ad6a4bcba73339437414379e476 Mar 7 00:46:33.231805 unknown[1031]: fetched base config from "system" Mar 7 00:46:33.232089 ignition[1031]: fetch: fetch complete Mar 7 00:46:33.231809 unknown[1031]: fetched user config from "azure" Mar 7 00:46:33.232093 ignition[1031]: fetch: fetch passed Mar 7 00:46:33.236215 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
Mar 7 00:46:33.232131 ignition[1031]: Ignition finished successfully Mar 7 00:46:33.243326 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 7 00:46:33.284924 ignition[1037]: Ignition 2.22.0 Mar 7 00:46:33.284936 ignition[1037]: Stage: kargs Mar 7 00:46:33.289458 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 7 00:46:33.285223 ignition[1037]: no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:33.295392 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 7 00:46:33.285232 ignition[1037]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:33.285710 ignition[1037]: kargs: kargs passed Mar 7 00:46:33.285752 ignition[1037]: Ignition finished successfully Mar 7 00:46:33.327309 ignition[1043]: Ignition 2.22.0 Mar 7 00:46:33.327323 ignition[1043]: Stage: disks Mar 7 00:46:33.327507 ignition[1043]: no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:33.333083 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 7 00:46:33.327515 ignition[1043]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:33.338588 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 7 00:46:33.328031 ignition[1043]: disks: disks passed Mar 7 00:46:33.343423 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 7 00:46:33.328072 ignition[1043]: Ignition finished successfully Mar 7 00:46:33.352941 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 7 00:46:33.361290 systemd[1]: Reached target sysinit.target - System Initialization. Mar 7 00:46:33.370962 systemd[1]: Reached target basic.target - Basic System. Mar 7 00:46:33.380255 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 7 00:46:33.467621 systemd-fsck[1052]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Mar 7 00:46:33.475541 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 7 00:46:33.481609 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 7 00:46:33.703207 kernel: EXT4-fs (sda9): mounted filesystem dc3cd474-cc91-4aa5-8987-77b9669cedbb r/w with ordered data mode. Quota mode: none. Mar 7 00:46:33.703997 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 7 00:46:33.708073 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 7 00:46:33.732472 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 7 00:46:33.749945 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 7 00:46:33.768911 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 7 00:46:33.776283 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1066) Mar 7 00:46:33.784999 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 7 00:46:33.803728 kernel: BTRFS info (device sda6): first mount of filesystem a2920a34-fe1c-42ba-814e-fd8c35911ce4 Mar 7 00:46:33.803751 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:46:33.785037 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 7 00:46:33.795366 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 7 00:46:33.808933 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 7 00:46:33.834813 kernel: BTRFS info (device sda6): turning on async discard Mar 7 00:46:33.834843 kernel: BTRFS info (device sda6): enabling free space tree Mar 7 00:46:33.836082 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 7 00:46:34.042309 systemd-networkd[1020]: eth0: Gained IPv6LL Mar 7 00:46:34.377529 coreos-metadata[1068]: Mar 07 00:46:34.377 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 7 00:46:34.386139 coreos-metadata[1068]: Mar 07 00:46:34.386 INFO Fetch successful Mar 7 00:46:34.386139 coreos-metadata[1068]: Mar 07 00:46:34.386 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 7 00:46:34.400014 coreos-metadata[1068]: Mar 07 00:46:34.399 INFO Fetch successful Mar 7 00:46:34.415880 coreos-metadata[1068]: Mar 07 00:46:34.415 INFO wrote hostname ci-4459.2.3-n-801efb9c04 to /sysroot/etc/hostname Mar 7 00:46:34.422985 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 7 00:46:34.665802 initrd-setup-root[1096]: cut: /sysroot/etc/passwd: No such file or directory Mar 7 00:46:34.687684 initrd-setup-root[1103]: cut: /sysroot/etc/group: No such file or directory Mar 7 00:46:34.694906 initrd-setup-root[1110]: cut: /sysroot/etc/shadow: No such file or directory Mar 7 00:46:34.716737 initrd-setup-root[1117]: cut: /sysroot/etc/gshadow: No such file or directory Mar 7 00:46:35.727922 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 7 00:46:35.733439 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 7 00:46:35.750775 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 7 00:46:35.761350 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 7 00:46:35.772235 kernel: BTRFS info (device sda6): last unmount of filesystem a2920a34-fe1c-42ba-814e-fd8c35911ce4 Mar 7 00:46:35.784386 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 7 00:46:35.799094 ignition[1186]: INFO : Ignition 2.22.0 Mar 7 00:46:35.799094 ignition[1186]: INFO : Stage: mount Mar 7 00:46:35.806461 ignition[1186]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:35.806461 ignition[1186]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:35.806461 ignition[1186]: INFO : mount: mount passed Mar 7 00:46:35.806461 ignition[1186]: INFO : Ignition finished successfully Mar 7 00:46:35.806935 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 7 00:46:35.816017 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 7 00:46:35.844973 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 7 00:46:35.871969 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1196) Mar 7 00:46:35.872019 kernel: BTRFS info (device sda6): first mount of filesystem a2920a34-fe1c-42ba-814e-fd8c35911ce4 Mar 7 00:46:35.876205 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:46:35.884977 kernel: BTRFS info (device sda6): turning on async discard Mar 7 00:46:35.885019 kernel: BTRFS info (device sda6): enabling free space tree Mar 7 00:46:35.886566 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 7 00:46:35.915722 ignition[1213]: INFO : Ignition 2.22.0 Mar 7 00:46:35.915722 ignition[1213]: INFO : Stage: files Mar 7 00:46:35.921726 ignition[1213]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:35.921726 ignition[1213]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:35.921726 ignition[1213]: DEBUG : files: compiled without relabeling support, skipping Mar 7 00:46:35.936255 ignition[1213]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 7 00:46:35.936255 ignition[1213]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 7 00:46:35.982563 ignition[1213]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 7 00:46:35.988674 ignition[1213]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 7 00:46:35.988674 ignition[1213]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 7 00:46:35.982908 unknown[1213]: wrote ssh authorized keys file for user: core Mar 7 00:46:36.030439 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 7 00:46:36.038384 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 7 00:46:36.068166 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 7 00:46:36.237006 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 7 00:46:36.237006 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 7 
00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 7 00:46:36.252129 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 7 00:46:36.301798 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1 Mar 7 00:46:36.549576 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 7 00:46:36.925275 ignition[1213]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 7 00:46:36.925275 ignition[1213]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 7 00:46:36.991221 ignition[1213]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 7 00:46:37.003003 ignition[1213]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 7 00:46:37.003003 ignition[1213]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 7 00:46:37.018066 ignition[1213]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 7 00:46:37.018066 ignition[1213]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 7 00:46:37.018066 ignition[1213]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 7 00:46:37.018066 ignition[1213]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 7 00:46:37.018066 ignition[1213]: INFO : files: files passed Mar 7 00:46:37.018066 ignition[1213]: INFO : Ignition finished successfully Mar 7 00:46:37.012758 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 7 00:46:37.024041 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 7 00:46:37.044755 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 7 00:46:37.067424 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 7 00:46:37.067506 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 7 00:46:37.110720 initrd-setup-root-after-ignition[1242]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 7 00:46:37.110720 initrd-setup-root-after-ignition[1242]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 7 00:46:37.129095 initrd-setup-root-after-ignition[1246]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 7 00:46:37.111081 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 7 00:46:37.122762 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 7 00:46:37.134720 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 7 00:46:37.176602 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 7 00:46:37.180681 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 7 00:46:37.191449 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 7 00:46:37.195845 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 7 00:46:37.204614 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 7 00:46:37.205273 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 7 00:46:37.245631 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 7 00:46:37.252267 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 7 00:46:37.287751 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 7 00:46:37.293037 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 7 00:46:37.303079 systemd[1]: Stopped target timers.target - Timer Units. Mar 7 00:46:37.311700 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 7 00:46:37.311802 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 7 00:46:37.324156 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 7 00:46:37.328681 systemd[1]: Stopped target basic.target - Basic System. Mar 7 00:46:37.337545 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 7 00:46:37.346163 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 7 00:46:37.354460 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 7 00:46:37.363042 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 7 00:46:37.372621 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 7 00:46:37.381110 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 7 00:46:37.391026 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 7 00:46:37.399392 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 7 00:46:37.408805 systemd[1]: Stopped target swap.target - Swaps. Mar 7 00:46:37.416636 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 7 00:46:37.416747 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 7 00:46:37.428855 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 7 00:46:37.434017 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 7 00:46:37.443250 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 7 00:46:37.443314 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 7 00:46:37.452901 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Mar 7 00:46:37.453001 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 7 00:46:37.467325 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 7 00:46:37.467406 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 7 00:46:37.472945 systemd[1]: ignition-files.service: Deactivated successfully. Mar 7 00:46:37.473020 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 7 00:46:37.481836 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 7 00:46:37.549556 ignition[1266]: INFO : Ignition 2.22.0 Mar 7 00:46:37.549556 ignition[1266]: INFO : Stage: umount Mar 7 00:46:37.549556 ignition[1266]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 7 00:46:37.549556 ignition[1266]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 7 00:46:37.549556 ignition[1266]: INFO : umount: umount passed Mar 7 00:46:37.549556 ignition[1266]: INFO : Ignition finished successfully Mar 7 00:46:37.481899 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 7 00:46:37.493614 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 7 00:46:37.520962 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 7 00:46:37.532104 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 7 00:46:37.532427 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 00:46:37.538004 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 7 00:46:37.538080 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 7 00:46:37.556019 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 7 00:46:37.556108 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 7 00:46:37.569224 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 7 00:46:37.569312 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 00:46:37.579545 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 00:46:37.579627 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 00:46:37.588707 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 00:46:37.588762 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 00:46:37.599592 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 00:46:37.599642 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 00:46:37.613983 systemd[1]: Stopped target network.target - Network.
Mar 7 00:46:37.620258 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 00:46:37.620324 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:46:37.629459 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 00:46:37.634344 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 00:46:37.638569 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:46:37.644320 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 00:46:37.654671 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 00:46:37.663354 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 00:46:37.663413 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:46:37.672288 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 00:46:37.672325 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:46:37.682145 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 00:46:37.682214 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 00:46:37.691232 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 00:46:37.691265 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 00:46:37.701326 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 00:46:37.710305 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 00:46:37.721300 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 00:46:37.726376 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 00:46:37.726466 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 00:46:37.745541 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 7 00:46:37.745848 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 00:46:37.939160 kernel: hv_netvsc 0022487b-fe3b-0022-487b-fe3b0022487b eth0: Data path switched from VF: enP34373s1
Mar 7 00:46:37.745884 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:46:37.762316 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 7 00:46:37.762508 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 00:46:37.762612 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 00:46:37.776801 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 7 00:46:37.777278 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 7 00:46:37.784947 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 00:46:37.784984 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:46:37.795642 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 00:46:37.811975 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 00:46:37.812042 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:46:37.820845 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 00:46:37.820888 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:46:37.833227 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 00:46:37.833268 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:46:37.838244 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:46:37.852037 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 7 00:46:37.870447 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 00:46:37.870562 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 00:46:37.877535 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 00:46:37.877655 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:46:37.887847 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 00:46:37.887907 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:46:37.896789 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 00:46:37.896814 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:46:37.905200 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 00:46:37.905241 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:46:37.925588 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 00:46:37.925646 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:46:37.939236 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:46:37.939280 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:46:37.949620 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 00:46:37.949664 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 00:46:37.960660 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 00:46:37.975337 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 7 00:46:37.975396 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 00:46:37.991601 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 00:46:37.991642 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:46:38.001874 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 7 00:46:38.001922 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:46:38.011879 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 00:46:38.011918 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:46:38.017436 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:46:38.017474 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:46:38.225834 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 7 00:46:38.031501 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 00:46:38.031588 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 00:46:38.058397 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 00:46:38.058505 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 00:46:38.067801 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 00:46:38.079600 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 00:46:38.102140 systemd[1]: Switching root.
Mar 7 00:46:38.257599 systemd-journald[225]: Journal stopped
Mar 7 00:46:42.700040 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 00:46:42.700060 kernel: SELinux: policy capability open_perms=1
Mar 7 00:46:42.700068 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 00:46:42.700073 kernel: SELinux: policy capability always_check_network=0
Mar 7 00:46:42.700078 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 00:46:42.700084 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 00:46:42.700090 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 00:46:42.700095 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 00:46:42.700100 kernel: SELinux: policy capability userspace_initial_context=0
Mar 7 00:46:42.700106 kernel: audit: type=1403 audit(1772844399.260:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 00:46:42.700113 systemd[1]: Successfully loaded SELinux policy in 215.123ms.
Mar 7 00:46:42.700120 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.428ms.
Mar 7 00:46:42.700127 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 7 00:46:42.700133 systemd[1]: Detected virtualization microsoft.
Mar 7 00:46:42.700139 systemd[1]: Detected architecture arm64.
Mar 7 00:46:42.700145 systemd[1]: Detected first boot.
Mar 7 00:46:42.700152 systemd[1]: Hostname set to .
Mar 7 00:46:42.700158 systemd[1]: Initializing machine ID from random generator.
Mar 7 00:46:42.700164 zram_generator::config[1309]: No configuration found.
Mar 7 00:46:42.700170 kernel: NET: Registered PF_VSOCK protocol family
Mar 7 00:46:42.700176 systemd[1]: Populated /etc with preset unit settings.
Mar 7 00:46:42.701472 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 7 00:46:42.701509 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 00:46:42.701521 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 00:46:42.701528 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:46:42.701535 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 00:46:42.701542 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 00:46:42.701548 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 00:46:42.701555 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 00:46:42.701561 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 00:46:42.701569 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 00:46:42.701575 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 00:46:42.701581 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 00:46:42.701587 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:46:42.701594 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:46:42.701600 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 00:46:42.701606 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 00:46:42.701612 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 00:46:42.701619 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:46:42.701626 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 7 00:46:42.701633 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:46:42.701642 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:46:42.701649 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 00:46:42.701655 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 7 00:46:42.701661 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:46:42.701667 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 7 00:46:42.701674 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:46:42.701680 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:46:42.701687 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:46:42.701693 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:46:42.701699 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 7 00:46:42.701705 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 7 00:46:42.701712 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 7 00:46:42.701719 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:46:42.701725 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:46:42.701731 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:46:42.701737 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 7 00:46:42.701744 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 7 00:46:42.701750 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 7 00:46:42.701757 systemd[1]: Mounting media.mount - External Media Directory...
Mar 7 00:46:42.701763 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 7 00:46:42.701770 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 7 00:46:42.701777 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 7 00:46:42.701783 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 7 00:46:42.701789 systemd[1]: Reached target machines.target - Containers.
Mar 7 00:46:42.701796 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 7 00:46:42.701802 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:46:42.701809 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:46:42.701815 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 7 00:46:42.701821 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:46:42.701828 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:46:42.701834 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:46:42.701840 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 7 00:46:42.701846 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:46:42.701853 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 7 00:46:42.701859 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 7 00:46:42.701866 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 7 00:46:42.701872 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 7 00:46:42.701878 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 7 00:46:42.701885 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 7 00:46:42.701891 kernel: fuse: init (API version 7.41)
Mar 7 00:46:42.701897 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:46:42.701903 kernel: loop: module loaded
Mar 7 00:46:42.701909 kernel: ACPI: bus type drm_connector registered
Mar 7 00:46:42.701916 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:46:42.701922 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 00:46:42.701928 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 7 00:46:42.701935 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 7 00:46:42.701969 systemd-journald[1413]: Collecting audit messages is disabled.
Mar 7 00:46:42.701986 systemd-journald[1413]: Journal started
Mar 7 00:46:42.702001 systemd-journald[1413]: Runtime Journal (/run/log/journal/bb7be8b33f59474d94e49e2f97651f13) is 8M, max 78.3M, 70.3M free.
Mar 7 00:46:41.948887 systemd[1]: Queued start job for default target multi-user.target.
Mar 7 00:46:41.966728 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 7 00:46:41.967144 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 7 00:46:41.967463 systemd[1]: systemd-journald.service: Consumed 2.515s CPU time.
Mar 7 00:46:42.714146 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:46:42.721344 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 7 00:46:42.721374 systemd[1]: Stopped verity-setup.service.
Mar 7 00:46:42.728867 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 00:46:42.736503 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 7 00:46:42.741019 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 7 00:46:42.745616 systemd[1]: Mounted media.mount - External Media Directory.
Mar 7 00:46:42.749605 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 7 00:46:42.754337 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 7 00:46:42.758803 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 7 00:46:42.762828 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 7 00:46:42.767654 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:46:42.774268 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 7 00:46:42.774414 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 7 00:46:42.779644 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:46:42.779775 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:46:42.785142 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 00:46:42.785283 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 00:46:42.789968 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:46:42.790086 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:46:42.795545 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 7 00:46:42.795670 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 7 00:46:42.800961 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 00:46:42.801082 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 00:46:42.805980 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:46:42.811128 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 00:46:42.817494 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 7 00:46:42.823268 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 7 00:46:42.828920 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:46:42.842587 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 7 00:46:42.848655 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 7 00:46:42.860629 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 7 00:46:42.865360 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 7 00:46:42.865384 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:46:42.870097 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 7 00:46:42.879309 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 7 00:46:42.883552 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:46:42.884873 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 7 00:46:42.895669 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 7 00:46:42.900919 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:46:42.903314 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 7 00:46:42.908472 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 00:46:42.920295 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 00:46:42.940368 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 7 00:46:42.948360 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 00:46:42.955967 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 7 00:46:42.961569 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 7 00:46:42.975142 systemd-journald[1413]: Time spent on flushing to /var/log/journal/bb7be8b33f59474d94e49e2f97651f13 is 36.239ms for 930 entries.
Mar 7 00:46:42.975142 systemd-journald[1413]: System Journal (/var/log/journal/bb7be8b33f59474d94e49e2f97651f13) is 11.8M, max 2.6G, 2.6G free.
Mar 7 00:46:43.051810 systemd-journald[1413]: Received client request to flush runtime journal.
Mar 7 00:46:43.051873 systemd-journald[1413]: /var/log/journal/bb7be8b33f59474d94e49e2f97651f13/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Mar 7 00:46:43.051893 systemd-journald[1413]: Rotating system journal.
Mar 7 00:46:43.051909 kernel: loop0: detected capacity change from 0 to 27936
Mar 7 00:46:42.969364 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 7 00:46:42.981362 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 7 00:46:42.998957 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 7 00:46:43.053608 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 7 00:46:43.066821 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 7 00:46:43.067639 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 7 00:46:43.091033 systemd-tmpfiles[1450]: ACLs are not supported, ignoring.
Mar 7 00:46:43.091047 systemd-tmpfiles[1450]: ACLs are not supported, ignoring.
Mar 7 00:46:43.093792 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:46:43.102389 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 7 00:46:43.115826 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:46:43.227241 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 7 00:46:43.233829 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 00:46:43.255988 systemd-tmpfiles[1467]: ACLs are not supported, ignoring.
Mar 7 00:46:43.256000 systemd-tmpfiles[1467]: ACLs are not supported, ignoring.
Mar 7 00:46:43.258760 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:46:43.387216 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 7 00:46:43.474212 kernel: loop1: detected capacity change from 0 to 100632
Mar 7 00:46:43.491835 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 7 00:46:43.498561 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:46:43.527127 systemd-udevd[1474]: Using default interface naming scheme 'v255'.
Mar 7 00:46:43.704609 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:46:43.713492 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 00:46:43.771771 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 7 00:46:43.822680 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 7 00:46:43.832602 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 7 00:46:43.838474 kernel: mousedev: PS/2 mouse device common for all mice
Mar 7 00:46:43.906271 kernel: hv_vmbus: registering driver hyperv_fb
Mar 7 00:46:43.906354 kernel: hv_vmbus: registering driver hv_balloon
Mar 7 00:46:43.906369 kernel: loop2: detected capacity change from 0 to 200864
Mar 7 00:46:43.911354 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 7 00:46:43.918304 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 7 00:46:43.925325 kernel: Console: switching to colour dummy device 80x25
Mar 7 00:46:43.934122 kernel: Console: switching to colour frame buffer device 128x48
Mar 7 00:46:43.953798 systemd-networkd[1488]: lo: Link UP
Mar 7 00:46:43.953810 systemd-networkd[1488]: lo: Gained carrier
Mar 7 00:46:43.954901 systemd-networkd[1488]: Enumeration completed
Mar 7 00:46:43.955004 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 00:46:43.964176 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 7 00:46:43.960991 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:46:43.960997 systemd-networkd[1488]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 00:46:43.969249 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 7 00:46:43.969593 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 7 00:46:43.980738 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 7 00:46:44.011206 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#292 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 7 00:46:44.016228 kernel: loop3: detected capacity change from 0 to 119840
Mar 7 00:46:44.031350 kernel: mlx5_core 8645:00:02.0 enP34373s1: Link up
Mar 7 00:46:44.041931 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:46:44.059052 kernel: hv_netvsc 0022487b-fe3b-0022-487b-fe3b0022487b eth0: Data path switched to VF: enP34373s1
Mar 7 00:46:44.058880 systemd-networkd[1488]: enP34373s1: Link UP
Mar 7 00:46:44.058980 systemd-networkd[1488]: eth0: Link UP
Mar 7 00:46:44.058982 systemd-networkd[1488]: eth0: Gained carrier
Mar 7 00:46:44.059003 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:46:44.060996 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:46:44.061243 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:46:44.071401 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:46:44.072933 systemd-networkd[1488]: enP34373s1: Gained carrier
Mar 7 00:46:44.082310 systemd-networkd[1488]: eth0: DHCPv4 address 10.200.20.26/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 7 00:46:44.082719 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 7 00:46:44.135215 kernel: MACsec IEEE 802.1AE
Mar 7 00:46:44.164686 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 7 00:46:44.171324 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 7 00:46:44.209001 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 7 00:46:44.388212 kernel: loop4: detected capacity change from 0 to 27936
Mar 7 00:46:44.402208 kernel: loop5: detected capacity change from 0 to 100632
Mar 7 00:46:44.416212 kernel: loop6: detected capacity change from 0 to 200864
Mar 7 00:46:44.433208 kernel: loop7: detected capacity change from 0 to 119840
Mar 7 00:46:44.443597 (sd-merge)[1618]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 7 00:46:44.443993 (sd-merge)[1618]: Merged extensions into '/usr'.
Mar 7 00:46:44.447615 systemd[1]: Reload requested from client PID 1449 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 7 00:46:44.447720 systemd[1]: Reloading...
Mar 7 00:46:44.507221 zram_generator::config[1651]: No configuration found.
Mar 7 00:46:44.673821 systemd[1]: Reloading finished in 225 ms.
Mar 7 00:46:44.697233 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:46:44.702840 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 7 00:46:44.716153 systemd[1]: Starting ensure-sysext.service...
Mar 7 00:46:44.722319 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 00:46:44.745467 systemd[1]: Reload requested from client PID 1706 ('systemctl') (unit ensure-sysext.service)...
Mar 7 00:46:44.745482 systemd[1]: Reloading...
Mar 7 00:46:44.748222 systemd-tmpfiles[1707]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 7 00:46:44.765152 systemd-tmpfiles[1707]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 7 00:46:44.765522 systemd-tmpfiles[1707]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 7 00:46:44.765765 systemd-tmpfiles[1707]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 7 00:46:44.766325 systemd-tmpfiles[1707]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 7 00:46:44.766583 systemd-tmpfiles[1707]: ACLs are not supported, ignoring.
Mar 7 00:46:44.766704 systemd-tmpfiles[1707]: ACLs are not supported, ignoring.
Mar 7 00:46:44.769262 systemd-tmpfiles[1707]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:46:44.769349 systemd-tmpfiles[1707]: Skipping /boot
Mar 7 00:46:44.775182 systemd-tmpfiles[1707]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:46:44.775292 systemd-tmpfiles[1707]: Skipping /boot
Mar 7 00:46:44.801196 zram_generator::config[1738]: No configuration found.
Mar 7 00:46:44.962033 systemd[1]: Reloading finished in 216 ms.
Mar 7 00:46:44.983774 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:46:45.006791 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 7 00:46:45.018407 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 7 00:46:45.025232 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:46:45.026256 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:46:45.033944 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:46:45.042352 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:46:45.048556 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:46:45.048663 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 7 00:46:45.049597 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 7 00:46:45.057407 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 00:46:45.066487 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 7 00:46:45.073555 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 00:46:45.080435 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 00:46:45.086485 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 00:46:45.086731 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 00:46:45.092553 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 00:46:45.092825 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 00:46:45.107354 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 7 00:46:45.117905 systemd[1]: Finished ensure-sysext.service. Mar 7 00:46:45.124208 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 00:46:45.126344 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 00:46:45.137380 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 00:46:45.145485 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 00:46:45.155493 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 00:46:45.161855 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:46:45.161994 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Mar 7 00:46:45.162042 systemd[1]: Reached target time-set.target - System Time Set. Mar 7 00:46:45.167611 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 00:46:45.167778 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 00:46:45.173663 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 00:46:45.173797 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 00:46:45.181731 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 00:46:45.181865 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 00:46:45.183983 systemd-resolved[1802]: Positive Trust Anchors: Mar 7 00:46:45.184239 systemd-resolved[1802]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 00:46:45.184313 systemd-resolved[1802]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 00:46:45.188880 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 00:46:45.189259 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 00:46:45.196444 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 00:46:45.196515 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Mar 7 00:46:45.202726 systemd-resolved[1802]: Using system hostname 'ci-4459.2.3-n-801efb9c04'. Mar 7 00:46:45.204502 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 00:46:45.209510 systemd[1]: Reached target network.target - Network. Mar 7 00:46:45.213243 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 00:46:45.240214 augenrules[1837]: No rules Mar 7 00:46:45.241068 systemd[1]: audit-rules.service: Deactivated successfully. Mar 7 00:46:45.241327 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 7 00:46:45.252931 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 7 00:46:45.871626 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 7 00:46:45.877709 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 7 00:46:45.882507 systemd-networkd[1488]: eth0: Gained IPv6LL Mar 7 00:46:45.884560 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 7 00:46:45.890503 systemd[1]: Reached target network-online.target - Network is Online. Mar 7 00:46:48.179171 ldconfig[1443]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 7 00:46:48.190086 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 7 00:46:48.196499 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 7 00:46:48.208469 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 7 00:46:48.213621 systemd[1]: Reached target sysinit.target - System Initialization. Mar 7 00:46:48.218480 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Mar 7 00:46:48.224114 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 7 00:46:48.229494 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 7 00:46:48.234073 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 7 00:46:48.239863 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 7 00:46:48.245377 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 00:46:48.245403 systemd[1]: Reached target paths.target - Path Units. Mar 7 00:46:48.249109 systemd[1]: Reached target timers.target - Timer Units. Mar 7 00:46:48.260339 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 00:46:48.266617 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 00:46:48.272296 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 7 00:46:48.278046 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 7 00:46:48.283764 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 7 00:46:48.290163 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 7 00:46:48.295269 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 7 00:46:48.301158 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 00:46:48.306032 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 00:46:48.310206 systemd[1]: Reached target basic.target - Basic System. Mar 7 00:46:48.314259 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. 
Mar 7 00:46:48.314281 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:46:48.317328 systemd[1]: Starting chronyd.service - NTP client/server... Mar 7 00:46:48.332286 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 00:46:48.338322 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 00:46:48.348328 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 00:46:48.355008 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 7 00:46:48.363130 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 00:46:48.371341 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 00:46:48.376303 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 00:46:48.377121 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Mar 7 00:46:48.382793 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Mar 7 00:46:48.385378 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:46:48.387015 KVP[1861]: KVP starting; pid is:1861 Mar 7 00:46:48.387253 chronyd[1851]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Mar 7 00:46:48.391396 jq[1859]: false Mar 7 00:46:48.392874 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 00:46:48.399566 KVP[1861]: KVP LIC Version: 3.1 Mar 7 00:46:48.400261 kernel: hv_utils: KVP IC version 4.0 Mar 7 00:46:48.402335 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Mar 7 00:46:48.408295 chronyd[1851]: Timezone right/UTC failed leap second check, ignoring Mar 7 00:46:48.408633 chronyd[1851]: Loaded seccomp filter (level 2) Mar 7 00:46:48.411342 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 00:46:48.418897 extend-filesystems[1860]: Found /dev/sda6 Mar 7 00:46:48.425353 extend-filesystems[1860]: Found /dev/sda9 Mar 7 00:46:48.423346 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 00:46:48.439570 extend-filesystems[1860]: Checking size of /dev/sda9 Mar 7 00:46:48.436396 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 00:46:48.453321 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 7 00:46:48.459228 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 00:46:48.459831 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 00:46:48.461348 systemd[1]: Starting update-engine.service - Update Engine... Mar 7 00:46:48.467289 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 00:46:48.475821 systemd[1]: Started chronyd.service - NTP client/server. Mar 7 00:46:48.480230 extend-filesystems[1860]: Old size kept for /dev/sda9 Mar 7 00:46:48.491142 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 7 00:46:48.497011 jq[1888]: true Mar 7 00:46:48.498252 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 7 00:46:48.498535 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 7 00:46:48.498829 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 7 00:46:48.499065 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 7 00:46:48.507169 systemd[1]: motdgen.service: Deactivated successfully. Mar 7 00:46:48.508319 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 7 00:46:48.519378 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 7 00:46:48.526115 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 7 00:46:48.528231 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 7 00:46:48.554338 jq[1903]: true Mar 7 00:46:48.556820 update_engine[1887]: I20260307 00:46:48.556749 1887 main.cc:92] Flatcar Update Engine starting Mar 7 00:46:48.559772 (ntainerd)[1905]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 7 00:46:48.585443 systemd-logind[1883]: New seat seat0. Mar 7 00:46:48.589051 systemd-logind[1883]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 7 00:46:48.589243 systemd[1]: Started systemd-logind.service - User Login Management. Mar 7 00:46:48.607369 tar[1900]: linux-arm64/LICENSE Mar 7 00:46:48.612169 tar[1900]: linux-arm64/helm Mar 7 00:46:48.660847 dbus-daemon[1854]: [system] SELinux support is enabled Mar 7 00:46:48.662568 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 7 00:46:48.673626 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 7 00:46:48.673997 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 7 00:46:48.692065 update_engine[1887]: I20260307 00:46:48.686331 1887 update_check_scheduler.cc:74] Next update check in 2m13s Mar 7 00:46:48.686461 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 7 00:46:48.686478 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 7 00:46:48.700823 systemd[1]: Started update-engine.service - Update Engine. Mar 7 00:46:48.708286 dbus-daemon[1854]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 7 00:46:48.713265 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 7 00:46:48.723689 bash[1957]: Updated "/home/core/.ssh/authorized_keys" Mar 7 00:46:48.728451 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 7 00:46:48.752345 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Mar 7 00:46:48.775178 coreos-metadata[1853]: Mar 07 00:46:48.775 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 7 00:46:48.781906 coreos-metadata[1853]: Mar 07 00:46:48.781 INFO Fetch successful Mar 7 00:46:48.781906 coreos-metadata[1853]: Mar 07 00:46:48.781 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Mar 7 00:46:48.788061 coreos-metadata[1853]: Mar 07 00:46:48.788 INFO Fetch successful Mar 7 00:46:48.788061 coreos-metadata[1853]: Mar 07 00:46:48.788 INFO Fetching http://168.63.129.16/machine/86bc97df-4764-43a3-9405-d41bbb22c569/84daa868%2D2dfc%2D42ca%2Da186%2D43cbf2149a1f.%5Fci%2D4459.2.3%2Dn%2D801efb9c04?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Mar 7 00:46:48.791736 coreos-metadata[1853]: Mar 07 00:46:48.791 INFO Fetch successful Mar 7 00:46:48.791736 coreos-metadata[1853]: Mar 07 00:46:48.791 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Mar 7 00:46:48.804770 coreos-metadata[1853]: Mar 07 00:46:48.804 INFO Fetch successful Mar 7 00:46:48.854027 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 7 00:46:48.862633 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 7 00:46:48.952465 sshd_keygen[1886]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 7 00:46:48.971302 locksmithd[1984]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 7 00:46:48.976640 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 7 00:46:48.983458 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 7 00:46:48.993738 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 7 00:46:49.014128 systemd[1]: issuegen.service: Deactivated successfully. Mar 7 00:46:49.016337 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Mar 7 00:46:49.027866 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 7 00:46:49.035013 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 7 00:46:49.061209 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 7 00:46:49.069383 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 7 00:46:49.077252 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 7 00:46:49.085594 systemd[1]: Reached target getty.target - Login Prompts. Mar 7 00:46:49.164864 tar[1900]: linux-arm64/README.md Mar 7 00:46:49.179927 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 7 00:46:49.322437 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:46:49.360766 containerd[1905]: time="2026-03-07T00:46:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 7 00:46:49.361249 containerd[1905]: time="2026-03-07T00:46:49.361223684Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 7 00:46:49.366932 containerd[1905]: time="2026-03-07T00:46:49.366901724Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.552µs" Mar 7 00:46:49.366932 containerd[1905]: time="2026-03-07T00:46:49.366926428Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 7 00:46:49.367011 containerd[1905]: time="2026-03-07T00:46:49.366940260Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 7 00:46:49.367095 containerd[1905]: time="2026-03-07T00:46:49.367074996Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 7 00:46:49.367095 
containerd[1905]: time="2026-03-07T00:46:49.367091204Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 7 00:46:49.367121 containerd[1905]: time="2026-03-07T00:46:49.367109052Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367164 containerd[1905]: time="2026-03-07T00:46:49.367151116Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367164 containerd[1905]: time="2026-03-07T00:46:49.367161964Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367387 containerd[1905]: time="2026-03-07T00:46:49.367365844Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367409 containerd[1905]: time="2026-03-07T00:46:49.367386668Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367409 containerd[1905]: time="2026-03-07T00:46:49.367395596Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367409 containerd[1905]: time="2026-03-07T00:46:49.367400996Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367473 containerd[1905]: time="2026-03-07T00:46:49.367460516Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367639 containerd[1905]: time="2026-03-07T00:46:49.367622276Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367663 containerd[1905]: time="2026-03-07T00:46:49.367649596Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 7 00:46:49.367676 containerd[1905]: time="2026-03-07T00:46:49.367663668Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 7 00:46:49.367691 containerd[1905]: time="2026-03-07T00:46:49.367686660Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 7 00:46:49.367868 containerd[1905]: time="2026-03-07T00:46:49.367848916Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 7 00:46:49.367926 containerd[1905]: time="2026-03-07T00:46:49.367906772Z" level=info msg="metadata content store policy set" policy=shared Mar 7 00:46:49.380874 containerd[1905]: time="2026-03-07T00:46:49.380843036Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 7 00:46:49.380930 containerd[1905]: time="2026-03-07T00:46:49.380891164Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 7 00:46:49.380930 containerd[1905]: time="2026-03-07T00:46:49.380901652Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 7 00:46:49.380930 containerd[1905]: time="2026-03-07T00:46:49.380914332Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 7 00:46:49.380930 containerd[1905]: time="2026-03-07T00:46:49.380925732Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: 
time="2026-03-07T00:46:49.380932796Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380945644Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380954044Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380960580Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380966516Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380972476Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 7 00:46:49.380991 containerd[1905]: time="2026-03-07T00:46:49.380980556Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 7 00:46:49.381110 containerd[1905]: time="2026-03-07T00:46:49.381086844Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 7 00:46:49.381110 containerd[1905]: time="2026-03-07T00:46:49.381105964Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 7 00:46:49.381150 containerd[1905]: time="2026-03-07T00:46:49.381115492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 7 00:46:49.381150 containerd[1905]: time="2026-03-07T00:46:49.381124260Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 7 00:46:49.381150 containerd[1905]: time="2026-03-07T00:46:49.381131500Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 7 00:46:49.381150 containerd[1905]: time="2026-03-07T00:46:49.381138692Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 7 00:46:49.381150 containerd[1905]: time="2026-03-07T00:46:49.381145684Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 7 00:46:49.381245 containerd[1905]: time="2026-03-07T00:46:49.381152076Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 7 00:46:49.381245 containerd[1905]: time="2026-03-07T00:46:49.381159212Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 7 00:46:49.381245 containerd[1905]: time="2026-03-07T00:46:49.381166444Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 7 00:46:49.381245 containerd[1905]: time="2026-03-07T00:46:49.381172716Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 7 00:46:49.381245 containerd[1905]: time="2026-03-07T00:46:49.381237644Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 7 00:46:49.381319 containerd[1905]: time="2026-03-07T00:46:49.381250100Z" level=info msg="Start snapshots syncer" Mar 7 00:46:49.381319 containerd[1905]: time="2026-03-07T00:46:49.381281484Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 7 00:46:49.381498 containerd[1905]: time="2026-03-07T00:46:49.381462324Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 7 00:46:49.381625 containerd[1905]: time="2026-03-07T00:46:49.381503428Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 7 00:46:49.381625 containerd[1905]: time="2026-03-07T00:46:49.381539212Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 7 00:46:49.381666 containerd[1905]: time="2026-03-07T00:46:49.381633660Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 7 00:46:49.381666 containerd[1905]: time="2026-03-07T00:46:49.381648140Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 7 00:46:49.381666 containerd[1905]: time="2026-03-07T00:46:49.381654708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 7 00:46:49.381666 containerd[1905]: time="2026-03-07T00:46:49.381662388Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381671164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381678620Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381684940Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381701308Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381709100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 7 00:46:49.381717 containerd[1905]: time="2026-03-07T00:46:49.381715620Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381747596Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381757892Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381763572Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381769148Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381773740Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381779452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381786292Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381801012Z" level=info msg="runtime interface created" Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381804404Z" level=info msg="created NRI interface" Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381811860Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381819676Z" level=info msg="Connect containerd service" Mar 7 00:46:49.381874 containerd[1905]: time="2026-03-07T00:46:49.381834332Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 7 00:46:49.382441 containerd[1905]: 
time="2026-03-07T00:46:49.382416756Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 00:46:49.383060 (kubelet)[2050]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:46:49.695471 kubelet[2050]: E0307 00:46:49.695349 2050 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:46:49.697603 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:46:49.697821 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:46:49.698296 systemd[1]: kubelet.service: Consumed 504ms CPU time, 248.6M memory peak. Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745438364Z" level=info msg="Start subscribing containerd event" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745525076Z" level=info msg="Start recovering state" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745608052Z" level=info msg="Start event monitor" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745617396Z" level=info msg="Start cni network conf syncer for default" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745623924Z" level=info msg="Start streaming server" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745632788Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745639148Z" level=info msg="runtime interface starting up..." 
Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745642876Z" level=info msg="starting plugins..." Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745654276Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745619452Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745766084Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 7 00:46:49.751137 containerd[1905]: time="2026-03-07T00:46:49.745817332Z" level=info msg="containerd successfully booted in 0.385382s" Mar 7 00:46:49.745966 systemd[1]: Started containerd.service - containerd container runtime. Mar 7 00:46:49.752105 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 7 00:46:49.762259 systemd[1]: Startup finished in 1.747s (kernel) + 11.737s (initrd) + 10.714s (userspace) = 24.200s. Mar 7 00:46:50.108400 login[2041]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Mar 7 00:46:50.108605 login[2040]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:46:50.113992 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 7 00:46:50.114915 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 7 00:46:50.121469 systemd-logind[1883]: New session 2 of user core. Mar 7 00:46:50.153774 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 7 00:46:50.155687 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 7 00:46:50.162896 (systemd)[2077]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 7 00:46:50.165077 systemd-logind[1883]: New session c1 of user core. Mar 7 00:46:50.296710 systemd[2077]: Queued start job for default target default.target. 
Mar 7 00:46:50.308040 systemd[2077]: Created slice app.slice - User Application Slice. Mar 7 00:46:50.308182 systemd[2077]: Reached target paths.target - Paths. Mar 7 00:46:50.308355 systemd[2077]: Reached target timers.target - Timers. Mar 7 00:46:50.309514 systemd[2077]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 7 00:46:50.319255 systemd[2077]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 7 00:46:50.319308 systemd[2077]: Reached target sockets.target - Sockets. Mar 7 00:46:50.319343 systemd[2077]: Reached target basic.target - Basic System. Mar 7 00:46:50.319364 systemd[2077]: Reached target default.target - Main User Target. Mar 7 00:46:50.319391 systemd[2077]: Startup finished in 149ms. Mar 7 00:46:50.319493 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 7 00:46:50.326462 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 7 00:46:50.684446 waagent[2038]: 2026-03-07T00:46:50.684372Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Mar 7 00:46:50.689582 waagent[2038]: 2026-03-07T00:46:50.689532Z INFO Daemon Daemon OS: flatcar 4459.2.3 Mar 7 00:46:50.693216 waagent[2038]: 2026-03-07T00:46:50.693152Z INFO Daemon Daemon Python: 3.11.13 Mar 7 00:46:50.698199 waagent[2038]: 2026-03-07T00:46:50.696622Z INFO Daemon Daemon Run daemon Mar 7 00:46:50.700015 waagent[2038]: 2026-03-07T00:46:50.699982Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.3' Mar 7 00:46:50.707310 waagent[2038]: 2026-03-07T00:46:50.707276Z INFO Daemon Daemon Using waagent for provisioning Mar 7 00:46:50.711228 waagent[2038]: 2026-03-07T00:46:50.711171Z INFO Daemon Daemon Activate resource disk Mar 7 00:46:50.715514 waagent[2038]: 2026-03-07T00:46:50.715479Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 7 00:46:50.724645 waagent[2038]: 2026-03-07T00:46:50.724605Z INFO Daemon Daemon Found device: None
Mar 7 00:46:50.728487 waagent[2038]: 2026-03-07T00:46:50.728454Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 7 00:46:50.734989 waagent[2038]: 2026-03-07T00:46:50.734961Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 7 00:46:50.744486 waagent[2038]: 2026-03-07T00:46:50.744443Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 7 00:46:50.749115 waagent[2038]: 2026-03-07T00:46:50.749082Z INFO Daemon Daemon Running default provisioning handler Mar 7 00:46:50.758452 waagent[2038]: 2026-03-07T00:46:50.758403Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Mar 7 00:46:50.769929 waagent[2038]: 2026-03-07T00:46:50.769885Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 7 00:46:50.778769 waagent[2038]: 2026-03-07T00:46:50.778727Z INFO Daemon Daemon cloud-init is enabled: False Mar 7 00:46:50.782716 waagent[2038]: 2026-03-07T00:46:50.782683Z INFO Daemon Daemon Copying ovf-env.xml Mar 7 00:46:50.834109 waagent[2038]: 2026-03-07T00:46:50.833983Z INFO Daemon Daemon Successfully mounted dvd Mar 7 00:46:50.861754 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 7 00:46:50.864566 waagent[2038]: 2026-03-07T00:46:50.864517Z INFO Daemon Daemon Detect protocol endpoint Mar 7 00:46:50.869210 waagent[2038]: 2026-03-07T00:46:50.868855Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 7 00:46:50.873838 waagent[2038]: 2026-03-07T00:46:50.873808Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Mar 7 00:46:50.878645 waagent[2038]: 2026-03-07T00:46:50.878615Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 7 00:46:50.882521 waagent[2038]: 2026-03-07T00:46:50.882489Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 7 00:46:50.886430 waagent[2038]: 2026-03-07T00:46:50.886403Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 7 00:46:50.938786 waagent[2038]: 2026-03-07T00:46:50.938694Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 7 00:46:50.943884 waagent[2038]: 2026-03-07T00:46:50.943861Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 7 00:46:50.947824 waagent[2038]: 2026-03-07T00:46:50.947797Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 7 00:46:51.060232 waagent[2038]: 2026-03-07T00:46:51.059718Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 7 00:46:51.064929 waagent[2038]: 2026-03-07T00:46:51.064884Z INFO Daemon Daemon Forcing an update of the goal state. Mar 7 00:46:51.072605 waagent[2038]: 2026-03-07T00:46:51.072566Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 7 00:46:51.109926 login[2041]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:46:51.114252 systemd-logind[1883]: New session 1 of user core. Mar 7 00:46:51.127322 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 7 00:46:51.166994 waagent[2038]: 2026-03-07T00:46:51.166521Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 7 00:46:51.171773 waagent[2038]: 2026-03-07T00:46:51.171387Z INFO Daemon Mar 7 00:46:51.173853 waagent[2038]: 2026-03-07T00:46:51.173822Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 48eb2119-25e2-48c5-b37c-8a63e1917c54 eTag: 2596505943338096898 source: Fabric] Mar 7 00:46:51.181896 waagent[2038]: 2026-03-07T00:46:51.181861Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Mar 7 00:46:51.186880 waagent[2038]: 2026-03-07T00:46:51.186849Z INFO Daemon Mar 7 00:46:51.189138 waagent[2038]: 2026-03-07T00:46:51.189082Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 7 00:46:51.198458 waagent[2038]: 2026-03-07T00:46:51.198431Z INFO Daemon Daemon Downloading artifacts profile blob Mar 7 00:46:51.299744 waagent[2038]: 2026-03-07T00:46:51.299676Z INFO Daemon Downloaded certificate {'thumbprint': '29CB6E139D4D986A0912C78381B41659BE358687', 'hasPrivateKey': True} Mar 7 00:46:51.307003 waagent[2038]: 2026-03-07T00:46:51.306963Z INFO Daemon Fetch goal state completed Mar 7 00:46:51.317545 waagent[2038]: 2026-03-07T00:46:51.317499Z INFO Daemon Daemon Starting provisioning Mar 7 00:46:51.321223 waagent[2038]: 2026-03-07T00:46:51.321191Z INFO Daemon Daemon Handle ovf-env.xml. Mar 7 00:46:51.324812 waagent[2038]: 2026-03-07T00:46:51.324785Z INFO Daemon Daemon Set hostname [ci-4459.2.3-n-801efb9c04] Mar 7 00:46:51.331123 waagent[2038]: 2026-03-07T00:46:51.331075Z INFO Daemon Daemon Publish hostname [ci-4459.2.3-n-801efb9c04] Mar 7 00:46:51.335742 waagent[2038]: 2026-03-07T00:46:51.335704Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 7 00:46:51.340336 waagent[2038]: 2026-03-07T00:46:51.340305Z INFO Daemon Daemon Primary interface is [eth0] Mar 7 00:46:51.365504 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:46:51.365511 systemd-networkd[1488]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Mar 7 00:46:51.365565 systemd-networkd[1488]: eth0: DHCP lease lost Mar 7 00:46:51.366355 waagent[2038]: 2026-03-07T00:46:51.366289Z INFO Daemon Daemon Create user account if not exists Mar 7 00:46:51.370635 waagent[2038]: 2026-03-07T00:46:51.370593Z INFO Daemon Daemon User core already exists, skip useradd Mar 7 00:46:51.374799 waagent[2038]: 2026-03-07T00:46:51.374760Z INFO Daemon Daemon Configure sudoer Mar 7 00:46:51.382293 waagent[2038]: 2026-03-07T00:46:51.382155Z INFO Daemon Daemon Configure sshd Mar 7 00:46:51.387916 waagent[2038]: 2026-03-07T00:46:51.387866Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 7 00:46:51.397322 waagent[2038]: 2026-03-07T00:46:51.397271Z INFO Daemon Daemon Deploy ssh public key. Mar 7 00:46:51.402339 systemd-networkd[1488]: eth0: DHCPv4 address 10.200.20.26/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 7 00:46:52.487210 waagent[2038]: 2026-03-07T00:46:52.486665Z INFO Daemon Daemon Provisioning complete Mar 7 00:46:52.501405 waagent[2038]: 2026-03-07T00:46:52.501364Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 7 00:46:52.506553 waagent[2038]: 2026-03-07T00:46:52.506515Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
Mar 7 00:46:52.514633 waagent[2038]: 2026-03-07T00:46:52.514600Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Mar 7 00:46:52.615235 waagent[2127]: 2026-03-07T00:46:52.614288Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Mar 7 00:46:52.615235 waagent[2127]: 2026-03-07T00:46:52.614429Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.3 Mar 7 00:46:52.615235 waagent[2127]: 2026-03-07T00:46:52.614467Z INFO ExtHandler ExtHandler Python: 3.11.13 Mar 7 00:46:52.615235 waagent[2127]: 2026-03-07T00:46:52.614502Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Mar 7 00:46:52.654604 waagent[2127]: 2026-03-07T00:46:52.654534Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.3; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Mar 7 00:46:52.654774 waagent[2127]: 2026-03-07T00:46:52.654743Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 00:46:52.654817 waagent[2127]: 2026-03-07T00:46:52.654798Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 00:46:52.660744 waagent[2127]: 2026-03-07T00:46:52.660698Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 7 00:46:52.665630 waagent[2127]: 2026-03-07T00:46:52.665599Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 7 00:46:52.665996 waagent[2127]: 2026-03-07T00:46:52.665964Z INFO ExtHandler Mar 7 00:46:52.666050 waagent[2127]: 2026-03-07T00:46:52.666032Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 17cb905f-bf37-4d45-8b72-3a4194c86d2e eTag: 2596505943338096898 source: Fabric] Mar 7 00:46:52.666320 waagent[2127]: 2026-03-07T00:46:52.666290Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Mar 7 00:46:52.666747 waagent[2127]: 2026-03-07T00:46:52.666717Z INFO ExtHandler Mar 7 00:46:52.666784 waagent[2127]: 2026-03-07T00:46:52.666767Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 7 00:46:52.671019 waagent[2127]: 2026-03-07T00:46:52.670988Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 7 00:46:52.737915 waagent[2127]: 2026-03-07T00:46:52.737800Z INFO ExtHandler Downloaded certificate {'thumbprint': '29CB6E139D4D986A0912C78381B41659BE358687', 'hasPrivateKey': True} Mar 7 00:46:52.738269 waagent[2127]: 2026-03-07T00:46:52.738237Z INFO ExtHandler Fetch goal state completed Mar 7 00:46:52.751129 waagent[2127]: 2026-03-07T00:46:52.751080Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.4 27 Jan 2026 (Library: OpenSSL 3.4.4 27 Jan 2026) Mar 7 00:46:52.754628 waagent[2127]: 2026-03-07T00:46:52.754579Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2127 Mar 7 00:46:52.754733 waagent[2127]: 2026-03-07T00:46:52.754705Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 7 00:46:52.754976 waagent[2127]: 2026-03-07T00:46:52.754947Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Mar 7 00:46:52.756083 waagent[2127]: 2026-03-07T00:46:52.756047Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] Mar 7 00:46:52.756481 waagent[2127]: 2026-03-07T00:46:52.756446Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 7 00:46:52.756603 waagent[2127]: 2026-03-07T00:46:52.756578Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 7 00:46:52.757021 waagent[2127]: 2026-03-07T00:46:52.756989Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Mar 7 00:46:52.925445 waagent[2127]: 2026-03-07T00:46:52.925403Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 7 00:46:52.925635 waagent[2127]: 2026-03-07T00:46:52.925606Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 7 00:46:52.930550 waagent[2127]: 2026-03-07T00:46:52.930150Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 7 00:46:52.934947 systemd[1]: Reload requested from client PID 2142 ('systemctl') (unit waagent.service)... Mar 7 00:46:52.935148 systemd[1]: Reloading... Mar 7 00:46:53.011505 zram_generator::config[2196]: No configuration found. Mar 7 00:46:53.150486 systemd[1]: Reloading finished in 214 ms. Mar 7 00:46:53.164073 waagent[2127]: 2026-03-07T00:46:53.163997Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 7 00:46:53.164166 waagent[2127]: 2026-03-07T00:46:53.164145Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 7 00:46:53.399116 waagent[2127]: 2026-03-07T00:46:53.399025Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 7 00:46:53.399428 waagent[2127]: 2026-03-07T00:46:53.399388Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 7 00:46:53.400121 waagent[2127]: 2026-03-07T00:46:53.400072Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 7 00:46:53.400426 waagent[2127]: 2026-03-07T00:46:53.400388Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Mar 7 00:46:53.401266 waagent[2127]: 2026-03-07T00:46:53.400628Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 00:46:53.401266 waagent[2127]: 2026-03-07T00:46:53.400702Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 00:46:53.401266 waagent[2127]: 2026-03-07T00:46:53.400870Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 7 00:46:53.401266 waagent[2127]: 2026-03-07T00:46:53.401015Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 7 00:46:53.401266 waagent[2127]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 7 00:46:53.401266 waagent[2127]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 7 00:46:53.401266 waagent[2127]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 7 00:46:53.401266 waagent[2127]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 7 00:46:53.401266 waagent[2127]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 7 00:46:53.401266 waagent[2127]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 7 00:46:53.401540 waagent[2127]: 2026-03-07T00:46:53.401505Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 7 00:46:53.401753 waagent[2127]: 2026-03-07T00:46:53.401721Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Mar 7 00:46:53.401834 waagent[2127]: 2026-03-07T00:46:53.401793Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 7 00:46:53.401892 waagent[2127]: 2026-03-07T00:46:53.401860Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 7 00:46:53.401997 waagent[2127]: 2026-03-07T00:46:53.401969Z INFO EnvHandler ExtHandler Configure routes Mar 7 00:46:53.402041 waagent[2127]: 2026-03-07T00:46:53.402019Z INFO EnvHandler ExtHandler Gateway:None Mar 7 00:46:53.402443 waagent[2127]: 2026-03-07T00:46:53.402337Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 7 00:46:53.402443 waagent[2127]: 2026-03-07T00:46:53.402389Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 7 00:46:53.402509 waagent[2127]: 2026-03-07T00:46:53.402487Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 7 00:46:53.402737 waagent[2127]: 2026-03-07T00:46:53.402708Z INFO EnvHandler ExtHandler Routes:None Mar 7 00:46:53.410216 waagent[2127]: 2026-03-07T00:46:53.409010Z INFO ExtHandler ExtHandler Mar 7 00:46:53.410216 waagent[2127]: 2026-03-07T00:46:53.409089Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: bad64013-5603-48fe-b2ed-bd6c4ee648ab correlation 43bec9bf-b4cb-42d0-b906-2151a670ded3 created: 2026-03-07T00:45:55.135834Z] Mar 7 00:46:53.410216 waagent[2127]: 2026-03-07T00:46:53.409392Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 7 00:46:53.410216 waagent[2127]: 2026-03-07T00:46:53.409818Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Mar 7 00:46:53.435412 waagent[2127]: 2026-03-07T00:46:53.435363Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Mar 7 00:46:53.435412 waagent[2127]: Try `iptables -h' or 'iptables --help' for more information.) Mar 7 00:46:53.435940 waagent[2127]: 2026-03-07T00:46:53.435904Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 4D074CFE-D4B5-4FD8-8E27-A230D866C377;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Mar 7 00:46:53.456685 waagent[2127]: 2026-03-07T00:46:53.456610Z INFO MonitorHandler ExtHandler Network interfaces: Mar 7 00:46:53.456685 waagent[2127]: Executing ['ip', '-a', '-o', 'link']: Mar 7 00:46:53.456685 waagent[2127]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 7 00:46:53.456685 waagent[2127]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:7b:fe:3b brd ff:ff:ff:ff:ff:ff Mar 7 00:46:53.456685 waagent[2127]: 3: enP34373s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:7b:fe:3b brd ff:ff:ff:ff:ff:ff\ altname enP34373p0s2 Mar 7 00:46:53.456685 waagent[2127]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 7 00:46:53.456685 waagent[2127]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 7 00:46:53.456685 waagent[2127]: 2: eth0 inet 10.200.20.26/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 7 00:46:53.456685 waagent[2127]: Executing ['ip', '-6', '-a', '-o', 'address']:
Mar 7 00:46:53.456685 waagent[2127]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 7 00:46:53.456685 waagent[2127]: 2: eth0 inet6 fe80::222:48ff:fe7b:fe3b/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 7 00:46:53.495270 waagent[2127]: 2026-03-07T00:46:53.495030Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 7 00:46:53.495270 waagent[2127]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 00:46:53.495270 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.495270 waagent[2127]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 7 00:46:53.495270 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.495270 waagent[2127]: Chain OUTPUT (policy ACCEPT 6 packets, 707 bytes) Mar 7 00:46:53.495270 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.495270 waagent[2127]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 7 00:46:53.495270 waagent[2127]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 7 00:46:53.495270 waagent[2127]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 7 00:46:53.497697 waagent[2127]: 2026-03-07T00:46:53.497645Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 7 00:46:53.497697 waagent[2127]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 7 00:46:53.497697 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.497697 waagent[2127]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 7 00:46:53.497697 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.497697 waagent[2127]: Chain OUTPUT (policy ACCEPT 6 packets, 707 bytes) Mar 7 00:46:53.497697 waagent[2127]: pkts bytes target prot opt in out source destination Mar 7 00:46:53.497697 waagent[2127]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 7 00:46:53.497697 waagent[2127]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 7 00:46:53.497697 waagent[2127]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 7 00:46:53.497895 waagent[2127]: 2026-03-07T00:46:53.497868Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 7 00:46:59.948638 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 7 00:46:59.950376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:47:00.062485 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:47:00.071623 (kubelet)[2276]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:47:00.173830 kubelet[2276]: E0307 00:47:00.173781 2276 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:47:00.176606 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:47:00.176718 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:47:00.176995 systemd[1]: kubelet.service: Consumed 111ms CPU time, 106.7M memory peak. Mar 7 00:47:10.427262 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 7 00:47:10.428536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:47:10.528486 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:10.531740 (kubelet)[2291]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:47:10.556929 kubelet[2291]: E0307 00:47:10.556876 2291 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:47:10.559018 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:47:10.559133 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:47:10.559613 systemd[1]: kubelet.service: Consumed 106ms CPU time, 107M memory peak. Mar 7 00:47:11.422502 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 7 00:47:11.424379 systemd[1]: Started sshd@0-10.200.20.26:22-10.200.16.10:55238.service - OpenSSH per-connection server daemon (10.200.16.10:55238). Mar 7 00:47:12.206131 chronyd[1851]: Selected source PHC0 Mar 7 00:47:14.868248 sshd[2299]: Accepted publickey for core from 10.200.16.10 port 55238 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:14.869171 sshd-session[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:14.872836 systemd-logind[1883]: New session 3 of user core. Mar 7 00:47:14.881542 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 7 00:47:15.191803 systemd[1]: Started sshd@1-10.200.20.26:22-10.200.16.10:55248.service - OpenSSH per-connection server daemon (10.200.16.10:55248). 
Mar 7 00:47:15.613099 sshd[2305]: Accepted publickey for core from 10.200.16.10 port 55248 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:15.613918 sshd-session[2305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:15.617450 systemd-logind[1883]: New session 4 of user core. Mar 7 00:47:15.631347 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 7 00:47:15.849236 sshd[2308]: Connection closed by 10.200.16.10 port 55248 Mar 7 00:47:15.849821 sshd-session[2305]: pam_unix(sshd:session): session closed for user core Mar 7 00:47:15.852961 systemd[1]: sshd@1-10.200.20.26:22-10.200.16.10:55248.service: Deactivated successfully. Mar 7 00:47:15.854360 systemd[1]: session-4.scope: Deactivated successfully. Mar 7 00:47:15.855527 systemd-logind[1883]: Session 4 logged out. Waiting for processes to exit. Mar 7 00:47:15.856458 systemd-logind[1883]: Removed session 4. Mar 7 00:47:15.944896 systemd[1]: Started sshd@2-10.200.20.26:22-10.200.16.10:55254.service - OpenSSH per-connection server daemon (10.200.16.10:55254). Mar 7 00:47:16.371400 sshd[2314]: Accepted publickey for core from 10.200.16.10 port 55254 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:16.372374 sshd-session[2314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:16.376714 systemd-logind[1883]: New session 5 of user core. Mar 7 00:47:16.385341 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 7 00:47:16.601734 sshd[2317]: Connection closed by 10.200.16.10 port 55254 Mar 7 00:47:16.602317 sshd-session[2314]: pam_unix(sshd:session): session closed for user core Mar 7 00:47:16.606145 systemd-logind[1883]: Session 5 logged out. Waiting for processes to exit. Mar 7 00:47:16.606772 systemd[1]: sshd@2-10.200.20.26:22-10.200.16.10:55254.service: Deactivated successfully. Mar 7 00:47:16.608216 systemd[1]: session-5.scope: Deactivated successfully. 
Mar 7 00:47:16.609496 systemd-logind[1883]: Removed session 5. Mar 7 00:47:16.700976 systemd[1]: Started sshd@3-10.200.20.26:22-10.200.16.10:55262.service - OpenSSH per-connection server daemon (10.200.16.10:55262). Mar 7 00:47:17.117662 sshd[2323]: Accepted publickey for core from 10.200.16.10 port 55262 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:17.118841 sshd-session[2323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:17.122462 systemd-logind[1883]: New session 6 of user core. Mar 7 00:47:17.127488 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 7 00:47:17.351416 sshd[2326]: Connection closed by 10.200.16.10 port 55262 Mar 7 00:47:17.351991 sshd-session[2323]: pam_unix(sshd:session): session closed for user core Mar 7 00:47:17.355736 systemd[1]: sshd@3-10.200.20.26:22-10.200.16.10:55262.service: Deactivated successfully. Mar 7 00:47:17.357599 systemd[1]: session-6.scope: Deactivated successfully. Mar 7 00:47:17.360270 systemd-logind[1883]: Session 6 logged out. Waiting for processes to exit. Mar 7 00:47:17.361252 systemd-logind[1883]: Removed session 6. Mar 7 00:47:17.452026 systemd[1]: Started sshd@4-10.200.20.26:22-10.200.16.10:55278.service - OpenSSH per-connection server daemon (10.200.16.10:55278). Mar 7 00:47:17.872790 sshd[2332]: Accepted publickey for core from 10.200.16.10 port 55278 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:17.873924 sshd-session[2332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:17.877510 systemd-logind[1883]: New session 7 of user core. Mar 7 00:47:17.884496 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 7 00:47:18.199346 sudo[2336]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 7 00:47:18.199605 sudo[2336]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 00:47:18.230670 sudo[2336]: pam_unix(sudo:session): session closed for user root Mar 7 00:47:18.308336 sshd[2335]: Connection closed by 10.200.16.10 port 55278 Mar 7 00:47:18.309023 sshd-session[2332]: pam_unix(sshd:session): session closed for user core Mar 7 00:47:18.312831 systemd[1]: sshd@4-10.200.20.26:22-10.200.16.10:55278.service: Deactivated successfully. Mar 7 00:47:18.314131 systemd[1]: session-7.scope: Deactivated successfully. Mar 7 00:47:18.315676 systemd-logind[1883]: Session 7 logged out. Waiting for processes to exit. Mar 7 00:47:18.317407 systemd-logind[1883]: Removed session 7. Mar 7 00:47:18.396343 systemd[1]: Started sshd@5-10.200.20.26:22-10.200.16.10:55280.service - OpenSSH per-connection server daemon (10.200.16.10:55280). Mar 7 00:47:18.815740 sshd[2342]: Accepted publickey for core from 10.200.16.10 port 55280 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:47:18.816533 sshd-session[2342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:47:18.820122 systemd-logind[1883]: New session 8 of user core. Mar 7 00:47:18.829539 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 7 00:47:18.973182 sudo[2347]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 7 00:47:18.973790 sudo[2347]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:47:18.980551 sudo[2347]: pam_unix(sudo:session): session closed for user root
Mar 7 00:47:18.984653 sudo[2346]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 7 00:47:18.984867 sudo[2346]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:47:18.992613 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 7 00:47:19.024658 augenrules[2369]: No rules
Mar 7 00:47:19.026132 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 7 00:47:19.026495 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 7 00:47:19.029786 sudo[2346]: pam_unix(sudo:session): session closed for user root
Mar 7 00:47:19.107392 sshd[2345]: Connection closed by 10.200.16.10 port 55280
Mar 7 00:47:19.107884 sshd-session[2342]: pam_unix(sshd:session): session closed for user core
Mar 7 00:47:19.111864 systemd-logind[1883]: Session 8 logged out. Waiting for processes to exit.
Mar 7 00:47:19.112634 systemd[1]: sshd@5-10.200.20.26:22-10.200.16.10:55280.service: Deactivated successfully.
Mar 7 00:47:19.115477 systemd[1]: session-8.scope: Deactivated successfully.
Mar 7 00:47:19.116710 systemd-logind[1883]: Removed session 8.
Mar 7 00:47:19.195890 systemd[1]: Started sshd@6-10.200.20.26:22-10.200.16.10:55288.service - OpenSSH per-connection server daemon (10.200.16.10:55288).
Mar 7 00:47:19.613243 sshd[2378]: Accepted publickey for core from 10.200.16.10 port 55288 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0
Mar 7 00:47:19.613980 sshd-session[2378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:47:19.617475 systemd-logind[1883]: New session 9 of user core.
Mar 7 00:47:19.625499 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 7 00:47:19.770504 sudo[2382]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 7 00:47:19.770716 sudo[2382]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:47:20.809609 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 7 00:47:20.811245 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:47:20.995447 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:21.009512 (kubelet)[2403]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:47:21.035464 kubelet[2403]: E0307 00:47:21.035408 2403 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:47:21.037564 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:47:21.037794 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:47:21.038325 systemd[1]: kubelet.service: Consumed 110ms CPU time, 106.5M memory peak.
Mar 7 00:47:21.998320 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 7 00:47:22.010482 (dockerd)[2413]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 7 00:47:23.014541 dockerd[2413]: time="2026-03-07T00:47:23.014483279Z" level=info msg="Starting up"
Mar 7 00:47:23.015099 dockerd[2413]: time="2026-03-07T00:47:23.015077006Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 7 00:47:23.025026 dockerd[2413]: time="2026-03-07T00:47:23.024956583Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 7 00:47:23.117815 dockerd[2413]: time="2026-03-07T00:47:23.117771985Z" level=info msg="Loading containers: start."
Mar 7 00:47:23.145207 kernel: Initializing XFRM netlink socket
Mar 7 00:47:23.444839 systemd-networkd[1488]: docker0: Link UP
Mar 7 00:47:23.459319 dockerd[2413]: time="2026-03-07T00:47:23.459272080Z" level=info msg="Loading containers: done."
Mar 7 00:47:23.469666 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2238498061-merged.mount: Deactivated successfully.
Mar 7 00:47:23.482515 dockerd[2413]: time="2026-03-07T00:47:23.482472112Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 7 00:47:23.482608 dockerd[2413]: time="2026-03-07T00:47:23.482574100Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 7 00:47:23.482682 dockerd[2413]: time="2026-03-07T00:47:23.482665744Z" level=info msg="Initializing buildkit"
Mar 7 00:47:23.532489 dockerd[2413]: time="2026-03-07T00:47:23.532445044Z" level=info msg="Completed buildkit initialization"
Mar 7 00:47:23.539030 dockerd[2413]: time="2026-03-07T00:47:23.538988771Z" level=info msg="Daemon has completed initialization"
Mar 7 00:47:23.539241 dockerd[2413]: time="2026-03-07T00:47:23.539045637Z" level=info msg="API listen on /run/docker.sock"
Mar 7 00:47:23.539622 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 7 00:47:23.928345 containerd[1905]: time="2026-03-07T00:47:23.928305192Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 7 00:47:24.798573 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount613776693.mount: Deactivated successfully.
Mar 7 00:47:25.901968 containerd[1905]: time="2026-03-07T00:47:25.901318118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:25.903828 containerd[1905]: time="2026-03-07T00:47:25.903796907Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583252"
Mar 7 00:47:25.907762 containerd[1905]: time="2026-03-07T00:47:25.907730723Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:25.913282 containerd[1905]: time="2026-03-07T00:47:25.913246754Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:25.913946 containerd[1905]: time="2026-03-07T00:47:25.913899769Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 1.985556535s"
Mar 7 00:47:25.913946 containerd[1905]: time="2026-03-07T00:47:25.913939506Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\""
Mar 7 00:47:25.914611 containerd[1905]: time="2026-03-07T00:47:25.914563896Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 7 00:47:26.994167 containerd[1905]: time="2026-03-07T00:47:26.993554060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:26.995629 containerd[1905]: time="2026-03-07T00:47:26.995583442Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139641"
Mar 7 00:47:26.998319 containerd[1905]: time="2026-03-07T00:47:26.998295304Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:27.002224 containerd[1905]: time="2026-03-07T00:47:27.002193487Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:27.003154 containerd[1905]: time="2026-03-07T00:47:27.002683912Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 1.088045277s"
Mar 7 00:47:27.003154 containerd[1905]: time="2026-03-07T00:47:27.002714433Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\""
Mar 7 00:47:27.003520 containerd[1905]: time="2026-03-07T00:47:27.003500812Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 7 00:47:27.878225 containerd[1905]: time="2026-03-07T00:47:27.878129884Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:27.880555 containerd[1905]: time="2026-03-07T00:47:27.880388658Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195544"
Mar 7 00:47:27.883723 containerd[1905]: time="2026-03-07T00:47:27.883693180Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:27.888550 containerd[1905]: time="2026-03-07T00:47:27.888515771Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:27.889100 containerd[1905]: time="2026-03-07T00:47:27.889027405Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 885.447055ms"
Mar 7 00:47:27.889100 containerd[1905]: time="2026-03-07T00:47:27.889057654Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\""
Mar 7 00:47:27.889611 containerd[1905]: time="2026-03-07T00:47:27.889591032Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 7 00:47:28.771102 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3051283046.mount: Deactivated successfully.
Mar 7 00:47:28.957112 containerd[1905]: time="2026-03-07T00:47:28.956635120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:28.960385 containerd[1905]: time="2026-03-07T00:47:28.960337098Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697088"
Mar 7 00:47:28.963334 containerd[1905]: time="2026-03-07T00:47:28.963281699Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:28.966710 containerd[1905]: time="2026-03-07T00:47:28.966661888Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:28.967146 containerd[1905]: time="2026-03-07T00:47:28.966923356Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 1.077307722s"
Mar 7 00:47:28.967146 containerd[1905]: time="2026-03-07T00:47:28.966955373Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\""
Mar 7 00:47:28.967516 containerd[1905]: time="2026-03-07T00:47:28.967490644Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 7 00:47:30.185115 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1759954251.mount: Deactivated successfully.
Mar 7 00:47:31.042746 containerd[1905]: time="2026-03-07T00:47:31.042688822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:31.044686 containerd[1905]: time="2026-03-07T00:47:31.044651484Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395406"
Mar 7 00:47:31.047175 containerd[1905]: time="2026-03-07T00:47:31.047127833Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:31.050780 containerd[1905]: time="2026-03-07T00:47:31.050730951Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:31.051565 containerd[1905]: time="2026-03-07T00:47:31.051393548Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 2.083873927s"
Mar 7 00:47:31.051565 containerd[1905]: time="2026-03-07T00:47:31.051424190Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Mar 7 00:47:31.051888 containerd[1905]: time="2026-03-07T00:47:31.051861049Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 7 00:47:31.055612 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 7 00:47:31.056952 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:47:31.169631 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:31.175628 (kubelet)[2755]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:47:31.265751 kubelet[2755]: E0307 00:47:31.265698 2755 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:47:31.267833 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:47:31.267947 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:47:31.269275 systemd[1]: kubelet.service: Consumed 114ms CPU time, 106.7M memory peak.
Mar 7 00:47:32.027798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4215875638.mount: Deactivated successfully.
Mar 7 00:47:32.045739 containerd[1905]: time="2026-03-07T00:47:32.045219241Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:32.049230 containerd[1905]: time="2026-03-07T00:47:32.049174775Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
Mar 7 00:47:32.051642 containerd[1905]: time="2026-03-07T00:47:32.051594665Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:32.055694 containerd[1905]: time="2026-03-07T00:47:32.055645059Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:32.056695 containerd[1905]: time="2026-03-07T00:47:32.056571868Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 1.004683434s"
Mar 7 00:47:32.056695 containerd[1905]: time="2026-03-07T00:47:32.056603613Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 7 00:47:32.057019 containerd[1905]: time="2026-03-07T00:47:32.056991190Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 7 00:47:32.078301 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Mar 7 00:47:32.588032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount959217345.mount: Deactivated successfully.
Mar 7 00:47:33.487918 containerd[1905]: time="2026-03-07T00:47:33.487250269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:33.489808 containerd[1905]: time="2026-03-07T00:47:33.489775182Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125515"
Mar 7 00:47:33.492348 containerd[1905]: time="2026-03-07T00:47:33.492320798Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:33.496403 containerd[1905]: time="2026-03-07T00:47:33.496365924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:47:33.496912 containerd[1905]: time="2026-03-07T00:47:33.496874741Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.439855814s"
Mar 7 00:47:33.496912 containerd[1905]: time="2026-03-07T00:47:33.496910183Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
Mar 7 00:47:33.572710 update_engine[1887]: I20260307 00:47:33.572638 1887 update_attempter.cc:509] Updating boot flags...
Mar 7 00:47:35.949503 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:35.950014 systemd[1]: kubelet.service: Consumed 114ms CPU time, 106.7M memory peak.
Mar 7 00:47:35.952099 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:47:35.977130 systemd[1]: Reload requested from client PID 2920 ('systemctl') (unit session-9.scope)...
Mar 7 00:47:35.977312 systemd[1]: Reloading...
Mar 7 00:47:36.083225 zram_generator::config[2979]: No configuration found.
Mar 7 00:47:36.231566 systemd[1]: Reloading finished in 253 ms.
Mar 7 00:47:36.278669 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 7 00:47:36.278739 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 7 00:47:36.278975 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:36.279016 systemd[1]: kubelet.service: Consumed 80ms CPU time, 94.9M memory peak.
Mar 7 00:47:36.280415 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:47:36.527345 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:47:36.538547 (kubelet)[3034]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 7 00:47:36.687232 kubelet[3034]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 7 00:47:36.687232 kubelet[3034]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 7 00:47:36.687232 kubelet[3034]: I0307 00:47:36.686625 3034 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 7 00:47:37.137222 kubelet[3034]: I0307 00:47:37.135406 3034 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 7 00:47:37.137222 kubelet[3034]: I0307 00:47:37.135437 3034 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 7 00:47:37.137222 kubelet[3034]: I0307 00:47:37.135458 3034 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 7 00:47:37.137222 kubelet[3034]: I0307 00:47:37.135463 3034 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 7 00:47:37.137222 kubelet[3034]: I0307 00:47:37.135754 3034 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 7 00:47:37.144945 kubelet[3034]: E0307 00:47:37.144911 3034 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.26:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 7 00:47:37.145721 kubelet[3034]: I0307 00:47:37.145702 3034 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 7 00:47:37.148836 kubelet[3034]: I0307 00:47:37.148786 3034 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 7 00:47:37.151572 kubelet[3034]: I0307 00:47:37.151548 3034 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 7 00:47:37.151903 kubelet[3034]: I0307 00:47:37.151876 3034 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 7 00:47:37.152083 kubelet[3034]: I0307 00:47:37.151962 3034 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.3-n-801efb9c04","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 7 00:47:37.152249 kubelet[3034]: I0307 00:47:37.152235 3034 topology_manager.go:138] "Creating topology manager with none policy"
Mar 7 00:47:37.152293 kubelet[3034]: I0307 00:47:37.152286 3034 container_manager_linux.go:306] "Creating device plugin manager"
Mar 7 00:47:37.152464 kubelet[3034]: I0307 00:47:37.152449 3034 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 7 00:47:37.157823 kubelet[3034]: I0307 00:47:37.157797 3034 state_mem.go:36] "Initialized new in-memory state store"
Mar 7 00:47:37.158987 kubelet[3034]: I0307 00:47:37.158966 3034 kubelet.go:475] "Attempting to sync node with API server"
Mar 7 00:47:37.159081 kubelet[3034]: I0307 00:47:37.159071 3034 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 7 00:47:37.159525 kubelet[3034]: E0307 00:47:37.159499 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.3-n-801efb9c04&limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 7 00:47:37.159720 kubelet[3034]: I0307 00:47:37.159707 3034 kubelet.go:387] "Adding apiserver pod source"
Mar 7 00:47:37.161255 kubelet[3034]: I0307 00:47:37.161234 3034 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 7 00:47:37.162179 kubelet[3034]: E0307 00:47:37.162159 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 7 00:47:37.162376 kubelet[3034]: I0307 00:47:37.162360 3034 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 7 00:47:37.162846 kubelet[3034]: I0307 00:47:37.162825 3034 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 7 00:47:37.162930 kubelet[3034]: I0307 00:47:37.162919 3034 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 7 00:47:37.163002 kubelet[3034]: W0307 00:47:37.162992 3034 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 7 00:47:37.166029 kubelet[3034]: I0307 00:47:37.166009 3034 server.go:1262] "Started kubelet"
Mar 7 00:47:37.167583 kubelet[3034]: I0307 00:47:37.167546 3034 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 7 00:47:37.168269 kubelet[3034]: I0307 00:47:37.168244 3034 server.go:310] "Adding debug handlers to kubelet server"
Mar 7 00:47:37.173044 kubelet[3034]: I0307 00:47:37.172231 3034 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 7 00:47:37.173044 kubelet[3034]: I0307 00:47:37.172318 3034 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 7 00:47:37.173044 kubelet[3034]: I0307 00:47:37.172549 3034 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 7 00:47:37.173781 kubelet[3034]: E0307 00:47:37.172892 3034 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.26:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.26:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.3-n-801efb9c04.189a68a40f660546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.3-n-801efb9c04,UID:ci-4459.2.3-n-801efb9c04,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.3-n-801efb9c04,},FirstTimestamp:2026-03-07 00:47:37.165980998 +0000 UTC m=+0.624835825,LastTimestamp:2026-03-07 00:47:37.165980998 +0000 UTC m=+0.624835825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.3-n-801efb9c04,}"
Mar 7 00:47:37.179967 kubelet[3034]: I0307 00:47:37.179921 3034 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 7 00:47:37.182126 kubelet[3034]: I0307 00:47:37.182093 3034 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 7 00:47:37.182245 kubelet[3034]: E0307 00:47:37.182204 3034 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-801efb9c04\" not found"
Mar 7 00:47:37.182461 kubelet[3034]: I0307 00:47:37.182440 3034 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 7 00:47:37.182530 kubelet[3034]: I0307 00:47:37.182519 3034 reconciler.go:29] "Reconciler: start to sync state"
Mar 7 00:47:37.182801 kubelet[3034]: I0307 00:47:37.181121 3034 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 7 00:47:37.184210 kubelet[3034]: E0307 00:47:37.183895 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 7 00:47:37.184210 kubelet[3034]: E0307 00:47:37.183988 3034 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-801efb9c04?timeout=10s\": dial tcp 10.200.20.26:6443: connect: connection refused" interval="200ms"
Mar 7 00:47:37.184529 kubelet[3034]: I0307 00:47:37.184492 3034 factory.go:223] Registration of the systemd container factory successfully
Mar 7 00:47:37.184597 kubelet[3034]: I0307 00:47:37.184579 3034 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 7 00:47:37.185764 kubelet[3034]: E0307 00:47:37.185743 3034 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 7 00:47:37.186292 kubelet[3034]: I0307 00:47:37.186275 3034 factory.go:223] Registration of the containerd container factory successfully
Mar 7 00:47:37.206147 kubelet[3034]: I0307 00:47:37.206120 3034 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 7 00:47:37.206147 kubelet[3034]: I0307 00:47:37.206137 3034 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 7 00:47:37.206147 kubelet[3034]: I0307 00:47:37.206157 3034 state_mem.go:36] "Initialized new in-memory state store"
Mar 7 00:47:37.212149 kubelet[3034]: I0307 00:47:37.212110 3034 policy_none.go:49] "None policy: Start"
Mar 7 00:47:37.212149 kubelet[3034]: I0307 00:47:37.212145 3034 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 7 00:47:37.212149 kubelet[3034]: I0307 00:47:37.212157 3034 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 7 00:47:37.216358 kubelet[3034]: I0307 00:47:37.216276 3034 policy_none.go:47] "Start"
Mar 7 00:47:37.217629 kubelet[3034]: I0307 00:47:37.217538 3034 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 7 00:47:37.220114 kubelet[3034]: I0307 00:47:37.220091 3034 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 7 00:47:37.220656 kubelet[3034]: I0307 00:47:37.220270 3034 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 7 00:47:37.220656 kubelet[3034]: I0307 00:47:37.220309 3034 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 7 00:47:37.220656 kubelet[3034]: E0307 00:47:37.220363 3034 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 7 00:47:37.222414 kubelet[3034]: E0307 00:47:37.222391 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 7 00:47:37.227403 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 7 00:47:37.239098 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 7 00:47:37.242359 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 7 00:47:37.249949 kubelet[3034]: E0307 00:47:37.249921 3034 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:47:37.250453 kubelet[3034]: I0307 00:47:37.250437 3034 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:47:37.250580 kubelet[3034]: I0307 00:47:37.250545 3034 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:47:37.251082 kubelet[3034]: I0307 00:47:37.250903 3034 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:47:37.252725 kubelet[3034]: E0307 00:47:37.252693 3034 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:47:37.252863 kubelet[3034]: E0307 00:47:37.252849 3034 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.3-n-801efb9c04\" not found" Mar 7 00:47:37.334264 systemd[1]: Created slice kubepods-burstable-pod73872d3cce42f8ca733d15fdd9d420b7.slice - libcontainer container kubepods-burstable-pod73872d3cce42f8ca733d15fdd9d420b7.slice. Mar 7 00:47:37.339867 kubelet[3034]: E0307 00:47:37.339824 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.344723 systemd[1]: Created slice kubepods-burstable-pod18710cf16308f1680635430d36cec99e.slice - libcontainer container kubepods-burstable-pod18710cf16308f1680635430d36cec99e.slice. 
Mar 7 00:47:37.346975 kubelet[3034]: E0307 00:47:37.346944 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.353607 kubelet[3034]: I0307 00:47:37.353580 3034 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.353988 kubelet[3034]: E0307 00:47:37.353958 3034 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.26:6443/api/v1/nodes\": dial tcp 10.200.20.26:6443: connect: connection refused" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.366678 systemd[1]: Created slice kubepods-burstable-pod5944364bc5de357ac89645c1cc475101.slice - libcontainer container kubepods-burstable-pod5944364bc5de357ac89645c1cc475101.slice. Mar 7 00:47:37.368269 kubelet[3034]: E0307 00:47:37.368240 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.384726 kubelet[3034]: E0307 00:47:37.384670 3034 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-801efb9c04?timeout=10s\": dial tcp 10.200.20.26:6443: connect: connection refused" interval="400ms" Mar 7 00:47:37.483632 kubelet[3034]: I0307 00:47:37.483356 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483632 kubelet[3034]: I0307 00:47:37.483400 3034 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483632 kubelet[3034]: I0307 00:47:37.483420 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483632 kubelet[3034]: I0307 00:47:37.483433 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483632 kubelet[3034]: I0307 00:47:37.483452 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483860 kubelet[3034]: I0307 00:47:37.483461 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " 
pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483860 kubelet[3034]: I0307 00:47:37.483474 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483860 kubelet[3034]: I0307 00:47:37.483489 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/18710cf16308f1680635430d36cec99e-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-801efb9c04\" (UID: \"18710cf16308f1680635430d36cec99e\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.483860 kubelet[3034]: I0307 00:47:37.483500 3034 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.556636 kubelet[3034]: I0307 00:47:37.556600 3034 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.557156 kubelet[3034]: E0307 00:47:37.557130 3034 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.26:6443/api/v1/nodes\": dial tcp 10.200.20.26:6443: connect: connection refused" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.645970 containerd[1905]: time="2026-03-07T00:47:37.645922909Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-801efb9c04,Uid:73872d3cce42f8ca733d15fdd9d420b7,Namespace:kube-system,Attempt:0,}" Mar 7 00:47:37.652403 containerd[1905]: time="2026-03-07T00:47:37.652354837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-801efb9c04,Uid:18710cf16308f1680635430d36cec99e,Namespace:kube-system,Attempt:0,}" Mar 7 00:47:37.674249 containerd[1905]: time="2026-03-07T00:47:37.674170615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-801efb9c04,Uid:5944364bc5de357ac89645c1cc475101,Namespace:kube-system,Attempt:0,}" Mar 7 00:47:37.785691 kubelet[3034]: E0307 00:47:37.785551 3034 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-801efb9c04?timeout=10s\": dial tcp 10.200.20.26:6443: connect: connection refused" interval="800ms" Mar 7 00:47:37.959493 kubelet[3034]: I0307 00:47:37.959463 3034 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:37.959826 kubelet[3034]: E0307 00:47:37.959798 3034 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.26:6443/api/v1/nodes\": dial tcp 10.200.20.26:6443: connect: connection refused" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:38.059778 kubelet[3034]: E0307 00:47:38.059627 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.3-n-801efb9c04&limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 00:47:38.075310 kubelet[3034]: E0307 00:47:38.075276 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://10.200.20.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 00:47:38.298380 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount45050464.mount: Deactivated successfully. Mar 7 00:47:38.328146 containerd[1905]: time="2026-03-07T00:47:38.327572199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:47:38.332689 containerd[1905]: time="2026-03-07T00:47:38.332655920Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Mar 7 00:47:38.338644 containerd[1905]: time="2026-03-07T00:47:38.338601473Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:47:38.343223 containerd[1905]: time="2026-03-07T00:47:38.343145834Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:47:38.345956 containerd[1905]: time="2026-03-07T00:47:38.345917553Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:47:38.348409 containerd[1905]: time="2026-03-07T00:47:38.348369978Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 7 00:47:38.350884 containerd[1905]: time="2026-03-07T00:47:38.350842739Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, 
bytes read=0" Mar 7 00:47:38.355262 containerd[1905]: time="2026-03-07T00:47:38.354261161Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:47:38.355262 containerd[1905]: time="2026-03-07T00:47:38.354681700Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 677.266169ms" Mar 7 00:47:38.355610 containerd[1905]: time="2026-03-07T00:47:38.355575029Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 699.119172ms" Mar 7 00:47:38.356222 containerd[1905]: time="2026-03-07T00:47:38.356177400Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 706.352912ms" Mar 7 00:47:38.426204 containerd[1905]: time="2026-03-07T00:47:38.426102965Z" level=info msg="connecting to shim 047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d" address="unix:///run/containerd/s/3aea1c7dc416fa7b69053c7c3fa8ad23850fe26c104ac9e1cbe338177e7d35ba" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:47:38.426868 containerd[1905]: time="2026-03-07T00:47:38.426833926Z" level=info 
msg="connecting to shim 34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6" address="unix:///run/containerd/s/84ff34d83b9d2c9f7bd8b9ecfb8099a4cbc465cfd0b4b5276f9592380a48efb4" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:47:38.440004 containerd[1905]: time="2026-03-07T00:47:38.439962233Z" level=info msg="connecting to shim cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449" address="unix:///run/containerd/s/83e626ff7f83dd4637368baa6c01419f9c5d738183ff684082109f6684568ac8" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:47:38.457370 systemd[1]: Started cri-containerd-047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d.scope - libcontainer container 047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d. Mar 7 00:47:38.461115 kubelet[3034]: E0307 00:47:38.460013 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 7 00:47:38.467306 systemd[1]: Started cri-containerd-34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6.scope - libcontainer container 34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6. Mar 7 00:47:38.471383 systemd[1]: Started cri-containerd-cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449.scope - libcontainer container cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449. 
Mar 7 00:47:38.523651 containerd[1905]: time="2026-03-07T00:47:38.523610708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-801efb9c04,Uid:18710cf16308f1680635430d36cec99e,Namespace:kube-system,Attempt:0,} returns sandbox id \"047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d\"" Mar 7 00:47:38.526334 containerd[1905]: time="2026-03-07T00:47:38.526272766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-801efb9c04,Uid:5944364bc5de357ac89645c1cc475101,Namespace:kube-system,Attempt:0,} returns sandbox id \"34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6\"" Mar 7 00:47:38.534733 containerd[1905]: time="2026-03-07T00:47:38.534695521Z" level=info msg="CreateContainer within sandbox \"047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 7 00:47:38.535815 containerd[1905]: time="2026-03-07T00:47:38.535722608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-801efb9c04,Uid:73872d3cce42f8ca733d15fdd9d420b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449\"" Mar 7 00:47:38.538858 containerd[1905]: time="2026-03-07T00:47:38.538799630Z" level=info msg="CreateContainer within sandbox \"34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 7 00:47:38.544233 containerd[1905]: time="2026-03-07T00:47:38.544180589Z" level=info msg="CreateContainer within sandbox \"cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 7 00:47:38.564041 containerd[1905]: time="2026-03-07T00:47:38.563988035Z" level=info msg="Container 1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b: CDI devices from CRI 
Config.CDIDevices: []" Mar 7 00:47:38.574219 containerd[1905]: time="2026-03-07T00:47:38.574001519Z" level=info msg="Container 29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:47:38.587287 containerd[1905]: time="2026-03-07T00:47:38.586567400Z" level=info msg="Container 7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:47:38.587710 kubelet[3034]: E0307 00:47:38.587065 3034 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-801efb9c04?timeout=10s\": dial tcp 10.200.20.26:6443: connect: connection refused" interval="1.6s" Mar 7 00:47:38.600989 containerd[1905]: time="2026-03-07T00:47:38.600941484Z" level=info msg="CreateContainer within sandbox \"34809015c8831cd01325cea22cc11d3f58059c6bf9f2041c4d23ee30f57404d6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b\"" Mar 7 00:47:38.601673 containerd[1905]: time="2026-03-07T00:47:38.601636428Z" level=info msg="StartContainer for \"1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b\"" Mar 7 00:47:38.602578 containerd[1905]: time="2026-03-07T00:47:38.602543870Z" level=info msg="connecting to shim 1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b" address="unix:///run/containerd/s/84ff34d83b9d2c9f7bd8b9ecfb8099a4cbc465cfd0b4b5276f9592380a48efb4" protocol=ttrpc version=3 Mar 7 00:47:38.616722 containerd[1905]: time="2026-03-07T00:47:38.616681223Z" level=info msg="CreateContainer within sandbox \"047e7a6af71d1a163a82cc5347e764d17ce3f6fe1d8b58c57c90ce86b428db8d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554\"" Mar 7 00:47:38.617823 containerd[1905]: 
time="2026-03-07T00:47:38.617781498Z" level=info msg="StartContainer for \"29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554\"" Mar 7 00:47:38.618063 containerd[1905]: time="2026-03-07T00:47:38.618030637Z" level=info msg="CreateContainer within sandbox \"cea0c598f6e57a463774f9913164b246130e5b11dc1b4e9a1c1af7398b9db449\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139\"" Mar 7 00:47:38.618447 systemd[1]: Started cri-containerd-1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b.scope - libcontainer container 1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b. Mar 7 00:47:38.618730 containerd[1905]: time="2026-03-07T00:47:38.618667483Z" level=info msg="StartContainer for \"7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139\"" Mar 7 00:47:38.619703 containerd[1905]: time="2026-03-07T00:47:38.619463223Z" level=info msg="connecting to shim 7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139" address="unix:///run/containerd/s/83e626ff7f83dd4637368baa6c01419f9c5d738183ff684082109f6684568ac8" protocol=ttrpc version=3 Mar 7 00:47:38.619827 containerd[1905]: time="2026-03-07T00:47:38.619805495Z" level=info msg="connecting to shim 29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554" address="unix:///run/containerd/s/3aea1c7dc416fa7b69053c7c3fa8ad23850fe26c104ac9e1cbe338177e7d35ba" protocol=ttrpc version=3 Mar 7 00:47:38.640349 systemd[1]: Started cri-containerd-29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554.scope - libcontainer container 29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554. Mar 7 00:47:38.645075 systemd[1]: Started cri-containerd-7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139.scope - libcontainer container 7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139. 
Mar 7 00:47:38.672847 containerd[1905]: time="2026-03-07T00:47:38.672313187Z" level=info msg="StartContainer for \"1f46f5458388f86d241299462b80ccec2ecb63dabc0050c5737b1b91f8fe430b\" returns successfully" Mar 7 00:47:38.683707 kubelet[3034]: E0307 00:47:38.683668 3034 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.26:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 7 00:47:38.712656 containerd[1905]: time="2026-03-07T00:47:38.712548363Z" level=info msg="StartContainer for \"29303611851be347ad38115797ef3b18f4f67d437c9132272890e1ed6a265554\" returns successfully" Mar 7 00:47:38.726967 containerd[1905]: time="2026-03-07T00:47:38.726923736Z" level=info msg="StartContainer for \"7c3b1af9c11f22fbb1b452356230caecdbce933f2de34f9db452b6116cc21139\" returns successfully" Mar 7 00:47:38.763589 kubelet[3034]: I0307 00:47:38.763214 3034 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:39.231545 kubelet[3034]: E0307 00:47:39.231485 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:39.233803 kubelet[3034]: E0307 00:47:39.233781 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:39.237341 kubelet[3034]: E0307 00:47:39.237207 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.240799 kubelet[3034]: E0307 00:47:40.240611 3034 kubelet.go:3216] "No need to 
create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.240799 kubelet[3034]: E0307 00:47:40.240692 3034 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.536865 kubelet[3034]: E0307 00:47:40.536720 3034 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.2.3-n-801efb9c04\" not found" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.638673 kubelet[3034]: I0307 00:47:40.638462 3034 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.638673 kubelet[3034]: E0307 00:47:40.638506 3034 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459.2.3-n-801efb9c04\": node \"ci-4459.2.3-n-801efb9c04\" not found" Mar 7 00:47:40.656442 kubelet[3034]: E0307 00:47:40.656014 3034 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-801efb9c04\" not found" Mar 7 00:47:40.756710 kubelet[3034]: E0307 00:47:40.756666 3034 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-801efb9c04\" not found" Mar 7 00:47:40.883258 kubelet[3034]: I0307 00:47:40.882773 3034 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.889604 kubelet[3034]: E0307 00:47:40.889270 3034 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.889604 kubelet[3034]: I0307 00:47:40.889303 3034 kubelet.go:3220] "Creating a mirror pod for 
static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.891056 kubelet[3034]: E0307 00:47:40.891032 3034 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-801efb9c04\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.891346 kubelet[3034]: I0307 00:47:40.891159 3034 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:40.892582 kubelet[3034]: E0307 00:47:40.892552 3034 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:41.165478 kubelet[3034]: I0307 00:47:41.165335 3034 apiserver.go:52] "Watching apiserver" Mar 7 00:47:41.183070 kubelet[3034]: I0307 00:47:41.183024 3034 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 00:47:42.036417 kubelet[3034]: I0307 00:47:42.036330 3034 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:42.044352 kubelet[3034]: I0307 00:47:42.044087 3034 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 00:47:42.600029 systemd[1]: Reload requested from client PID 3323 ('systemctl') (unit session-9.scope)... Mar 7 00:47:42.600045 systemd[1]: Reloading... Mar 7 00:47:42.674276 zram_generator::config[3367]: No configuration found. Mar 7 00:47:42.843113 systemd[1]: Reloading finished in 242 ms. Mar 7 00:47:42.871657 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:47:42.884759 systemd[1]: kubelet.service: Deactivated successfully. 
Mar 7 00:47:42.885174 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:47:42.885254 systemd[1]: kubelet.service: Consumed 804ms CPU time, 121.2M memory peak. Mar 7 00:47:42.887873 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:47:43.001968 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:47:43.008578 (kubelet)[3434]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:47:43.046474 kubelet[3434]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 00:47:43.046474 kubelet[3434]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 00:47:43.046899 kubelet[3434]: I0307 00:47:43.046546 3434 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 00:47:43.052713 kubelet[3434]: I0307 00:47:43.052621 3434 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 7 00:47:43.052713 kubelet[3434]: I0307 00:47:43.052653 3434 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:47:43.052713 kubelet[3434]: I0307 00:47:43.052677 3434 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 00:47:43.052713 kubelet[3434]: I0307 00:47:43.052682 3434 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 7 00:47:43.053486 kubelet[3434]: I0307 00:47:43.053457 3434 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 00:47:43.054659 kubelet[3434]: I0307 00:47:43.054636 3434 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 7 00:47:43.056759 kubelet[3434]: I0307 00:47:43.056561 3434 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:47:43.060321 kubelet[3434]: I0307 00:47:43.060299 3434 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 7 00:47:43.063322 kubelet[3434]: I0307 00:47:43.063292 3434 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Mar 7 00:47:43.064002 kubelet[3434]: I0307 00:47:43.063646 3434 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:47:43.064002 kubelet[3434]: I0307 00:47:43.063674 3434 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459.2.3-n-801efb9c04","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 00:47:43.064002 kubelet[3434]: I0307 00:47:43.063811 3434 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 00:47:43.064002 kubelet[3434]: I0307 00:47:43.063817 3434 container_manager_linux.go:306] "Creating device plugin manager" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.063842 3434 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.064025 3434 
state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.064150 3434 kubelet.go:475] "Attempting to sync node with API server" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.064161 3434 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.064270 3434 kubelet.go:387] "Adding apiserver pod source" Mar 7 00:47:43.064402 kubelet[3434]: I0307 00:47:43.064283 3434 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:47:43.067655 kubelet[3434]: I0307 00:47:43.067048 3434 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 7 00:47:43.069656 kubelet[3434]: I0307 00:47:43.069150 3434 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:47:43.069656 kubelet[3434]: I0307 00:47:43.069349 3434 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 00:47:43.074542 kubelet[3434]: I0307 00:47:43.074519 3434 server.go:1262] "Started kubelet" Mar 7 00:47:43.077748 kubelet[3434]: I0307 00:47:43.077724 3434 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 00:47:43.080370 kubelet[3434]: I0307 00:47:43.080271 3434 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 00:47:43.081630 kubelet[3434]: I0307 00:47:43.081603 3434 server.go:310] "Adding debug handlers to kubelet server" Mar 7 00:47:43.086548 kubelet[3434]: I0307 00:47:43.086445 3434 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:47:43.086548 kubelet[3434]: I0307 00:47:43.086526 3434 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 00:47:43.086783 kubelet[3434]: I0307 
00:47:43.086689 3434 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:47:43.087847 kubelet[3434]: I0307 00:47:43.087822 3434 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:47:43.089044 kubelet[3434]: I0307 00:47:43.089024 3434 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 7 00:47:43.090936 kubelet[3434]: I0307 00:47:43.090911 3434 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:47:43.091109 kubelet[3434]: I0307 00:47:43.091003 3434 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:47:43.093917 kubelet[3434]: I0307 00:47:43.093866 3434 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 00:47:43.094157 kubelet[3434]: I0307 00:47:43.094134 3434 reconciler.go:29] "Reconciler: start to sync state" Mar 7 00:47:43.094272 kubelet[3434]: I0307 00:47:43.094203 3434 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:47:43.095525 kubelet[3434]: E0307 00:47:43.095501 3434 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:47:43.114874 kubelet[3434]: I0307 00:47:43.114840 3434 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 7 00:47:43.116960 kubelet[3434]: I0307 00:47:43.116804 3434 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 7 00:47:43.116960 kubelet[3434]: I0307 00:47:43.116826 3434 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 7 00:47:43.116960 kubelet[3434]: I0307 00:47:43.116886 3434 kubelet.go:2428] "Starting kubelet main sync loop" Mar 7 00:47:43.116960 kubelet[3434]: E0307 00:47:43.116928 3434 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153165 3434 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153214 3434 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153242 3434 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153386 3434 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153395 3434 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153411 3434 policy_none.go:49] "None policy: Start" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153419 3434 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153426 3434 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153515 3434 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 7 00:47:43.153917 kubelet[3434]: I0307 00:47:43.153523 3434 policy_none.go:47] "Start" Mar 7 00:47:43.159089 kubelet[3434]: E0307 00:47:43.159063 3434 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:47:43.159623 kubelet[3434]: I0307 00:47:43.159604 3434 
eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:47:43.159765 kubelet[3434]: I0307 00:47:43.159734 3434 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:47:43.160091 kubelet[3434]: I0307 00:47:43.160072 3434 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:47:43.162424 kubelet[3434]: E0307 00:47:43.162362 3434 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:47:43.218872 kubelet[3434]: I0307 00:47:43.218517 3434 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.218872 kubelet[3434]: I0307 00:47:43.218551 3434 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.218872 kubelet[3434]: I0307 00:47:43.218778 3434 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.227142 kubelet[3434]: I0307 00:47:43.227110 3434 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 00:47:43.231191 kubelet[3434]: I0307 00:47:43.231146 3434 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 00:47:43.232359 kubelet[3434]: I0307 00:47:43.232329 3434 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 00:47:43.232582 kubelet[3434]: E0307 00:47:43.232523 3434 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-801efb9c04\" 
already exists" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.269795 kubelet[3434]: I0307 00:47:43.269321 3434 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.282429 kubelet[3434]: I0307 00:47:43.282384 3434 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.282581 kubelet[3434]: I0307 00:47:43.282498 3434 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296124 kubelet[3434]: I0307 00:47:43.296011 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296124 kubelet[3434]: I0307 00:47:43.296071 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296124 kubelet[3434]: I0307 00:47:43.296086 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296124 kubelet[3434]: I0307 00:47:43.296099 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" 
(UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296943 kubelet[3434]: I0307 00:47:43.296659 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296943 kubelet[3434]: I0307 00:47:43.296700 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/18710cf16308f1680635430d36cec99e-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-801efb9c04\" (UID: \"18710cf16308f1680635430d36cec99e\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296943 kubelet[3434]: I0307 00:47:43.296766 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5944364bc5de357ac89645c1cc475101-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" (UID: \"5944364bc5de357ac89645c1cc475101\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296943 kubelet[3434]: I0307 00:47:43.296794 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:43.296943 
kubelet[3434]: I0307 00:47:43.296811 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73872d3cce42f8ca733d15fdd9d420b7-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-801efb9c04\" (UID: \"73872d3cce42f8ca733d15fdd9d420b7\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:44.066903 kubelet[3434]: I0307 00:47:44.066609 3434 apiserver.go:52] "Watching apiserver" Mar 7 00:47:46.811626 kubelet[3434]: I0307 00:47:44.094628 3434 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 00:47:46.811626 kubelet[3434]: I0307 00:47:44.136418 3434 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:46.811626 kubelet[3434]: I0307 00:47:44.147371 3434 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 7 00:47:46.811626 kubelet[3434]: E0307 00:47:44.147423 3434 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-801efb9c04\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" Mar 7 00:47:46.811626 kubelet[3434]: I0307 00:47:44.180480 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.3-n-801efb9c04" podStartSLOduration=1.180462145 podStartE2EDuration="1.180462145s" podCreationTimestamp="2026-03-07 00:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:47:44.165169269 +0000 UTC m=+1.152481481" watchObservedRunningTime="2026-03-07 00:47:44.180462145 +0000 UTC m=+1.167774301" Mar 7 00:47:46.811626 kubelet[3434]: I0307 00:47:44.190770 3434 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-801efb9c04" podStartSLOduration=1.190754352 podStartE2EDuration="1.190754352s" podCreationTimestamp="2026-03-07 00:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:47:44.180934491 +0000 UTC m=+1.168246655" watchObservedRunningTime="2026-03-07 00:47:44.190754352 +0000 UTC m=+1.178066500" Mar 7 00:47:46.812355 kubelet[3434]: I0307 00:47:44.210536 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.3-n-801efb9c04" podStartSLOduration=2.210521624 podStartE2EDuration="2.210521624s" podCreationTimestamp="2026-03-07 00:47:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:47:44.191178216 +0000 UTC m=+1.178490372" watchObservedRunningTime="2026-03-07 00:47:44.210521624 +0000 UTC m=+1.197833788" Mar 7 00:47:48.747608 kubelet[3434]: I0307 00:47:48.747569 3434 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 00:47:48.748233 containerd[1905]: time="2026-03-07T00:47:48.747934737Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 00:47:48.748630 kubelet[3434]: I0307 00:47:48.748298 3434 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 00:47:49.937266 systemd[1]: Created slice kubepods-besteffort-podea556f37_151f_416b_a1cc_575542da5268.slice - libcontainer container kubepods-besteffort-podea556f37_151f_416b_a1cc_575542da5268.slice. 
Mar 7 00:47:50.040326 kubelet[3434]: I0307 00:47:50.040222 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ea556f37-151f-416b-a1cc-575542da5268-kube-proxy\") pod \"kube-proxy-7q22d\" (UID: \"ea556f37-151f-416b-a1cc-575542da5268\") " pod="kube-system/kube-proxy-7q22d" Mar 7 00:47:50.040326 kubelet[3434]: I0307 00:47:50.040264 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ea556f37-151f-416b-a1cc-575542da5268-xtables-lock\") pod \"kube-proxy-7q22d\" (UID: \"ea556f37-151f-416b-a1cc-575542da5268\") " pod="kube-system/kube-proxy-7q22d" Mar 7 00:47:50.040326 kubelet[3434]: I0307 00:47:50.040277 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea556f37-151f-416b-a1cc-575542da5268-lib-modules\") pod \"kube-proxy-7q22d\" (UID: \"ea556f37-151f-416b-a1cc-575542da5268\") " pod="kube-system/kube-proxy-7q22d" Mar 7 00:47:50.040326 kubelet[3434]: I0307 00:47:50.040293 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rss\" (UniqueName: \"kubernetes.io/projected/ea556f37-151f-416b-a1cc-575542da5268-kube-api-access-29rss\") pod \"kube-proxy-7q22d\" (UID: \"ea556f37-151f-416b-a1cc-575542da5268\") " pod="kube-system/kube-proxy-7q22d" Mar 7 00:47:50.098537 systemd[1]: Created slice kubepods-besteffort-pod63f2b8e3_a0e7_497e_b459_be561bd0918c.slice - libcontainer container kubepods-besteffort-pod63f2b8e3_a0e7_497e_b459_be561bd0918c.slice. 
Mar 7 00:47:50.141238 kubelet[3434]: I0307 00:47:50.140625 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/63f2b8e3-a0e7-497e-b459-be561bd0918c-var-lib-calico\") pod \"tigera-operator-5588576f44-vp9sv\" (UID: \"63f2b8e3-a0e7-497e-b459-be561bd0918c\") " pod="tigera-operator/tigera-operator-5588576f44-vp9sv" Mar 7 00:47:50.141238 kubelet[3434]: I0307 00:47:50.140666 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cgbk\" (UniqueName: \"kubernetes.io/projected/63f2b8e3-a0e7-497e-b459-be561bd0918c-kube-api-access-2cgbk\") pod \"tigera-operator-5588576f44-vp9sv\" (UID: \"63f2b8e3-a0e7-497e-b459-be561bd0918c\") " pod="tigera-operator/tigera-operator-5588576f44-vp9sv" Mar 7 00:47:50.261506 containerd[1905]: time="2026-03-07T00:47:50.261457401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7q22d,Uid:ea556f37-151f-416b-a1cc-575542da5268,Namespace:kube-system,Attempt:0,}" Mar 7 00:47:50.453023 containerd[1905]: time="2026-03-07T00:47:50.452846645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-vp9sv,Uid:63f2b8e3-a0e7-497e-b459-be561bd0918c,Namespace:tigera-operator,Attempt:0,}" Mar 7 00:47:50.672268 containerd[1905]: time="2026-03-07T00:47:50.672148283Z" level=info msg="connecting to shim 96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40" address="unix:///run/containerd/s/e15a154e298f725024814065a6df8ce698d3b84d0d60ef00f699fd5a78e3dae9" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:47:50.693330 systemd[1]: Started cri-containerd-96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40.scope - libcontainer container 96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40. 
Mar 7 00:47:50.756031 containerd[1905]: time="2026-03-07T00:47:50.755984891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7q22d,Uid:ea556f37-151f-416b-a1cc-575542da5268,Namespace:kube-system,Attempt:0,} returns sandbox id \"96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40\"" Mar 7 00:47:51.118764 containerd[1905]: time="2026-03-07T00:47:51.118649037Z" level=info msg="CreateContainer within sandbox \"96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 00:47:51.222651 containerd[1905]: time="2026-03-07T00:47:51.222568617Z" level=info msg="connecting to shim a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2" address="unix:///run/containerd/s/b9b63d61315beb3eba025733e75a5ae70419d045a3e681e3e6ff4b4df347366d" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:47:51.245354 systemd[1]: Started cri-containerd-a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2.scope - libcontainer container a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2. Mar 7 00:47:51.417779 containerd[1905]: time="2026-03-07T00:47:51.417653471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-vp9sv,Uid:63f2b8e3-a0e7-497e-b459-be561bd0918c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2\"" Mar 7 00:47:51.420493 containerd[1905]: time="2026-03-07T00:47:51.420220311Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 00:47:51.472410 containerd[1905]: time="2026-03-07T00:47:51.471902823Z" level=info msg="Container 54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:47:51.474507 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount441529985.mount: Deactivated successfully. 
Mar 7 00:47:51.908013 containerd[1905]: time="2026-03-07T00:47:51.907961084Z" level=info msg="CreateContainer within sandbox \"96f12bcf4305188bbac50b055eab19fdc9c9ca807c2801c3f2fb2d76380c1d40\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907\"" Mar 7 00:47:51.909050 containerd[1905]: time="2026-03-07T00:47:51.909008163Z" level=info msg="StartContainer for \"54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907\"" Mar 7 00:47:51.955741 containerd[1905]: time="2026-03-07T00:47:51.955670439Z" level=info msg="connecting to shim 54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907" address="unix:///run/containerd/s/e15a154e298f725024814065a6df8ce698d3b84d0d60ef00f699fd5a78e3dae9" protocol=ttrpc version=3 Mar 7 00:47:51.979377 systemd[1]: Started cri-containerd-54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907.scope - libcontainer container 54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907. Mar 7 00:47:52.034367 containerd[1905]: time="2026-03-07T00:47:52.034312669Z" level=info msg="StartContainer for \"54badecc42345361f8565cb8932105f08bc501aae5c12ef17602d07f12796907\" returns successfully" Mar 7 00:47:54.675564 kubelet[3434]: I0307 00:47:54.675495 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7q22d" podStartSLOduration=5.675476795 podStartE2EDuration="5.675476795s" podCreationTimestamp="2026-03-07 00:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:47:52.166708544 +0000 UTC m=+9.154020692" watchObservedRunningTime="2026-03-07 00:47:54.675476795 +0000 UTC m=+11.662788943" Mar 7 00:47:55.071374 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount91944165.mount: Deactivated successfully. 
Mar 7 00:47:56.004237 containerd[1905]: time="2026-03-07T00:47:56.004161917Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:47:56.068366 containerd[1905]: time="2026-03-07T00:47:56.068157764Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 7 00:47:56.114048 containerd[1905]: time="2026-03-07T00:47:56.113977715Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:47:56.117440 containerd[1905]: time="2026-03-07T00:47:56.117388157Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:47:56.117924 containerd[1905]: time="2026-03-07T00:47:56.117881897Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 4.697627288s" Mar 7 00:47:56.117924 containerd[1905]: time="2026-03-07T00:47:56.117914498Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 7 00:47:56.164490 containerd[1905]: time="2026-03-07T00:47:56.164453398Z" level=info msg="CreateContainer within sandbox \"a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 00:47:56.611433 containerd[1905]: time="2026-03-07T00:47:56.611265172Z" level=info msg="Container 
474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:47:56.729567 containerd[1905]: time="2026-03-07T00:47:56.729342000Z" level=info msg="CreateContainer within sandbox \"a2c44450ab47335b53fcb98d28978545e2b307b21b9793b7060bc342ff357de2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8\"" Mar 7 00:47:56.731780 containerd[1905]: time="2026-03-07T00:47:56.730450781Z" level=info msg="StartContainer for \"474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8\"" Mar 7 00:47:56.731780 containerd[1905]: time="2026-03-07T00:47:56.731709488Z" level=info msg="connecting to shim 474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8" address="unix:///run/containerd/s/b9b63d61315beb3eba025733e75a5ae70419d045a3e681e3e6ff4b4df347366d" protocol=ttrpc version=3 Mar 7 00:47:56.752345 systemd[1]: Started cri-containerd-474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8.scope - libcontainer container 474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8. Mar 7 00:47:56.780258 containerd[1905]: time="2026-03-07T00:47:56.780223340Z" level=info msg="StartContainer for \"474c6ff2a8396ec043671cd6090fffd3c01e2e31d1a57afa7af2f9d02eb5a0d8\" returns successfully" Mar 7 00:48:03.244200 sudo[2382]: pam_unix(sudo:session): session closed for user root Mar 7 00:48:03.324359 sshd[2381]: Connection closed by 10.200.16.10 port 55288 Mar 7 00:48:03.323778 sshd-session[2378]: pam_unix(sshd:session): session closed for user core Mar 7 00:48:03.327808 systemd[1]: sshd@6-10.200.20.26:22-10.200.16.10:55288.service: Deactivated successfully. Mar 7 00:48:03.330000 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 00:48:03.331469 systemd[1]: session-9.scope: Consumed 3.607s CPU time, 221.7M memory peak. Mar 7 00:48:03.334620 systemd-logind[1883]: Session 9 logged out. 
Waiting for processes to exit. Mar 7 00:48:03.337550 systemd-logind[1883]: Removed session 9. Mar 7 00:48:08.374667 kubelet[3434]: I0307 00:48:08.374593 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-vp9sv" podStartSLOduration=14.675526317 podStartE2EDuration="19.374578125s" podCreationTimestamp="2026-03-07 00:47:49 +0000 UTC" firstStartedPulling="2026-03-07 00:47:51.419709468 +0000 UTC m=+8.407021616" lastFinishedPulling="2026-03-07 00:47:56.118761276 +0000 UTC m=+13.106073424" observedRunningTime="2026-03-07 00:47:57.177738552 +0000 UTC m=+14.165050700" watchObservedRunningTime="2026-03-07 00:48:08.374578125 +0000 UTC m=+25.361890273" Mar 7 00:48:08.388012 systemd[1]: Created slice kubepods-besteffort-podbc16043b_4c4d_4aa2_9249_912765338bf3.slice - libcontainer container kubepods-besteffort-podbc16043b_4c4d_4aa2_9249_912765338bf3.slice. Mar 7 00:48:08.449993 kubelet[3434]: I0307 00:48:08.449943 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc16043b-4c4d-4aa2-9249-912765338bf3-tigera-ca-bundle\") pod \"calico-typha-858cf75778-lqsxp\" (UID: \"bc16043b-4c4d-4aa2-9249-912765338bf3\") " pod="calico-system/calico-typha-858cf75778-lqsxp" Mar 7 00:48:08.449993 kubelet[3434]: I0307 00:48:08.449987 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bc16043b-4c4d-4aa2-9249-912765338bf3-typha-certs\") pod \"calico-typha-858cf75778-lqsxp\" (UID: \"bc16043b-4c4d-4aa2-9249-912765338bf3\") " pod="calico-system/calico-typha-858cf75778-lqsxp" Mar 7 00:48:08.449993 kubelet[3434]: I0307 00:48:08.450001 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5gc\" (UniqueName: 
\"kubernetes.io/projected/bc16043b-4c4d-4aa2-9249-912765338bf3-kube-api-access-4t5gc\") pod \"calico-typha-858cf75778-lqsxp\" (UID: \"bc16043b-4c4d-4aa2-9249-912765338bf3\") " pod="calico-system/calico-typha-858cf75778-lqsxp" Mar 7 00:48:08.687896 systemd[1]: Created slice kubepods-besteffort-pod462592d5_307f_4ca7_bf24_366a81b67901.slice - libcontainer container kubepods-besteffort-pod462592d5_307f_4ca7_bf24_366a81b67901.slice. Mar 7 00:48:08.696935 containerd[1905]: time="2026-03-07T00:48:08.696858544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-858cf75778-lqsxp,Uid:bc16043b-4c4d-4aa2-9249-912765338bf3,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:08.742021 containerd[1905]: time="2026-03-07T00:48:08.741975132Z" level=info msg="connecting to shim 952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3" address="unix:///run/containerd/s/4f180d267798e0ee892dc1459f83432d34dd1edbcc463cc21213d9ef10fc2b49" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:08.753274 kubelet[3434]: I0307 00:48:08.753242 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-cni-net-dir\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753394 kubelet[3434]: I0307 00:48:08.753290 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-policysync\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753394 kubelet[3434]: I0307 00:48:08.753302 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-sys-fs\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753394 kubelet[3434]: I0307 00:48:08.753326 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/462592d5-307f-4ca7-bf24-366a81b67901-node-certs\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753394 kubelet[3434]: I0307 00:48:08.753335 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-var-run-calico\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753394 kubelet[3434]: I0307 00:48:08.753344 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-xtables-lock\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753483 kubelet[3434]: I0307 00:48:08.753354 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-cni-bin-dir\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753483 kubelet[3434]: I0307 00:48:08.753364 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-nodeproc\") pod \"calico-node-szw7z\" 
(UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753483 kubelet[3434]: I0307 00:48:08.753378 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-var-lib-calico\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753483 kubelet[3434]: I0307 00:48:08.753393 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462592d5-307f-4ca7-bf24-366a81b67901-tigera-ca-bundle\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753483 kubelet[3434]: I0307 00:48:08.753406 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-flexvol-driver-host\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753557 kubelet[3434]: I0307 00:48:08.753417 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wrx\" (UniqueName: \"kubernetes.io/projected/462592d5-307f-4ca7-bf24-366a81b67901-kube-api-access-t5wrx\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753557 kubelet[3434]: I0307 00:48:08.753431 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-cni-log-dir\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") 
" pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753557 kubelet[3434]: I0307 00:48:08.753440 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-bpffs\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.753557 kubelet[3434]: I0307 00:48:08.753447 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/462592d5-307f-4ca7-bf24-366a81b67901-lib-modules\") pod \"calico-node-szw7z\" (UID: \"462592d5-307f-4ca7-bf24-366a81b67901\") " pod="calico-system/calico-node-szw7z" Mar 7 00:48:08.768548 systemd[1]: Started cri-containerd-952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3.scope - libcontainer container 952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3. 
Mar 7 00:48:08.809108 containerd[1905]: time="2026-03-07T00:48:08.809067445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-858cf75778-lqsxp,Uid:bc16043b-4c4d-4aa2-9249-912765338bf3,Namespace:calico-system,Attempt:0,} returns sandbox id \"952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3\"" Mar 7 00:48:08.810745 containerd[1905]: time="2026-03-07T00:48:08.810719349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 7 00:48:08.821679 kubelet[3434]: E0307 00:48:08.821639 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:08.853740 kubelet[3434]: I0307 00:48:08.853700 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d4fe93f-ae81-4273-9c62-18e31d256fca-socket-dir\") pod \"csi-node-driver-s56zd\" (UID: \"2d4fe93f-ae81-4273-9c62-18e31d256fca\") " pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:08.853740 kubelet[3434]: I0307 00:48:08.853745 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d4fe93f-ae81-4273-9c62-18e31d256fca-registration-dir\") pod \"csi-node-driver-s56zd\" (UID: \"2d4fe93f-ae81-4273-9c62-18e31d256fca\") " pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:08.853895 kubelet[3434]: I0307 00:48:08.853778 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d4fe93f-ae81-4273-9c62-18e31d256fca-kubelet-dir\") pod \"csi-node-driver-s56zd\" (UID: \"2d4fe93f-ae81-4273-9c62-18e31d256fca\") " 
pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:08.853895 kubelet[3434]: I0307 00:48:08.853797 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2d4fe93f-ae81-4273-9c62-18e31d256fca-varrun\") pod \"csi-node-driver-s56zd\" (UID: \"2d4fe93f-ae81-4273-9c62-18e31d256fca\") " pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:08.853895 kubelet[3434]: I0307 00:48:08.853806 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnzw\" (UniqueName: \"kubernetes.io/projected/2d4fe93f-ae81-4273-9c62-18e31d256fca-kube-api-access-jrnzw\") pod \"csi-node-driver-s56zd\" (UID: \"2d4fe93f-ae81-4273-9c62-18e31d256fca\") " pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:08.858958 kubelet[3434]: E0307 00:48:08.858904 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.858958 kubelet[3434]: W0307 00:48:08.858927 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.858958 kubelet[3434]: E0307 00:48:08.858944 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.859511 kubelet[3434]: E0307 00:48:08.859446 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.859511 kubelet[3434]: W0307 00:48:08.859460 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.859511 kubelet[3434]: E0307 00:48:08.859471 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.860239 kubelet[3434]: E0307 00:48:08.860089 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.860239 kubelet[3434]: W0307 00:48:08.860103 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.860239 kubelet[3434]: E0307 00:48:08.860115 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.860662 kubelet[3434]: E0307 00:48:08.860650 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.860737 kubelet[3434]: W0307 00:48:08.860726 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.860791 kubelet[3434]: E0307 00:48:08.860781 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.861403 kubelet[3434]: E0307 00:48:08.861330 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.861403 kubelet[3434]: W0307 00:48:08.861341 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.861403 kubelet[3434]: E0307 00:48:08.861352 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.861657 kubelet[3434]: E0307 00:48:08.861646 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.861781 kubelet[3434]: W0307 00:48:08.861714 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.861781 kubelet[3434]: E0307 00:48:08.861729 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.862022 kubelet[3434]: E0307 00:48:08.861961 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.862022 kubelet[3434]: W0307 00:48:08.861972 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.862022 kubelet[3434]: E0307 00:48:08.861980 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.862310 kubelet[3434]: E0307 00:48:08.862248 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.862310 kubelet[3434]: W0307 00:48:08.862259 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.862310 kubelet[3434]: E0307 00:48:08.862269 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.862582 kubelet[3434]: E0307 00:48:08.862533 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.862582 kubelet[3434]: W0307 00:48:08.862544 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.862582 kubelet[3434]: E0307 00:48:08.862554 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.862896 kubelet[3434]: E0307 00:48:08.862770 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.862896 kubelet[3434]: W0307 00:48:08.862780 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.862896 kubelet[3434]: E0307 00:48:08.862788 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.863156 kubelet[3434]: E0307 00:48:08.863145 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.863257 kubelet[3434]: W0307 00:48:08.863245 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.863319 kubelet[3434]: E0307 00:48:08.863307 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.939242 kubelet[3434]: E0307 00:48:08.938654 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.939242 kubelet[3434]: W0307 00:48:08.938677 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.939242 kubelet[3434]: E0307 00:48:08.938695 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.954940 kubelet[3434]: E0307 00:48:08.954836 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.954940 kubelet[3434]: W0307 00:48:08.954853 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.954940 kubelet[3434]: E0307 00:48:08.954871 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.955541 kubelet[3434]: E0307 00:48:08.955435 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.955541 kubelet[3434]: W0307 00:48:08.955449 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.956140 kubelet[3434]: E0307 00:48:08.956035 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.956508 kubelet[3434]: E0307 00:48:08.956469 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.956508 kubelet[3434]: W0307 00:48:08.956482 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.956508 kubelet[3434]: E0307 00:48:08.956494 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.957180 kubelet[3434]: E0307 00:48:08.957166 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.957180 kubelet[3434]: W0307 00:48:08.957217 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.957180 kubelet[3434]: E0307 00:48:08.957231 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.957911 kubelet[3434]: E0307 00:48:08.957888 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.957973 kubelet[3434]: W0307 00:48:08.957903 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958017 kubelet[3434]: E0307 00:48:08.957971 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.958199 kubelet[3434]: E0307 00:48:08.958174 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.958282 kubelet[3434]: W0307 00:48:08.958206 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958282 kubelet[3434]: E0307 00:48:08.958217 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.958364 kubelet[3434]: E0307 00:48:08.958353 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.958364 kubelet[3434]: W0307 00:48:08.958361 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958407 kubelet[3434]: E0307 00:48:08.958368 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.958579 kubelet[3434]: E0307 00:48:08.958566 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.958579 kubelet[3434]: W0307 00:48:08.958575 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958674 kubelet[3434]: E0307 00:48:08.958583 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.958713 kubelet[3434]: E0307 00:48:08.958701 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.958713 kubelet[3434]: W0307 00:48:08.958709 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958772 kubelet[3434]: E0307 00:48:08.958716 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.958917 kubelet[3434]: E0307 00:48:08.958904 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.958917 kubelet[3434]: W0307 00:48:08.958918 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.958975 kubelet[3434]: E0307 00:48:08.958925 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.959115 kubelet[3434]: E0307 00:48:08.959101 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959115 kubelet[3434]: W0307 00:48:08.959112 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959201 kubelet[3434]: E0307 00:48:08.959119 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.959303 kubelet[3434]: E0307 00:48:08.959289 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959303 kubelet[3434]: W0307 00:48:08.959299 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959371 kubelet[3434]: E0307 00:48:08.959307 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.959502 kubelet[3434]: E0307 00:48:08.959490 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959502 kubelet[3434]: W0307 00:48:08.959500 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959588 kubelet[3434]: E0307 00:48:08.959507 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.959630 kubelet[3434]: E0307 00:48:08.959619 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959630 kubelet[3434]: W0307 00:48:08.959626 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959738 kubelet[3434]: E0307 00:48:08.959632 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.959861 kubelet[3434]: E0307 00:48:08.959845 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959861 kubelet[3434]: W0307 00:48:08.959857 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959945 kubelet[3434]: E0307 00:48:08.959864 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.959975 kubelet[3434]: E0307 00:48:08.959964 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.959975 kubelet[3434]: W0307 00:48:08.959969 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.959975 kubelet[3434]: E0307 00:48:08.959975 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.960116 kubelet[3434]: E0307 00:48:08.960105 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960116 kubelet[3434]: W0307 00:48:08.960113 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.960175 kubelet[3434]: E0307 00:48:08.960121 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.960316 kubelet[3434]: E0307 00:48:08.960303 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960316 kubelet[3434]: W0307 00:48:08.960312 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.960388 kubelet[3434]: E0307 00:48:08.960319 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.960433 kubelet[3434]: E0307 00:48:08.960423 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960433 kubelet[3434]: W0307 00:48:08.960430 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.960480 kubelet[3434]: E0307 00:48:08.960436 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.960630 kubelet[3434]: E0307 00:48:08.960616 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960630 kubelet[3434]: W0307 00:48:08.960626 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.960705 kubelet[3434]: E0307 00:48:08.960634 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.960778 kubelet[3434]: E0307 00:48:08.960766 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960778 kubelet[3434]: W0307 00:48:08.960775 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.960893 kubelet[3434]: E0307 00:48:08.960781 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.960971 kubelet[3434]: E0307 00:48:08.960959 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.960971 kubelet[3434]: W0307 00:48:08.960967 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.961025 kubelet[3434]: E0307 00:48:08.960975 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.961327 kubelet[3434]: E0307 00:48:08.961243 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.961327 kubelet[3434]: W0307 00:48:08.961257 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.961327 kubelet[3434]: E0307 00:48:08.961267 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.961553 kubelet[3434]: E0307 00:48:08.961542 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.961622 kubelet[3434]: W0307 00:48:08.961611 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.961671 kubelet[3434]: E0307 00:48:08.961660 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:08.961879 kubelet[3434]: E0307 00:48:08.961862 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:08.961879 kubelet[3434]: W0307 00:48:08.961874 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:08.961938 kubelet[3434]: E0307 00:48:08.961890 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:08.996475 containerd[1905]: time="2026-03-07T00:48:08.996432314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-szw7z,Uid:462592d5-307f-4ca7-bf24-366a81b67901,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:09.030403 kubelet[3434]: E0307 00:48:09.030371 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:09.030403 kubelet[3434]: W0307 00:48:09.030393 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:09.030403 kubelet[3434]: E0307 00:48:09.030412 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:09.035806 containerd[1905]: time="2026-03-07T00:48:09.035762896Z" level=info msg="connecting to shim c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0" address="unix:///run/containerd/s/42e20cd812904ca32864b44da869f0df04cf0279915116f9bc3c04d79efdd27d" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:09.055348 systemd[1]: Started cri-containerd-c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0.scope - libcontainer container c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0. Mar 7 00:48:09.091174 containerd[1905]: time="2026-03-07T00:48:09.091131831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-szw7z,Uid:462592d5-307f-4ca7-bf24-366a81b67901,Namespace:calico-system,Attempt:0,} returns sandbox id \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\"" Mar 7 00:48:09.945196 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1312064789.mount: Deactivated successfully. 
Mar 7 00:48:10.117493 kubelet[3434]: E0307 00:48:10.117340 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:10.373629 containerd[1905]: time="2026-03-07T00:48:10.373573961Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:10.375954 containerd[1905]: time="2026-03-07T00:48:10.375925547Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 7 00:48:10.378743 containerd[1905]: time="2026-03-07T00:48:10.378701454Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:10.383122 containerd[1905]: time="2026-03-07T00:48:10.382696143Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:10.383122 containerd[1905]: time="2026-03-07T00:48:10.383014859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.572149201s" Mar 7 00:48:10.383122 containerd[1905]: time="2026-03-07T00:48:10.383039108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 7 00:48:10.384123 containerd[1905]: time="2026-03-07T00:48:10.384099197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 00:48:10.399479 containerd[1905]: time="2026-03-07T00:48:10.399442058Z" level=info msg="CreateContainer within sandbox \"952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 00:48:10.415038 containerd[1905]: time="2026-03-07T00:48:10.414993783Z" level=info msg="Container d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:10.437606 containerd[1905]: time="2026-03-07T00:48:10.437556866Z" level=info msg="CreateContainer within sandbox \"952afdf997b4373416069b25dd7bab34048b540b3194065f6a7b43082716b5d3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4\"" Mar 7 00:48:10.438629 containerd[1905]: time="2026-03-07T00:48:10.438479238Z" level=info msg="StartContainer for \"d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4\"" Mar 7 00:48:10.440145 containerd[1905]: time="2026-03-07T00:48:10.440115244Z" level=info msg="connecting to shim d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4" address="unix:///run/containerd/s/4f180d267798e0ee892dc1459f83432d34dd1edbcc463cc21213d9ef10fc2b49" protocol=ttrpc version=3 Mar 7 00:48:10.455344 systemd[1]: Started cri-containerd-d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4.scope - libcontainer container d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4. 
Mar 7 00:48:10.488771 containerd[1905]: time="2026-03-07T00:48:10.488735112Z" level=info msg="StartContainer for \"d19986d04dc5556cfd07122157c60b13542a241ee9716dc7d8638c8dc30521d4\" returns successfully" Mar 7 00:48:11.253349 kubelet[3434]: E0307 00:48:11.253315 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.254002 kubelet[3434]: W0307 00:48:11.253837 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.254002 kubelet[3434]: E0307 00:48:11.253870 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.254292 kubelet[3434]: E0307 00:48:11.254066 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.254292 kubelet[3434]: W0307 00:48:11.254080 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.254292 kubelet[3434]: E0307 00:48:11.254112 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.254468 kubelet[3434]: E0307 00:48:11.254404 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.254468 kubelet[3434]: W0307 00:48:11.254414 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.254468 kubelet[3434]: E0307 00:48:11.254424 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.254729 kubelet[3434]: E0307 00:48:11.254686 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.254729 kubelet[3434]: W0307 00:48:11.254697 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.254729 kubelet[3434]: E0307 00:48:11.254708 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.255032 kubelet[3434]: E0307 00:48:11.255007 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.255032 kubelet[3434]: W0307 00:48:11.255018 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.255211 kubelet[3434]: E0307 00:48:11.255107 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.255424 kubelet[3434]: E0307 00:48:11.255374 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.255424 kubelet[3434]: W0307 00:48:11.255385 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.255424 kubelet[3434]: E0307 00:48:11.255394 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.255660 kubelet[3434]: E0307 00:48:11.255632 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.255660 kubelet[3434]: W0307 00:48:11.255643 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.255772 kubelet[3434]: E0307 00:48:11.255727 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.256013 kubelet[3434]: E0307 00:48:11.255990 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.256140 kubelet[3434]: W0307 00:48:11.256002 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.256140 kubelet[3434]: E0307 00:48:11.256091 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.256395 kubelet[3434]: E0307 00:48:11.256364 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.256395 kubelet[3434]: W0307 00:48:11.256375 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.256395 kubelet[3434]: E0307 00:48:11.256385 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.256635 kubelet[3434]: E0307 00:48:11.256621 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.256744 kubelet[3434]: W0307 00:48:11.256699 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.256744 kubelet[3434]: E0307 00:48:11.256714 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.256973 kubelet[3434]: E0307 00:48:11.256929 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.256973 kubelet[3434]: W0307 00:48:11.256939 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.256973 kubelet[3434]: E0307 00:48:11.256947 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.257270 kubelet[3434]: E0307 00:48:11.257211 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.257270 kubelet[3434]: W0307 00:48:11.257222 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.257270 kubelet[3434]: E0307 00:48:11.257231 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.257606 kubelet[3434]: E0307 00:48:11.257529 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.257606 kubelet[3434]: W0307 00:48:11.257557 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.257606 kubelet[3434]: E0307 00:48:11.257568 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.257825 kubelet[3434]: E0307 00:48:11.257815 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.257979 kubelet[3434]: W0307 00:48:11.257884 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.257979 kubelet[3434]: E0307 00:48:11.257898 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.258097 kubelet[3434]: E0307 00:48:11.258088 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.258144 kubelet[3434]: W0307 00:48:11.258135 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.258254 kubelet[3434]: E0307 00:48:11.258173 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.272490 kubelet[3434]: E0307 00:48:11.272474 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.272615 kubelet[3434]: W0307 00:48:11.272577 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.272615 kubelet[3434]: E0307 00:48:11.272597 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.272883 kubelet[3434]: E0307 00:48:11.272854 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.272883 kubelet[3434]: W0307 00:48:11.272865 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.272883 kubelet[3434]: E0307 00:48:11.272873 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.273151 kubelet[3434]: E0307 00:48:11.273140 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.273318 kubelet[3434]: W0307 00:48:11.273212 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.273318 kubelet[3434]: E0307 00:48:11.273225 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.273649 kubelet[3434]: E0307 00:48:11.273547 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.273649 kubelet[3434]: W0307 00:48:11.273560 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.273649 kubelet[3434]: E0307 00:48:11.273569 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.273783 kubelet[3434]: E0307 00:48:11.273772 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.273879 kubelet[3434]: W0307 00:48:11.273822 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.273879 kubelet[3434]: E0307 00:48:11.273834 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.274179 kubelet[3434]: E0307 00:48:11.274149 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.274179 kubelet[3434]: W0307 00:48:11.274159 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.274179 kubelet[3434]: E0307 00:48:11.274168 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.274502 kubelet[3434]: E0307 00:48:11.274484 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.274619 kubelet[3434]: W0307 00:48:11.274495 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.274619 kubelet[3434]: E0307 00:48:11.274577 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.274854 kubelet[3434]: E0307 00:48:11.274825 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.274854 kubelet[3434]: W0307 00:48:11.274836 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.274854 kubelet[3434]: E0307 00:48:11.274845 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.275154 kubelet[3434]: E0307 00:48:11.275125 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.275154 kubelet[3434]: W0307 00:48:11.275136 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.275154 kubelet[3434]: E0307 00:48:11.275144 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.275584 kubelet[3434]: E0307 00:48:11.275555 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.275584 kubelet[3434]: W0307 00:48:11.275566 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.275584 kubelet[3434]: E0307 00:48:11.275574 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.275878 kubelet[3434]: E0307 00:48:11.275847 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.275878 kubelet[3434]: W0307 00:48:11.275859 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.275878 kubelet[3434]: E0307 00:48:11.275867 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.276178 kubelet[3434]: E0307 00:48:11.276148 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.276178 kubelet[3434]: W0307 00:48:11.276160 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.276178 kubelet[3434]: E0307 00:48:11.276168 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.276491 kubelet[3434]: E0307 00:48:11.276473 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.276644 kubelet[3434]: W0307 00:48:11.276549 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.276644 kubelet[3434]: E0307 00:48:11.276566 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.276867 kubelet[3434]: E0307 00:48:11.276843 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.276867 kubelet[3434]: W0307 00:48:11.276854 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.277020 kubelet[3434]: E0307 00:48:11.276941 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.277533 kubelet[3434]: E0307 00:48:11.277415 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.277533 kubelet[3434]: W0307 00:48:11.277431 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.277533 kubelet[3434]: E0307 00:48:11.277439 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.277686 kubelet[3434]: E0307 00:48:11.277674 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.277774 kubelet[3434]: W0307 00:48:11.277762 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.277839 kubelet[3434]: E0307 00:48:11.277810 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.278096 kubelet[3434]: E0307 00:48:11.278071 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.278096 kubelet[3434]: W0307 00:48:11.278081 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.278236 kubelet[3434]: E0307 00:48:11.278169 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:48:11.278605 kubelet[3434]: E0307 00:48:11.278558 3434 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:48:11.278605 kubelet[3434]: W0307 00:48:11.278569 3434 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:48:11.278605 kubelet[3434]: E0307 00:48:11.278578 3434 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:48:11.572210 containerd[1905]: time="2026-03-07T00:48:11.571622024Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:11.574547 containerd[1905]: time="2026-03-07T00:48:11.574521072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 7 00:48:11.576899 containerd[1905]: time="2026-03-07T00:48:11.576857345Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:11.587834 containerd[1905]: time="2026-03-07T00:48:11.587751676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:11.588469 containerd[1905]: time="2026-03-07T00:48:11.588407237Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.204281119s" Mar 7 00:48:11.588469 containerd[1905]: time="2026-03-07T00:48:11.588435422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 7 00:48:11.633894 containerd[1905]: time="2026-03-07T00:48:11.633834622Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 00:48:11.655211 containerd[1905]: time="2026-03-07T00:48:11.654430701Z" level=info msg="Container 7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:11.656981 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1993179859.mount: Deactivated successfully. 
Mar 7 00:48:11.676335 containerd[1905]: time="2026-03-07T00:48:11.676283036Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f\"" Mar 7 00:48:11.678248 containerd[1905]: time="2026-03-07T00:48:11.677475970Z" level=info msg="StartContainer for \"7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f\"" Mar 7 00:48:11.679159 containerd[1905]: time="2026-03-07T00:48:11.679088592Z" level=info msg="connecting to shim 7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f" address="unix:///run/containerd/s/42e20cd812904ca32864b44da869f0df04cf0279915116f9bc3c04d79efdd27d" protocol=ttrpc version=3 Mar 7 00:48:11.701353 systemd[1]: Started cri-containerd-7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f.scope - libcontainer container 7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f. Mar 7 00:48:11.751489 containerd[1905]: time="2026-03-07T00:48:11.751376744Z" level=info msg="StartContainer for \"7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f\" returns successfully" Mar 7 00:48:11.757066 systemd[1]: cri-containerd-7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f.scope: Deactivated successfully. Mar 7 00:48:11.760292 containerd[1905]: time="2026-03-07T00:48:11.760250293Z" level=info msg="received container exit event container_id:\"7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f\" id:\"7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f\" pid:4070 exited_at:{seconds:1772844491 nanos:759859174}" Mar 7 00:48:11.784019 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7e34e0d17601f25b0d9ba6074d450bb15d431f1c0ff5ec09d56231a1789be48f-rootfs.mount: Deactivated successfully. 
Mar 7 00:48:12.118167 kubelet[3434]: E0307 00:48:12.118113 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:12.202215 kubelet[3434]: I0307 00:48:12.201796 3434 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:48:12.219203 kubelet[3434]: I0307 00:48:12.218940 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-858cf75778-lqsxp" podStartSLOduration=2.645269877 podStartE2EDuration="4.218923775s" podCreationTimestamp="2026-03-07 00:48:08 +0000 UTC" firstStartedPulling="2026-03-07 00:48:08.81032783 +0000 UTC m=+25.797639978" lastFinishedPulling="2026-03-07 00:48:10.38398172 +0000 UTC m=+27.371293876" observedRunningTime="2026-03-07 00:48:11.210380406 +0000 UTC m=+28.197692554" watchObservedRunningTime="2026-03-07 00:48:12.218923775 +0000 UTC m=+29.206235923" Mar 7 00:48:13.207482 containerd[1905]: time="2026-03-07T00:48:13.206897817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 00:48:14.117378 kubelet[3434]: E0307 00:48:14.117331 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:16.117621 kubelet[3434]: E0307 00:48:16.117568 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" 
podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:17.044180 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1590426915.mount: Deactivated successfully. Mar 7 00:48:18.117881 kubelet[3434]: E0307 00:48:18.117825 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:20.117708 kubelet[3434]: E0307 00:48:20.117650 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:22.117534 kubelet[3434]: E0307 00:48:22.117478 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:23.659467 containerd[1905]: time="2026-03-07T00:48:23.659402706Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:23.662435 containerd[1905]: time="2026-03-07T00:48:23.662396148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 7 00:48:23.665059 containerd[1905]: time="2026-03-07T00:48:23.665005472Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:23.668265 
containerd[1905]: time="2026-03-07T00:48:23.668216179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:23.668795 containerd[1905]: time="2026-03-07T00:48:23.668482629Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 10.460599086s" Mar 7 00:48:23.668795 containerd[1905]: time="2026-03-07T00:48:23.668510630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 7 00:48:23.674848 containerd[1905]: time="2026-03-07T00:48:23.674820648Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 00:48:23.696849 containerd[1905]: time="2026-03-07T00:48:23.696725774Z" level=info msg="Container 4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:23.715448 containerd[1905]: time="2026-03-07T00:48:23.715410121Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\"" Mar 7 00:48:23.717524 containerd[1905]: time="2026-03-07T00:48:23.717324083Z" level=info msg="StartContainer for \"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\"" Mar 7 00:48:23.719261 containerd[1905]: 
time="2026-03-07T00:48:23.719170081Z" level=info msg="connecting to shim 4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf" address="unix:///run/containerd/s/42e20cd812904ca32864b44da869f0df04cf0279915116f9bc3c04d79efdd27d" protocol=ttrpc version=3 Mar 7 00:48:23.739316 systemd[1]: Started cri-containerd-4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf.scope - libcontainer container 4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf. Mar 7 00:48:23.799211 containerd[1905]: time="2026-03-07T00:48:23.799160471Z" level=info msg="StartContainer for \"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" returns successfully" Mar 7 00:48:23.825061 systemd[1]: cri-containerd-4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf.scope: Deactivated successfully. Mar 7 00:48:23.825599 containerd[1905]: time="2026-03-07T00:48:23.825279295Z" level=info msg="received container exit event container_id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" pid:4126 exited_at:{seconds:1772844503 nanos:824930522}" Mar 7 00:48:23.844173 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf-rootfs.mount: Deactivated successfully. 
Mar 7 00:48:24.354786 kubelet[3434]: E0307 00:48:24.117395 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:26.117512 kubelet[3434]: E0307 00:48:26.117457 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:28.117577 kubelet[3434]: E0307 00:48:28.117259 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:28.817834 kubelet[3434]: I0307 00:48:28.817461 3434 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:48:30.118227 kubelet[3434]: E0307 00:48:30.118152 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:32.117617 kubelet[3434]: E0307 00:48:32.117562 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" 
podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:33.826418 containerd[1905]: time="2026-03-07T00:48:33.826353422Z" level=error msg="failed to handle container TaskExit event container_id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" pid:4126 exited_at:{seconds:1772844503 nanos:824930522}" error="failed to stop container: failed to delete task: context deadline exceeded" Mar 7 00:48:34.465019 kubelet[3434]: E0307 00:48:34.117913 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:34.470667 containerd[1905]: time="2026-03-07T00:48:34.470606172Z" level=error msg="ttrpc: received message on inactive stream" stream=31 Mar 7 00:48:35.745927 containerd[1905]: time="2026-03-07T00:48:35.745867511Z" level=info msg="TaskExit event container_id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" id:\"4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf\" pid:4126 exited_at:{seconds:1772844503 nanos:824930522}" Mar 7 00:48:35.747663 containerd[1905]: time="2026-03-07T00:48:35.747633650Z" level=info msg="Ensure that container 4fcbd1c202483dca78665fb68b51f4ceaf9c8fc865a0f32ca5892e026e1c1cdf in task-service has been cleanup successfully" Mar 7 00:48:36.117296 kubelet[3434]: E0307 00:48:36.117119 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:36.253457 containerd[1905]: 
time="2026-03-07T00:48:36.253416118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 00:48:38.118218 kubelet[3434]: E0307 00:48:38.118135 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:38.268569 containerd[1905]: time="2026-03-07T00:48:38.268403834Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:38.271602 containerd[1905]: time="2026-03-07T00:48:38.271567835Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 7 00:48:38.274541 containerd[1905]: time="2026-03-07T00:48:38.274496915Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:38.279229 containerd[1905]: time="2026-03-07T00:48:38.278554917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:38.279229 containerd[1905]: time="2026-03-07T00:48:38.278998286Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.025550191s" Mar 7 00:48:38.279229 containerd[1905]: time="2026-03-07T00:48:38.279019239Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 7 00:48:38.287047 containerd[1905]: time="2026-03-07T00:48:38.287019016Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 00:48:38.306403 containerd[1905]: time="2026-03-07T00:48:38.306367769Z" level=info msg="Container 53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:38.309665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount791269980.mount: Deactivated successfully. Mar 7 00:48:38.325410 containerd[1905]: time="2026-03-07T00:48:38.325350612Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798\"" Mar 7 00:48:38.326109 containerd[1905]: time="2026-03-07T00:48:38.326090360Z" level=info msg="StartContainer for \"53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798\"" Mar 7 00:48:38.327451 containerd[1905]: time="2026-03-07T00:48:38.327420171Z" level=info msg="connecting to shim 53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798" address="unix:///run/containerd/s/42e20cd812904ca32864b44da869f0df04cf0279915116f9bc3c04d79efdd27d" protocol=ttrpc version=3 Mar 7 00:48:38.346320 systemd[1]: Started cri-containerd-53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798.scope - libcontainer container 53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798. 
Mar 7 00:48:38.404289 containerd[1905]: time="2026-03-07T00:48:38.404168790Z" level=info msg="StartContainer for \"53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798\" returns successfully" Mar 7 00:48:39.594123 systemd[1]: cri-containerd-53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798.scope: Deactivated successfully. Mar 7 00:48:39.594387 systemd[1]: cri-containerd-53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798.scope: Consumed 352ms CPU time, 200.1M memory peak, 171.3M written to disk. Mar 7 00:48:39.596365 containerd[1905]: time="2026-03-07T00:48:39.596325324Z" level=info msg="received container exit event container_id:\"53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798\" id:\"53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798\" pid:4186 exited_at:{seconds:1772844519 nanos:595852474}" Mar 7 00:48:39.614475 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-53bb70e64541b044ca6b66e51d7e6547255f8e2a3bea7f93ea045a73b55e6798-rootfs.mount: Deactivated successfully. Mar 7 00:48:39.689992 kubelet[3434]: I0307 00:48:39.687283 3434 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 7 00:48:40.527043 systemd[1]: Created slice kubepods-besteffort-pod4deed308_3e60_4e75_bf96_2c78a050ad12.slice - libcontainer container kubepods-besteffort-pod4deed308_3e60_4e75_bf96_2c78a050ad12.slice. Mar 7 00:48:40.539084 systemd[1]: Created slice kubepods-besteffort-pod2d4fe93f_ae81_4273_9c62_18e31d256fca.slice - libcontainer container kubepods-besteffort-pod2d4fe93f_ae81_4273_9c62_18e31d256fca.slice. Mar 7 00:48:40.550954 systemd[1]: Created slice kubepods-burstable-poda7c3eef4_1dc3_4b1b_808a_fffb6d8090ec.slice - libcontainer container kubepods-burstable-poda7c3eef4_1dc3_4b1b_808a_fffb6d8090ec.slice. 
Mar 7 00:48:40.552929 kubelet[3434]: I0307 00:48:40.552904 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4deed308-3e60-4e75-bf96-2c78a050ad12-tigera-ca-bundle\") pod \"calico-kube-controllers-7667f774f9-5sqxh\" (UID: \"4deed308-3e60-4e75-bf96-2c78a050ad12\") " pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" Mar 7 00:48:40.553042 kubelet[3434]: I0307 00:48:40.553028 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9h9\" (UniqueName: \"kubernetes.io/projected/4deed308-3e60-4e75-bf96-2c78a050ad12-kube-api-access-qn9h9\") pod \"calico-kube-controllers-7667f774f9-5sqxh\" (UID: \"4deed308-3e60-4e75-bf96-2c78a050ad12\") " pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" Mar 7 00:48:40.553111 kubelet[3434]: I0307 00:48:40.553101 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec-config-volume\") pod \"coredns-66bc5c9577-v9vbm\" (UID: \"a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec\") " pod="kube-system/coredns-66bc5c9577-v9vbm" Mar 7 00:48:40.553180 kubelet[3434]: I0307 00:48:40.553169 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhbw\" (UniqueName: \"kubernetes.io/projected/a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec-kube-api-access-nxhbw\") pod \"coredns-66bc5c9577-v9vbm\" (UID: \"a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec\") " pod="kube-system/coredns-66bc5c9577-v9vbm" Mar 7 00:48:40.555153 containerd[1905]: time="2026-03-07T00:48:40.555119282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s56zd,Uid:2d4fe93f-ae81-4273-9c62-18e31d256fca,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.558982 systemd[1]: Created slice 
kubepods-besteffort-pod394e811f_ffeb_4807_804b_bb1ee6e95381.slice - libcontainer container kubepods-besteffort-pod394e811f_ffeb_4807_804b_bb1ee6e95381.slice. Mar 7 00:48:40.573083 systemd[1]: Created slice kubepods-besteffort-pod912e50ec_c7a9_464f_8c19_f8fe3bd6f8f8.slice - libcontainer container kubepods-besteffort-pod912e50ec_c7a9_464f_8c19_f8fe3bd6f8f8.slice. Mar 7 00:48:40.588003 systemd[1]: Created slice kubepods-burstable-podf8bdc5d6_1197_42d6_a80f_f36a8fb0cade.slice - libcontainer container kubepods-burstable-podf8bdc5d6_1197_42d6_a80f_f36a8fb0cade.slice. Mar 7 00:48:40.598875 systemd[1]: Created slice kubepods-besteffort-pod4f6c5938_63cd_4a96_b97e_7bc139a51dd3.slice - libcontainer container kubepods-besteffort-pod4f6c5938_63cd_4a96_b97e_7bc139a51dd3.slice. Mar 7 00:48:40.606153 systemd[1]: Created slice kubepods-besteffort-pod231b476a_62e7_421f_b6e5_1c8fe696c6e7.slice - libcontainer container kubepods-besteffort-pod231b476a_62e7_421f_b6e5_1c8fe696c6e7.slice. Mar 7 00:48:40.639506 containerd[1905]: time="2026-03-07T00:48:40.639457433Z" level=error msg="Failed to destroy network for sandbox \"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.641680 systemd[1]: run-netns-cni\x2d98e37e09\x2d1c58\x2d395f\x2d2066\x2ddea770968cc2.mount: Deactivated successfully. 
Mar 7 00:48:40.645660 containerd[1905]: time="2026-03-07T00:48:40.645585082Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s56zd,Uid:2d4fe93f-ae81-4273-9c62-18e31d256fca,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.648213 kubelet[3434]: E0307 00:48:40.647952 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.648213 kubelet[3434]: E0307 00:48:40.648023 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s56zd" Mar 7 00:48:40.648213 kubelet[3434]: E0307 00:48:40.648041 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s56zd" Mar 7 
00:48:40.648414 kubelet[3434]: E0307 00:48:40.648086 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s56zd_calico-system(2d4fe93f-ae81-4273-9c62-18e31d256fca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s56zd_calico-system(2d4fe93f-ae81-4273-9c62-18e31d256fca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b36b73cb400c73113d9c4e0c34abe766158fe0ac76dfafa401b34ce729d3235a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s56zd" podUID="2d4fe93f-ae81-4273-9c62-18e31d256fca" Mar 7 00:48:40.654391 kubelet[3434]: I0307 00:48:40.654359 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxw8\" (UniqueName: \"kubernetes.io/projected/f8bdc5d6-1197-42d6-a80f-f36a8fb0cade-kube-api-access-nnxw8\") pod \"coredns-66bc5c9577-rz88n\" (UID: \"f8bdc5d6-1197-42d6-a80f-f36a8fb0cade\") " pod="kube-system/coredns-66bc5c9577-rz88n" Mar 7 00:48:40.654530 kubelet[3434]: I0307 00:48:40.654517 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231b476a-62e7-421f-b6e5-1c8fe696c6e7-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-n5q4d\" (UID: \"231b476a-62e7-421f-b6e5-1c8fe696c6e7\") " pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:40.654609 kubelet[3434]: I0307 00:48:40.654597 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4djl\" (UniqueName: \"kubernetes.io/projected/231b476a-62e7-421f-b6e5-1c8fe696c6e7-kube-api-access-k4djl\") pod \"goldmane-cccfbd5cf-n5q4d\" (UID: \"231b476a-62e7-421f-b6e5-1c8fe696c6e7\") " 
pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:40.655005 kubelet[3434]: I0307 00:48:40.654667 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4f6c5938-63cd-4a96-b97e-7bc139a51dd3-calico-apiserver-certs\") pod \"calico-apiserver-5fd9487db-c7nk2\" (UID: \"4f6c5938-63cd-4a96-b97e-7bc139a51dd3\") " pod="calico-system/calico-apiserver-5fd9487db-c7nk2" Mar 7 00:48:40.655124 kubelet[3434]: I0307 00:48:40.655108 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlbh2\" (UniqueName: \"kubernetes.io/projected/4f6c5938-63cd-4a96-b97e-7bc139a51dd3-kube-api-access-vlbh2\") pod \"calico-apiserver-5fd9487db-c7nk2\" (UID: \"4f6c5938-63cd-4a96-b97e-7bc139a51dd3\") " pod="calico-system/calico-apiserver-5fd9487db-c7nk2" Mar 7 00:48:40.655225 kubelet[3434]: I0307 00:48:40.655206 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/394e811f-ffeb-4807-804b-bb1ee6e95381-calico-apiserver-certs\") pod \"calico-apiserver-5fd9487db-vv4jn\" (UID: \"394e811f-ffeb-4807-804b-bb1ee6e95381\") " pod="calico-system/calico-apiserver-5fd9487db-vv4jn" Mar 7 00:48:40.655367 kubelet[3434]: I0307 00:48:40.655310 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8bdc5d6-1197-42d6-a80f-f36a8fb0cade-config-volume\") pod \"coredns-66bc5c9577-rz88n\" (UID: \"f8bdc5d6-1197-42d6-a80f-f36a8fb0cade\") " pod="kube-system/coredns-66bc5c9577-rz88n" Mar 7 00:48:40.655416 kubelet[3434]: I0307 00:48:40.655393 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpwt\" (UniqueName: 
\"kubernetes.io/projected/394e811f-ffeb-4807-804b-bb1ee6e95381-kube-api-access-nmpwt\") pod \"calico-apiserver-5fd9487db-vv4jn\" (UID: \"394e811f-ffeb-4807-804b-bb1ee6e95381\") " pod="calico-system/calico-apiserver-5fd9487db-vv4jn" Mar 7 00:48:40.656207 kubelet[3434]: I0307 00:48:40.655454 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/231b476a-62e7-421f-b6e5-1c8fe696c6e7-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-n5q4d\" (UID: \"231b476a-62e7-421f-b6e5-1c8fe696c6e7\") " pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:40.656207 kubelet[3434]: I0307 00:48:40.655468 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-nginx-config\") pod \"whisker-5dbd7d4d68-gbxd7\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:40.656207 kubelet[3434]: I0307 00:48:40.655509 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-backend-key-pair\") pod \"whisker-5dbd7d4d68-gbxd7\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:40.656795 kubelet[3434]: I0307 00:48:40.656535 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b476a-62e7-421f-b6e5-1c8fe696c6e7-config\") pod \"goldmane-cccfbd5cf-n5q4d\" (UID: \"231b476a-62e7-421f-b6e5-1c8fe696c6e7\") " pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:40.656795 kubelet[3434]: I0307 00:48:40.656557 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-ca-bundle\") pod \"whisker-5dbd7d4d68-gbxd7\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:40.656795 kubelet[3434]: I0307 00:48:40.656568 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwz28\" (UniqueName: \"kubernetes.io/projected/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-kube-api-access-hwz28\") pod \"whisker-5dbd7d4d68-gbxd7\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:40.839349 containerd[1905]: time="2026-03-07T00:48:40.839240187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7667f774f9-5sqxh,Uid:4deed308-3e60-4e75-bf96-2c78a050ad12,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.861203 containerd[1905]: time="2026-03-07T00:48:40.860856743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v9vbm,Uid:a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec,Namespace:kube-system,Attempt:0,}" Mar 7 00:48:40.875951 containerd[1905]: time="2026-03-07T00:48:40.875916707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-vv4jn,Uid:394e811f-ffeb-4807-804b-bb1ee6e95381,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.886920 containerd[1905]: time="2026-03-07T00:48:40.886885539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dbd7d4d68-gbxd7,Uid:912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.890130 containerd[1905]: time="2026-03-07T00:48:40.890093052Z" level=error msg="Failed to destroy network for sandbox \"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Mar 7 00:48:40.896591 containerd[1905]: time="2026-03-07T00:48:40.896494847Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7667f774f9-5sqxh,Uid:4deed308-3e60-4e75-bf96-2c78a050ad12,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.897360 kubelet[3434]: E0307 00:48:40.897277 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.897360 kubelet[3434]: E0307 00:48:40.897328 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" Mar 7 00:48:40.897360 kubelet[3434]: E0307 00:48:40.897346 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" Mar 7 00:48:40.897690 kubelet[3434]: E0307 00:48:40.897380 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7667f774f9-5sqxh_calico-system(4deed308-3e60-4e75-bf96-2c78a050ad12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7667f774f9-5sqxh_calico-system(4deed308-3e60-4e75-bf96-2c78a050ad12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9aa7fd29c368a72150856fc5afb749ce04a611ac070f071a9050238b3c126f6b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" podUID="4deed308-3e60-4e75-bf96-2c78a050ad12" Mar 7 00:48:40.902198 containerd[1905]: time="2026-03-07T00:48:40.901930365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rz88n,Uid:f8bdc5d6-1197-42d6-a80f-f36a8fb0cade,Namespace:kube-system,Attempt:0,}" Mar 7 00:48:40.914124 containerd[1905]: time="2026-03-07T00:48:40.914091907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-c7nk2,Uid:4f6c5938-63cd-4a96-b97e-7bc139a51dd3,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.919512 containerd[1905]: time="2026-03-07T00:48:40.919443942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-n5q4d,Uid:231b476a-62e7-421f-b6e5-1c8fe696c6e7,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:40.956915 containerd[1905]: time="2026-03-07T00:48:40.956809247Z" level=error msg="Failed to destroy network for sandbox \"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.961830 containerd[1905]: time="2026-03-07T00:48:40.961445975Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v9vbm,Uid:a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.961970 kubelet[3434]: E0307 00:48:40.961666 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.961970 kubelet[3434]: E0307 00:48:40.961716 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-v9vbm" Mar 7 00:48:40.961970 kubelet[3434]: E0307 00:48:40.961736 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-66bc5c9577-v9vbm" Mar 7 00:48:40.964063 kubelet[3434]: E0307 00:48:40.961783 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-v9vbm_kube-system(a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-v9vbm_kube-system(a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17054bb9a12c3a8f9fd4d5c059b9c871dfdbe45498342cc618400542b4569c66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-v9vbm" podUID="a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec" Mar 7 00:48:40.984015 containerd[1905]: time="2026-03-07T00:48:40.983968181Z" level=error msg="Failed to destroy network for sandbox \"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.988998 containerd[1905]: time="2026-03-07T00:48:40.988935842Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-vv4jn,Uid:394e811f-ffeb-4807-804b-bb1ee6e95381,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.989338 kubelet[3434]: E0307 00:48:40.989288 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:40.989432 kubelet[3434]: E0307 00:48:40.989344 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5fd9487db-vv4jn" Mar 7 00:48:40.989432 kubelet[3434]: E0307 00:48:40.989359 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5fd9487db-vv4jn" Mar 7 00:48:40.989432 kubelet[3434]: E0307 00:48:40.989408 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5fd9487db-vv4jn_calico-system(394e811f-ffeb-4807-804b-bb1ee6e95381)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5fd9487db-vv4jn_calico-system(394e811f-ffeb-4807-804b-bb1ee6e95381)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d6b317e503bef579387fb544a8ee3e44040f8acf5f283eeea3acb8d5f0d6b7b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-apiserver-5fd9487db-vv4jn" podUID="394e811f-ffeb-4807-804b-bb1ee6e95381" Mar 7 00:48:41.006057 containerd[1905]: time="2026-03-07T00:48:41.006012313Z" level=error msg="Failed to destroy network for sandbox \"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.007320 containerd[1905]: time="2026-03-07T00:48:41.007287314Z" level=error msg="Failed to destroy network for sandbox \"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.009983 containerd[1905]: time="2026-03-07T00:48:41.009819994Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dbd7d4d68-gbxd7,Uid:912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.010489 kubelet[3434]: E0307 00:48:41.010285 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.010489 kubelet[3434]: E0307 00:48:41.010338 3434 
kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:41.010489 kubelet[3434]: E0307 00:48:41.010353 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dbd7d4d68-gbxd7" Mar 7 00:48:41.010606 kubelet[3434]: E0307 00:48:41.010399 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5dbd7d4d68-gbxd7_calico-system(912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5dbd7d4d68-gbxd7_calico-system(912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76dd181ad155d5c50839332e664af80ddce093ee8005c05910155181c9c7c35f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5dbd7d4d68-gbxd7" podUID="912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" Mar 7 00:48:41.013507 containerd[1905]: time="2026-03-07T00:48:41.013458916Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rz88n,Uid:f8bdc5d6-1197-42d6-a80f-f36a8fb0cade,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.014465 kubelet[3434]: E0307 00:48:41.013780 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.014465 kubelet[3434]: E0307 00:48:41.013912 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rz88n" Mar 7 00:48:41.014465 kubelet[3434]: E0307 00:48:41.013935 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rz88n" Mar 7 00:48:41.014593 kubelet[3434]: E0307 00:48:41.013981 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-rz88n_kube-system(f8bdc5d6-1197-42d6-a80f-f36a8fb0cade)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"coredns-66bc5c9577-rz88n_kube-system(f8bdc5d6-1197-42d6-a80f-f36a8fb0cade)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d74c507c566f42d1fc1b56effd406d5940ab0dbec1760343c25af18f7f6e1a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-rz88n" podUID="f8bdc5d6-1197-42d6-a80f-f36a8fb0cade" Mar 7 00:48:41.015002 containerd[1905]: time="2026-03-07T00:48:41.014861777Z" level=error msg="Failed to destroy network for sandbox \"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.018059 containerd[1905]: time="2026-03-07T00:48:41.018029593Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-c7nk2,Uid:4f6c5938-63cd-4a96-b97e-7bc139a51dd3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.018463 kubelet[3434]: E0307 00:48:41.018343 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.018463 kubelet[3434]: E0307 00:48:41.018376 
3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5fd9487db-c7nk2" Mar 7 00:48:41.018463 kubelet[3434]: E0307 00:48:41.018393 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5fd9487db-c7nk2" Mar 7 00:48:41.018558 kubelet[3434]: E0307 00:48:41.018432 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5fd9487db-c7nk2_calico-system(4f6c5938-63cd-4a96-b97e-7bc139a51dd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5fd9487db-c7nk2_calico-system(4f6c5938-63cd-4a96-b97e-7bc139a51dd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad3f6114534d3b9c538e8247271876154f1beed7c6466523ef24c4d7c103c7a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5fd9487db-c7nk2" podUID="4f6c5938-63cd-4a96-b97e-7bc139a51dd3" Mar 7 00:48:41.032112 containerd[1905]: time="2026-03-07T00:48:41.032079062Z" level=error msg="Failed to destroy network for sandbox \"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.035636 containerd[1905]: time="2026-03-07T00:48:41.035596828Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-n5q4d,Uid:231b476a-62e7-421f-b6e5-1c8fe696c6e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.035850 kubelet[3434]: E0307 00:48:41.035816 3434 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:48:41.035916 kubelet[3434]: E0307 00:48:41.035865 3434 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:41.035916 kubelet[3434]: E0307 00:48:41.035883 3434 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-n5q4d" Mar 7 00:48:41.035960 kubelet[3434]: E0307 00:48:41.035938 3434 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-n5q4d_calico-system(231b476a-62e7-421f-b6e5-1c8fe696c6e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-n5q4d_calico-system(231b476a-62e7-421f-b6e5-1c8fe696c6e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ea71b8272651a1ce6f795aed65ebd5cc5f9a5125e0ead2d22ad82d396fd2cc37\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-n5q4d" podUID="231b476a-62e7-421f-b6e5-1c8fe696c6e7" Mar 7 00:48:41.281362 containerd[1905]: time="2026-03-07T00:48:41.281314580Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 00:48:41.299268 containerd[1905]: time="2026-03-07T00:48:41.298610308Z" level=info msg="Container 92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:41.317237 containerd[1905]: time="2026-03-07T00:48:41.317160284Z" level=info msg="CreateContainer within sandbox \"c8f2c755fdae1249b1586e0adff1f46b90b7938567a154691cbedded1079a9c0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2\"" Mar 7 00:48:41.318390 containerd[1905]: time="2026-03-07T00:48:41.318352201Z" level=info msg="StartContainer for \"92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2\"" Mar 7 00:48:41.320894 
containerd[1905]: time="2026-03-07T00:48:41.320822711Z" level=info msg="connecting to shim 92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2" address="unix:///run/containerd/s/42e20cd812904ca32864b44da869f0df04cf0279915116f9bc3c04d79efdd27d" protocol=ttrpc version=3 Mar 7 00:48:41.339458 systemd[1]: Started cri-containerd-92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2.scope - libcontainer container 92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2. Mar 7 00:48:41.412198 containerd[1905]: time="2026-03-07T00:48:41.412144519Z" level=info msg="StartContainer for \"92be73b9adbc355778debe23bafedd0cdd08e767bb2c858ab71c8172426222e2\" returns successfully" Mar 7 00:48:41.666283 kubelet[3434]: I0307 00:48:41.665692 3434 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-backend-key-pair\") pod \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " Mar 7 00:48:41.666283 kubelet[3434]: I0307 00:48:41.665760 3434 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-nginx-config\") pod \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " Mar 7 00:48:41.666283 kubelet[3434]: I0307 00:48:41.665776 3434 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwz28\" (UniqueName: \"kubernetes.io/projected/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-kube-api-access-hwz28\") pod \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " Mar 7 00:48:41.666283 kubelet[3434]: I0307 00:48:41.665786 3434 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-ca-bundle\") pod \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\" (UID: \"912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8\") " Mar 7 00:48:41.666283 kubelet[3434]: I0307 00:48:41.666140 3434 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" (UID: "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:48:41.667091 kubelet[3434]: I0307 00:48:41.667064 3434 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" (UID: "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:48:41.670338 systemd[1]: var-lib-kubelet-pods-912e50ec\x2dc7a9\x2d464f\x2d8c19\x2df8fe3bd6f8f8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhwz28.mount: Deactivated successfully. Mar 7 00:48:41.671206 kubelet[3434]: I0307 00:48:41.671007 3434 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" (UID: "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 00:48:41.671158 systemd[1]: var-lib-kubelet-pods-912e50ec\x2dc7a9\x2d464f\x2d8c19\x2df8fe3bd6f8f8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 7 00:48:41.673328 kubelet[3434]: I0307 00:48:41.673289 3434 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-kube-api-access-hwz28" (OuterVolumeSpecName: "kube-api-access-hwz28") pod "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" (UID: "912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8"). InnerVolumeSpecName "kube-api-access-hwz28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 00:48:41.766528 kubelet[3434]: I0307 00:48:41.766452 3434 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-backend-key-pair\") on node \"ci-4459.2.3-n-801efb9c04\" DevicePath \"\"" Mar 7 00:48:41.766528 kubelet[3434]: I0307 00:48:41.766484 3434 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-nginx-config\") on node \"ci-4459.2.3-n-801efb9c04\" DevicePath \"\"" Mar 7 00:48:41.766528 kubelet[3434]: I0307 00:48:41.766491 3434 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwz28\" (UniqueName: \"kubernetes.io/projected/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-kube-api-access-hwz28\") on node \"ci-4459.2.3-n-801efb9c04\" DevicePath \"\"" Mar 7 00:48:41.766528 kubelet[3434]: I0307 00:48:41.766497 3434 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8-whisker-ca-bundle\") on node \"ci-4459.2.3-n-801efb9c04\" DevicePath \"\"" Mar 7 00:48:42.278046 systemd[1]: Removed slice kubepods-besteffort-pod912e50ec_c7a9_464f_8c19_f8fe3bd6f8f8.slice - libcontainer container kubepods-besteffort-pod912e50ec_c7a9_464f_8c19_f8fe3bd6f8f8.slice. 
Mar 7 00:48:42.299936 kubelet[3434]: I0307 00:48:42.299633 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-szw7z" podStartSLOduration=5.112623793 podStartE2EDuration="34.299616414s" podCreationTimestamp="2026-03-07 00:48:08 +0000 UTC" firstStartedPulling="2026-03-07 00:48:09.092776454 +0000 UTC m=+26.080088610" lastFinishedPulling="2026-03-07 00:48:38.279769083 +0000 UTC m=+55.267081231" observedRunningTime="2026-03-07 00:48:42.298454618 +0000 UTC m=+59.285766814" watchObservedRunningTime="2026-03-07 00:48:42.299616414 +0000 UTC m=+59.286928562" Mar 7 00:48:42.375100 systemd[1]: Created slice kubepods-besteffort-pod43152dbd_fc58_4a4a_b5b7_84918b9513e4.slice - libcontainer container kubepods-besteffort-pod43152dbd_fc58_4a4a_b5b7_84918b9513e4.slice. Mar 7 00:48:42.469996 kubelet[3434]: I0307 00:48:42.469943 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/43152dbd-fc58-4a4a-b5b7-84918b9513e4-nginx-config\") pod \"whisker-6f59684f6f-jt4kk\" (UID: \"43152dbd-fc58-4a4a-b5b7-84918b9513e4\") " pod="calico-system/whisker-6f59684f6f-jt4kk" Mar 7 00:48:42.470346 kubelet[3434]: I0307 00:48:42.470253 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxw7\" (UniqueName: \"kubernetes.io/projected/43152dbd-fc58-4a4a-b5b7-84918b9513e4-kube-api-access-rtxw7\") pod \"whisker-6f59684f6f-jt4kk\" (UID: \"43152dbd-fc58-4a4a-b5b7-84918b9513e4\") " pod="calico-system/whisker-6f59684f6f-jt4kk" Mar 7 00:48:42.470346 kubelet[3434]: I0307 00:48:42.470281 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43152dbd-fc58-4a4a-b5b7-84918b9513e4-whisker-ca-bundle\") pod \"whisker-6f59684f6f-jt4kk\" (UID: \"43152dbd-fc58-4a4a-b5b7-84918b9513e4\") " 
pod="calico-system/whisker-6f59684f6f-jt4kk" Mar 7 00:48:42.470553 kubelet[3434]: I0307 00:48:42.470297 3434 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/43152dbd-fc58-4a4a-b5b7-84918b9513e4-whisker-backend-key-pair\") pod \"whisker-6f59684f6f-jt4kk\" (UID: \"43152dbd-fc58-4a4a-b5b7-84918b9513e4\") " pod="calico-system/whisker-6f59684f6f-jt4kk" Mar 7 00:48:42.687053 containerd[1905]: time="2026-03-07T00:48:42.686680136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f59684f6f-jt4kk,Uid:43152dbd-fc58-4a4a-b5b7-84918b9513e4,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:42.794632 systemd-networkd[1488]: cali81536c3fb13: Link UP Mar 7 00:48:42.795631 systemd-networkd[1488]: cali81536c3fb13: Gained carrier Mar 7 00:48:42.812784 containerd[1905]: 2026-03-07 00:48:42.706 [ERROR][4513] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:48:42.812784 containerd[1905]: 2026-03-07 00:48:42.717 [INFO][4513] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0 whisker-6f59684f6f- calico-system 43152dbd-fc58-4a4a-b5b7-84918b9513e4 964 0 2026-03-07 00:48:42 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f59684f6f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 whisker-6f59684f6f-jt4kk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali81536c3fb13 [] [] }} ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" 
WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-" Mar 7 00:48:42.812784 containerd[1905]: 2026-03-07 00:48:42.717 [INFO][4513] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.812784 containerd[1905]: 2026-03-07 00:48:42.735 [INFO][4527] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" HandleID="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Workload="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.740 [INFO][4527] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" HandleID="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Workload="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"whisker-6f59684f6f-jt4kk", "timestamp":"2026-03-07 00:48:42.735557342 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030af20)} Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.740 [INFO][4527] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.740 [INFO][4527] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.740 [INFO][4527] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.742 [INFO][4527] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.745 [INFO][4527] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.749 [INFO][4527] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.751 [INFO][4527] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.812980 containerd[1905]: 2026-03-07 00:48:42.753 [INFO][4527] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.753 [INFO][4527] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.754 [INFO][4527] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092 Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.761 [INFO][4527] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.765 [INFO][4527] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.36.1/26] block=192.168.36.0/26 handle="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.765 [INFO][4527] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.1/26] handle="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.765 [INFO][4527] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:42.813111 containerd[1905]: 2026-03-07 00:48:42.765 [INFO][4527] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.1/26] IPv6=[] ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" HandleID="k8s-pod-network.278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Workload="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.813223 containerd[1905]: 2026-03-07 00:48:42.767 [INFO][4513] cni-plugin/k8s.go 418: Populated endpoint ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0", GenerateName:"whisker-6f59684f6f-", Namespace:"calico-system", SelfLink:"", UID:"43152dbd-fc58-4a4a-b5b7-84918b9513e4", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f59684f6f", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"whisker-6f59684f6f-jt4kk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali81536c3fb13", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:42.813223 containerd[1905]: 2026-03-07 00:48:42.768 [INFO][4513] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.1/32] ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.813273 containerd[1905]: 2026-03-07 00:48:42.768 [INFO][4513] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali81536c3fb13 ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.813273 containerd[1905]: 2026-03-07 00:48:42.796 [INFO][4513] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.813304 containerd[1905]: 2026-03-07 00:48:42.796 [INFO][4513] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0", GenerateName:"whisker-6f59684f6f-", Namespace:"calico-system", SelfLink:"", UID:"43152dbd-fc58-4a4a-b5b7-84918b9513e4", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f59684f6f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092", Pod:"whisker-6f59684f6f-jt4kk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali81536c3fb13", MAC:"be:83:68:a1:cc:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:42.813334 containerd[1905]: 2026-03-07 00:48:42.810 [INFO][4513] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" Namespace="calico-system" Pod="whisker-6f59684f6f-jt4kk" 
WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-whisker--6f59684f6f--jt4kk-eth0" Mar 7 00:48:42.855821 containerd[1905]: time="2026-03-07T00:48:42.855773574Z" level=info msg="connecting to shim 278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092" address="unix:///run/containerd/s/dc9a584efeb826c7102474a18eebf8449ce012efc329ac0cbbc5efaab2e745f1" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:42.895687 systemd[1]: Started cri-containerd-278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092.scope - libcontainer container 278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092. Mar 7 00:48:42.978808 containerd[1905]: time="2026-03-07T00:48:42.978465437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f59684f6f-jt4kk,Uid:43152dbd-fc58-4a4a-b5b7-84918b9513e4,Namespace:calico-system,Attempt:0,} returns sandbox id \"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092\"" Mar 7 00:48:42.983277 containerd[1905]: time="2026-03-07T00:48:42.983165127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 00:48:43.120289 kubelet[3434]: I0307 00:48:43.120091 3434 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8" path="/var/lib/kubelet/pods/912e50ec-c7a9-464f-8c19-f8fe3bd6f8f8/volumes" Mar 7 00:48:43.553806 systemd-networkd[1488]: vxlan.calico: Link UP Mar 7 00:48:43.553813 systemd-networkd[1488]: vxlan.calico: Gained carrier Mar 7 00:48:44.218320 systemd-networkd[1488]: cali81536c3fb13: Gained IPv6LL Mar 7 00:48:44.267675 containerd[1905]: time="2026-03-07T00:48:44.267128527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:44.270190 containerd[1905]: time="2026-03-07T00:48:44.270156522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 7 00:48:44.273019 
containerd[1905]: time="2026-03-07T00:48:44.272993382Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:44.276393 containerd[1905]: time="2026-03-07T00:48:44.276358925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:44.277065 containerd[1905]: time="2026-03-07T00:48:44.276728867Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.293520923s" Mar 7 00:48:44.277065 containerd[1905]: time="2026-03-07T00:48:44.276757948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 7 00:48:44.284082 containerd[1905]: time="2026-03-07T00:48:44.284061682Z" level=info msg="CreateContainer within sandbox \"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 00:48:44.306712 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1649425662.mount: Deactivated successfully. 
Mar 7 00:48:44.308203 containerd[1905]: time="2026-03-07T00:48:44.307608503Z" level=info msg="Container 86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:44.323416 containerd[1905]: time="2026-03-07T00:48:44.323368149Z" level=info msg="CreateContainer within sandbox \"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca\"" Mar 7 00:48:44.325257 containerd[1905]: time="2026-03-07T00:48:44.325154512Z" level=info msg="StartContainer for \"86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca\"" Mar 7 00:48:44.327130 containerd[1905]: time="2026-03-07T00:48:44.327096474Z" level=info msg="connecting to shim 86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca" address="unix:///run/containerd/s/dc9a584efeb826c7102474a18eebf8449ce012efc329ac0cbbc5efaab2e745f1" protocol=ttrpc version=3 Mar 7 00:48:44.347328 systemd[1]: Started cri-containerd-86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca.scope - libcontainer container 86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca. Mar 7 00:48:44.379805 containerd[1905]: time="2026-03-07T00:48:44.379743279Z" level=info msg="StartContainer for \"86fda597fbca534f2342ef525f4308af5045f0f96e624f835fa07b5d666bebca\" returns successfully" Mar 7 00:48:44.381796 containerd[1905]: time="2026-03-07T00:48:44.381760476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 00:48:44.666329 systemd-networkd[1488]: vxlan.calico: Gained IPv6LL Mar 7 00:48:45.624813 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1706694455.mount: Deactivated successfully. 
Mar 7 00:48:45.676770 containerd[1905]: time="2026-03-07T00:48:45.676250442Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:45.678956 containerd[1905]: time="2026-03-07T00:48:45.678924047Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 7 00:48:45.682557 containerd[1905]: time="2026-03-07T00:48:45.682526592Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:45.687558 containerd[1905]: time="2026-03-07T00:48:45.687508421Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:45.687969 containerd[1905]: time="2026-03-07T00:48:45.687863322Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.306071125s" Mar 7 00:48:45.687969 containerd[1905]: time="2026-03-07T00:48:45.687890795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 7 00:48:45.695752 containerd[1905]: time="2026-03-07T00:48:45.695675715Z" level=info msg="CreateContainer within sandbox \"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 00:48:45.713395 
containerd[1905]: time="2026-03-07T00:48:45.713355041Z" level=info msg="Container fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:45.736748 containerd[1905]: time="2026-03-07T00:48:45.736705855Z" level=info msg="CreateContainer within sandbox \"278643bf68c1b939f90a0fc32ea2a66015202b58df09438f4f9cad29c3069092\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7\"" Mar 7 00:48:45.737346 containerd[1905]: time="2026-03-07T00:48:45.737309574Z" level=info msg="StartContainer for \"fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7\"" Mar 7 00:48:45.738108 containerd[1905]: time="2026-03-07T00:48:45.738078691Z" level=info msg="connecting to shim fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7" address="unix:///run/containerd/s/dc9a584efeb826c7102474a18eebf8449ce012efc329ac0cbbc5efaab2e745f1" protocol=ttrpc version=3 Mar 7 00:48:45.759352 systemd[1]: Started cri-containerd-fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7.scope - libcontainer container fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7. 
Mar 7 00:48:45.794992 containerd[1905]: time="2026-03-07T00:48:45.794951881Z" level=info msg="StartContainer for \"fbf715511d2f417c78a8ac7c398383345ffbe8b5cf2e5036bdf9e552b56f1de7\" returns successfully" Mar 7 00:48:46.301252 kubelet[3434]: I0307 00:48:46.300687 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6f59684f6f-jt4kk" podStartSLOduration=1.59468898 podStartE2EDuration="4.3006712s" podCreationTimestamp="2026-03-07 00:48:42 +0000 UTC" firstStartedPulling="2026-03-07 00:48:42.982684789 +0000 UTC m=+59.969996937" lastFinishedPulling="2026-03-07 00:48:45.688667009 +0000 UTC m=+62.675979157" observedRunningTime="2026-03-07 00:48:46.299504363 +0000 UTC m=+63.286816519" watchObservedRunningTime="2026-03-07 00:48:46.3006712 +0000 UTC m=+63.287983356" Mar 7 00:48:52.123585 containerd[1905]: time="2026-03-07T00:48:52.123533546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-n5q4d,Uid:231b476a-62e7-421f-b6e5-1c8fe696c6e7,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:52.130209 containerd[1905]: time="2026-03-07T00:48:52.130049647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-vv4jn,Uid:394e811f-ffeb-4807-804b-bb1ee6e95381,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:52.138011 containerd[1905]: time="2026-03-07T00:48:52.137981161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v9vbm,Uid:a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec,Namespace:kube-system,Attempt:0,}" Mar 7 00:48:52.271995 systemd-networkd[1488]: cali2b9de4d47cf: Link UP Mar 7 00:48:52.272175 systemd-networkd[1488]: cali2b9de4d47cf: Gained carrier Mar 7 00:48:52.291485 containerd[1905]: 2026-03-07 00:48:52.178 [INFO][4925] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0 goldmane-cccfbd5cf- calico-system 
231b476a-62e7-421f-b6e5-1c8fe696c6e7 909 0 2026-03-07 00:48:07 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 goldmane-cccfbd5cf-n5q4d eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2b9de4d47cf [] [] }} ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-" Mar 7 00:48:52.291485 containerd[1905]: 2026-03-07 00:48:52.179 [INFO][4925] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.291485 containerd[1905]: 2026-03-07 00:48:52.211 [INFO][4959] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" HandleID="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Workload="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.219 [INFO][4959] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" HandleID="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Workload="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbb90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"goldmane-cccfbd5cf-n5q4d", "timestamp":"2026-03-07 
00:48:52.211445826 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400024cdc0)} Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.219 [INFO][4959] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.219 [INFO][4959] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.219 [INFO][4959] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.221 [INFO][4959] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.226 [INFO][4959] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.232 [INFO][4959] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.234 [INFO][4959] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291684 containerd[1905]: 2026-03-07 00:48:52.238 [INFO][4959] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.238 [INFO][4959] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" host="ci-4459.2.3-n-801efb9c04" Mar 7 
00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.241 [INFO][4959] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2 Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.249 [INFO][4959] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4959] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.2/26] block=192.168.36.0/26 handle="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4959] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.2/26] handle="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4959] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 00:48:52.291814 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4959] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.2/26] IPv6=[] ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" HandleID="k8s-pod-network.815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Workload="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.292670 containerd[1905]: 2026-03-07 00:48:52.265 [INFO][4925] cni-plugin/k8s.go 418: Populated endpoint ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"231b476a-62e7-421f-b6e5-1c8fe696c6e7", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"goldmane-cccfbd5cf-n5q4d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"cali2b9de4d47cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.292670 containerd[1905]: 2026-03-07 00:48:52.265 [INFO][4925] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.2/32] ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.292849 containerd[1905]: 2026-03-07 00:48:52.265 [INFO][4925] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b9de4d47cf ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.292849 containerd[1905]: 2026-03-07 00:48:52.272 [INFO][4925] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.293031 containerd[1905]: 2026-03-07 00:48:52.275 [INFO][4925] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"231b476a-62e7-421f-b6e5-1c8fe696c6e7", 
ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2", Pod:"goldmane-cccfbd5cf-n5q4d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2b9de4d47cf", MAC:"b6:eb:75:91:ce:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.293078 containerd[1905]: 2026-03-07 00:48:52.288 [INFO][4925] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-n5q4d" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-goldmane--cccfbd5cf--n5q4d-eth0" Mar 7 00:48:52.338046 containerd[1905]: time="2026-03-07T00:48:52.337921364Z" level=info msg="connecting to shim 815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2" address="unix:///run/containerd/s/e833cd47c8cd72055125b0d46490ac32e3476946e35c64b0ffacb0ab7f0d3135" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:52.361539 systemd[1]: Started cri-containerd-815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2.scope - libcontainer container 
815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2. Mar 7 00:48:52.370368 systemd-networkd[1488]: caliea8f33b8799: Link UP Mar 7 00:48:52.372553 systemd-networkd[1488]: caliea8f33b8799: Gained carrier Mar 7 00:48:52.394599 containerd[1905]: 2026-03-07 00:48:52.200 [INFO][4929] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0 calico-apiserver-5fd9487db- calico-system 394e811f-ffeb-4807-804b-bb1ee6e95381 906 0 2026-03-07 00:48:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5fd9487db projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 calico-apiserver-5fd9487db-vv4jn eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] caliea8f33b8799 [] [] }} ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-" Mar 7 00:48:52.394599 containerd[1905]: 2026-03-07 00:48:52.200 [INFO][4929] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.394599 containerd[1905]: 2026-03-07 00:48:52.245 [INFO][4965] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" HandleID="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" 
Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.257 [INFO][4965] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" HandleID="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb7a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"calico-apiserver-5fd9487db-vv4jn", "timestamp":"2026-03-07 00:48:52.245133677 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030d8c0)} Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.257 [INFO][4965] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4965] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4965] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.321 [INFO][4965] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.326 [INFO][4965] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.334 [INFO][4965] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.336 [INFO][4965] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395118 containerd[1905]: 2026-03-07 00:48:52.338 [INFO][4965] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.339 [INFO][4965] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.340 [INFO][4965] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.349 [INFO][4965] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.359 [INFO][4965] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.36.3/26] block=192.168.36.0/26 handle="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.359 [INFO][4965] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.3/26] handle="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.360 [INFO][4965] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:52.395277 containerd[1905]: 2026-03-07 00:48:52.360 [INFO][4965] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.3/26] IPv6=[] ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" HandleID="k8s-pod-network.6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.395377 containerd[1905]: 2026-03-07 00:48:52.366 [INFO][4929] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0", GenerateName:"calico-apiserver-5fd9487db-", Namespace:"calico-system", SelfLink:"", UID:"394e811f-ffeb-4807-804b-bb1ee6e95381", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5fd9487db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"calico-apiserver-5fd9487db-vv4jn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliea8f33b8799", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.395416 containerd[1905]: 2026-03-07 00:48:52.366 [INFO][4929] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.3/32] ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.395416 containerd[1905]: 2026-03-07 00:48:52.366 [INFO][4929] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliea8f33b8799 ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.395416 containerd[1905]: 2026-03-07 00:48:52.372 [INFO][4929] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" 
WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.395461 containerd[1905]: 2026-03-07 00:48:52.372 [INFO][4929] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0", GenerateName:"calico-apiserver-5fd9487db-", Namespace:"calico-system", SelfLink:"", UID:"394e811f-ffeb-4807-804b-bb1ee6e95381", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5fd9487db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d", Pod:"calico-apiserver-5fd9487db-vv4jn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliea8f33b8799", MAC:"5e:e6:15:b0:36:09", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.395495 containerd[1905]: 2026-03-07 00:48:52.387 [INFO][4929] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-vv4jn" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--vv4jn-eth0" Mar 7 00:48:52.450043 containerd[1905]: time="2026-03-07T00:48:52.450000417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-n5q4d,Uid:231b476a-62e7-421f-b6e5-1c8fe696c6e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2\"" Mar 7 00:48:52.452314 containerd[1905]: time="2026-03-07T00:48:52.452284159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 00:48:52.471770 containerd[1905]: time="2026-03-07T00:48:52.471386053Z" level=info msg="connecting to shim 6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d" address="unix:///run/containerd/s/03f557bcb8a1ec65e520c1657853896aada9ed926e98a647fe71c76da90c4b77" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:52.482913 systemd-networkd[1488]: cali0b91c06e706: Link UP Mar 7 00:48:52.483792 systemd-networkd[1488]: cali0b91c06e706: Gained carrier Mar 7 00:48:52.508103 containerd[1905]: 2026-03-07 00:48:52.200 [INFO][4944] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0 coredns-66bc5c9577- kube-system a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec 904 0 2026-03-07 00:47:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 coredns-66bc5c9577-v9vbm eth0 coredns 
[] [] [kns.kube-system ksa.kube-system.coredns] cali0b91c06e706 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-" Mar 7 00:48:52.508103 containerd[1905]: 2026-03-07 00:48:52.201 [INFO][4944] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508103 containerd[1905]: 2026-03-07 00:48:52.246 [INFO][4967] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" HandleID="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4967] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" HandleID="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273a10), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"coredns-66bc5c9577-v9vbm", "timestamp":"2026-03-07 00:48:52.246657934 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000368dc0)} Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.263 [INFO][4967] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.360 [INFO][4967] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.360 [INFO][4967] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.425 [INFO][4967] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.434 [INFO][4967] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.440 [INFO][4967] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.443 [INFO][4967] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508329 containerd[1905]: 2026-03-07 00:48:52.446 [INFO][4967] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.446 [INFO][4967] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.447 [INFO][4967] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 
00:48:52.455 [INFO][4967] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.468 [INFO][4967] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.4/26] block=192.168.36.0/26 handle="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.468 [INFO][4967] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.4/26] handle="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.468 [INFO][4967] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:52.508466 containerd[1905]: 2026-03-07 00:48:52.468 [INFO][4967] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.4/26] IPv6=[] ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" HandleID="k8s-pod-network.f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508573 containerd[1905]: 2026-03-07 00:48:52.478 [INFO][4944] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec", 
ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 47, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"coredns-66bc5c9577-v9vbm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0b91c06e706", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.508573 containerd[1905]: 2026-03-07 00:48:52.478 [INFO][4944] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.4/32] ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" 
Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508573 containerd[1905]: 2026-03-07 00:48:52.478 [INFO][4944] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0b91c06e706 ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508573 containerd[1905]: 2026-03-07 00:48:52.484 [INFO][4944] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.508573 containerd[1905]: 2026-03-07 00:48:52.487 [INFO][4944] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 47, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e", Pod:"coredns-66bc5c9577-v9vbm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0b91c06e706", MAC:"fa:2e:9b:1e:55:71", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:52.508697 containerd[1905]: 2026-03-07 00:48:52.503 [INFO][4944] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" Namespace="kube-system" Pod="coredns-66bc5c9577-v9vbm" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--v9vbm-eth0" Mar 7 00:48:52.520480 systemd[1]: Started cri-containerd-6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d.scope - libcontainer container 
6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d. Mar 7 00:48:52.553306 containerd[1905]: time="2026-03-07T00:48:52.552833218Z" level=info msg="connecting to shim f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e" address="unix:///run/containerd/s/0c284190406b37f7a805a541d5a93196088890c8b8c7874cb8e70c7bc15dc48e" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:52.575567 containerd[1905]: time="2026-03-07T00:48:52.575517118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-vv4jn,Uid:394e811f-ffeb-4807-804b-bb1ee6e95381,Namespace:calico-system,Attempt:0,} returns sandbox id \"6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d\"" Mar 7 00:48:52.586369 systemd[1]: Started cri-containerd-f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e.scope - libcontainer container f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e. Mar 7 00:48:52.621853 containerd[1905]: time="2026-03-07T00:48:52.621805186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v9vbm,Uid:a7c3eef4-1dc3-4b1b-808a-fffb6d8090ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e\"" Mar 7 00:48:52.629443 containerd[1905]: time="2026-03-07T00:48:52.629401088Z" level=info msg="CreateContainer within sandbox \"f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:48:52.645756 containerd[1905]: time="2026-03-07T00:48:52.645239627Z" level=info msg="Container 93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:52.658805 containerd[1905]: time="2026-03-07T00:48:52.658757503Z" level=info msg="CreateContainer within sandbox \"f9ae78dbd5c3f0e493bb2c158b7baefaa0e6402a1818f834f33aabf4d787179e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931\"" Mar 7 00:48:52.659569 containerd[1905]: time="2026-03-07T00:48:52.659387367Z" level=info msg="StartContainer for \"93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931\"" Mar 7 00:48:52.660566 containerd[1905]: time="2026-03-07T00:48:52.660499369Z" level=info msg="connecting to shim 93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931" address="unix:///run/containerd/s/0c284190406b37f7a805a541d5a93196088890c8b8c7874cb8e70c7bc15dc48e" protocol=ttrpc version=3 Mar 7 00:48:52.684373 systemd[1]: Started cri-containerd-93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931.scope - libcontainer container 93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931. Mar 7 00:48:52.716445 containerd[1905]: time="2026-03-07T00:48:52.716402862Z" level=info msg="StartContainer for \"93c2fba195dee8712d51a122cef5cbcce76823f4e7998a286abada6f01994931\" returns successfully" Mar 7 00:48:53.125957 containerd[1905]: time="2026-03-07T00:48:53.125917206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s56zd,Uid:2d4fe93f-ae81-4273-9c62-18e31d256fca,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:53.132655 containerd[1905]: time="2026-03-07T00:48:53.132618801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rz88n,Uid:f8bdc5d6-1197-42d6-a80f-f36a8fb0cade,Namespace:kube-system,Attempt:0,}" Mar 7 00:48:53.244052 systemd-networkd[1488]: cali8be2cb503b9: Link UP Mar 7 00:48:53.244875 systemd-networkd[1488]: cali8be2cb503b9: Gained carrier Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.176 [INFO][5215] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0 csi-node-driver- calico-system 2d4fe93f-ae81-4273-9c62-18e31d256fca 737 0 2026-03-07 00:48:08 +0000 UTC map[app.kubernetes.io/name:csi-node-driver 
controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 csi-node-driver-s56zd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8be2cb503b9 [] [] }} ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.176 [INFO][5215] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.203 [INFO][5246] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" HandleID="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Workload="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.210 [INFO][5246] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" HandleID="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Workload="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbdc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"csi-node-driver-s56zd", "timestamp":"2026-03-07 00:48:53.203745795 +0000 UTC"}, 
Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400036d4a0)} Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.210 [INFO][5246] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.210 [INFO][5246] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.210 [INFO][5246] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.213 [INFO][5246] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.217 [INFO][5246] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.221 [INFO][5246] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.222 [INFO][5246] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.224 [INFO][5246] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.224 [INFO][5246] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 
2026-03-07 00:48:53.225 [INFO][5246] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5 Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.230 [INFO][5246] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5246] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.5/26] block=192.168.36.0/26 handle="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5246] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.5/26] handle="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5246] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 00:48:53.265597 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5246] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.5/26] IPv6=[] ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" HandleID="k8s-pod-network.6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Workload="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.240 [INFO][5215] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2d4fe93f-ae81-4273-9c62-18e31d256fca", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"csi-node-driver-s56zd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.5/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8be2cb503b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.240 [INFO][5215] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.5/32] ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.240 [INFO][5215] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8be2cb503b9 ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.244 [INFO][5215] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.245 [INFO][5215] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"2d4fe93f-ae81-4273-9c62-18e31d256fca", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5", Pod:"csi-node-driver-s56zd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8be2cb503b9", MAC:"c6:9d:3f:4c:24:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:53.266411 containerd[1905]: 2026-03-07 00:48:53.263 [INFO][5215] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" Namespace="calico-system" Pod="csi-node-driver-s56zd" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-csi--node--driver--s56zd-eth0" Mar 7 00:48:53.318426 containerd[1905]: time="2026-03-07T00:48:53.316695176Z" level=info msg="connecting to shim 6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5" address="unix:///run/containerd/s/2e9892e575af5815ea6e96e7583c3145c3d587596275d2d2a22fab2103f051dd" namespace=k8s.io protocol=ttrpc version=3 Mar 
7 00:48:53.353532 kubelet[3434]: I0307 00:48:53.352460 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-v9vbm" podStartSLOduration=64.352444296 podStartE2EDuration="1m4.352444296s" podCreationTimestamp="2026-03-07 00:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:48:53.330287567 +0000 UTC m=+70.317599795" watchObservedRunningTime="2026-03-07 00:48:53.352444296 +0000 UTC m=+70.339756444" Mar 7 00:48:53.364323 systemd[1]: Started cri-containerd-6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5.scope - libcontainer container 6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5. Mar 7 00:48:53.394067 systemd-networkd[1488]: cali26a8a41b19d: Link UP Mar 7 00:48:53.395831 systemd-networkd[1488]: cali26a8a41b19d: Gained carrier Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.180 [INFO][5225] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0 coredns-66bc5c9577- kube-system f8bdc5d6-1197-42d6-a80f-f36a8fb0cade 905 0 2026-03-07 00:47:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 coredns-66bc5c9577-rz88n eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali26a8a41b19d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.181 
[INFO][5225] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.206 [INFO][5251] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" HandleID="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.215 [INFO][5251] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" HandleID="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb4c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"coredns-66bc5c9577-rz88n", "timestamp":"2026-03-07 00:48:53.206523315 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000254dc0)} Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.215 [INFO][5251] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5251] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.238 [INFO][5251] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.317 [INFO][5251] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.334 [INFO][5251] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.350 [INFO][5251] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.359 [INFO][5251] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.369 [INFO][5251] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.369 [INFO][5251] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.371 [INFO][5251] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1 Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.375 [INFO][5251] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.385 [INFO][5251] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.36.6/26] block=192.168.36.0/26 handle="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.385 [INFO][5251] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.6/26] handle="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.385 [INFO][5251] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:53.415264 containerd[1905]: 2026-03-07 00:48:53.385 [INFO][5251] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.6/26] IPv6=[] ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" HandleID="k8s-pod-network.cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Workload="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415860 containerd[1905]: 2026-03-07 00:48:53.389 [INFO][5225] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f8bdc5d6-1197-42d6-a80f-f36a8fb0cade", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 47, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"coredns-66bc5c9577-rz88n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali26a8a41b19d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:53.415860 containerd[1905]: 2026-03-07 00:48:53.389 [INFO][5225] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.6/32] ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415860 containerd[1905]: 2026-03-07 00:48:53.389 [INFO][5225] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali26a8a41b19d 
ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415860 containerd[1905]: 2026-03-07 00:48:53.396 [INFO][5225] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.415860 containerd[1905]: 2026-03-07 00:48:53.397 [INFO][5225] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f8bdc5d6-1197-42d6-a80f-f36a8fb0cade", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 47, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1", 
Pod:"coredns-66bc5c9577-rz88n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali26a8a41b19d", MAC:"6e:9c:8f:07:0f:a6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:53.415977 containerd[1905]: 2026-03-07 00:48:53.412 [INFO][5225] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" Namespace="kube-system" Pod="coredns-66bc5c9577-rz88n" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-coredns--66bc5c9577--rz88n-eth0" Mar 7 00:48:53.428328 containerd[1905]: time="2026-03-07T00:48:53.428219000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s56zd,Uid:2d4fe93f-ae81-4273-9c62-18e31d256fca,Namespace:calico-system,Attempt:0,} returns sandbox id \"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5\"" Mar 7 00:48:53.463574 containerd[1905]: time="2026-03-07T00:48:53.463535911Z" level=info msg="connecting to shim cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1" 
address="unix:///run/containerd/s/539e141a55b12453ad691a18c56faf10afba6371c0006c9e713abc24ca02009b" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:53.486396 systemd[1]: Started cri-containerd-cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1.scope - libcontainer container cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1. Mar 7 00:48:53.527531 containerd[1905]: time="2026-03-07T00:48:53.527488051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rz88n,Uid:f8bdc5d6-1197-42d6-a80f-f36a8fb0cade,Namespace:kube-system,Attempt:0,} returns sandbox id \"cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1\"" Mar 7 00:48:53.539379 containerd[1905]: time="2026-03-07T00:48:53.539337584Z" level=info msg="CreateContainer within sandbox \"cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:48:53.556608 containerd[1905]: time="2026-03-07T00:48:53.556556368Z" level=info msg="Container bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:53.578765 containerd[1905]: time="2026-03-07T00:48:53.578716344Z" level=info msg="CreateContainer within sandbox \"cc402ab9b67996929b2cde4d098b83a69d5659975fdef18b0331a7cc7df013a1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360\"" Mar 7 00:48:53.580008 containerd[1905]: time="2026-03-07T00:48:53.579497838Z" level=info msg="StartContainer for \"bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360\"" Mar 7 00:48:53.580466 containerd[1905]: time="2026-03-07T00:48:53.580436369Z" level=info msg="connecting to shim bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360" address="unix:///run/containerd/s/539e141a55b12453ad691a18c56faf10afba6371c0006c9e713abc24ca02009b" protocol=ttrpc version=3 Mar 7 00:48:53.600372 
systemd[1]: Started cri-containerd-bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360.scope - libcontainer container bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360. Mar 7 00:48:53.626499 systemd-networkd[1488]: cali0b91c06e706: Gained IPv6LL Mar 7 00:48:53.629863 containerd[1905]: time="2026-03-07T00:48:53.629825465Z" level=info msg="StartContainer for \"bcd89f69b3b529a95b9d90ff5aa9d9d1af3724b5a6995cae5b21edf28a62f360\" returns successfully" Mar 7 00:48:53.946521 systemd-networkd[1488]: cali2b9de4d47cf: Gained IPv6LL Mar 7 00:48:54.129345 containerd[1905]: time="2026-03-07T00:48:54.129273086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7667f774f9-5sqxh,Uid:4deed308-3e60-4e75-bf96-2c78a050ad12,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:54.267863 systemd-networkd[1488]: caliea8f33b8799: Gained IPv6LL Mar 7 00:48:54.286988 systemd-networkd[1488]: calib978f9006d8: Link UP Mar 7 00:48:54.288785 systemd-networkd[1488]: calib978f9006d8: Gained carrier Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.189 [INFO][5444] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0 calico-kube-controllers-7667f774f9- calico-system 4deed308-3e60-4e75-bf96-2c78a050ad12 903 0 2026-03-07 00:48:09 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7667f774f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 calico-kube-controllers-7667f774f9-5sqxh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib978f9006d8 [] [] }} ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" 
Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.189 [INFO][5444] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.219 [INFO][5458] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" HandleID="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.228 [INFO][5458] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" HandleID="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb4c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"calico-kube-controllers-7667f774f9-5sqxh", "timestamp":"2026-03-07 00:48:54.219639546 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003bf340)} Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.228 [INFO][5458] ipam/ipam_plugin.go 438: About to acquire 
host-wide IPAM lock. Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.228 [INFO][5458] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.228 [INFO][5458] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.232 [INFO][5458] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.238 [INFO][5458] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.245 [INFO][5458] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.248 [INFO][5458] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.251 [INFO][5458] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.251 [INFO][5458] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.253 [INFO][5458] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7 Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.261 [INFO][5458] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" 
host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.272 [INFO][5458] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.7/26] block=192.168.36.0/26 handle="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.273 [INFO][5458] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.7/26] handle="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.274 [INFO][5458] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:54.307588 containerd[1905]: 2026-03-07 00:48:54.274 [INFO][5458] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.7/26] IPv6=[] ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" HandleID="k8s-pod-network.18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.280 [INFO][5444] cni-plugin/k8s.go 418: Populated endpoint ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0", GenerateName:"calico-kube-controllers-7667f774f9-", Namespace:"calico-system", SelfLink:"", UID:"4deed308-3e60-4e75-bf96-2c78a050ad12", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 9, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7667f774f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"calico-kube-controllers-7667f774f9-5sqxh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib978f9006d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.281 [INFO][5444] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.7/32] ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.281 [INFO][5444] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib978f9006d8 ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.291 [INFO][5444] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.292 [INFO][5444] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0", GenerateName:"calico-kube-controllers-7667f774f9-", Namespace:"calico-system", SelfLink:"", UID:"4deed308-3e60-4e75-bf96-2c78a050ad12", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7667f774f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7", Pod:"calico-kube-controllers-7667f774f9-5sqxh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.7/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib978f9006d8", MAC:"06:b9:c6:04:f0:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:54.308591 containerd[1905]: 2026-03-07 00:48:54.303 [INFO][5444] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" Namespace="calico-system" Pod="calico-kube-controllers-7667f774f9-5sqxh" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--kube--controllers--7667f774f9--5sqxh-eth0" Mar 7 00:48:54.342128 kubelet[3434]: I0307 00:48:54.342066 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-rz88n" podStartSLOduration=65.342046763 podStartE2EDuration="1m5.342046763s" podCreationTimestamp="2026-03-07 00:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:48:54.340008302 +0000 UTC m=+71.327320498" watchObservedRunningTime="2026-03-07 00:48:54.342046763 +0000 UTC m=+71.329358911" Mar 7 00:48:54.373532 containerd[1905]: time="2026-03-07T00:48:54.373485409Z" level=info msg="connecting to shim 18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7" address="unix:///run/containerd/s/80fdce625638fa95177b00d6980eec283f19ae91023ab66a6ccca0ea339a1e23" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:54.417779 systemd[1]: Started cri-containerd-18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7.scope - libcontainer container 18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7. 
Mar 7 00:48:54.481285 containerd[1905]: time="2026-03-07T00:48:54.481236331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7667f774f9-5sqxh,Uid:4deed308-3e60-4e75-bf96-2c78a050ad12,Namespace:calico-system,Attempt:0,} returns sandbox id \"18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7\"" Mar 7 00:48:54.714396 systemd-networkd[1488]: cali26a8a41b19d: Gained IPv6LL Mar 7 00:48:54.921238 containerd[1905]: time="2026-03-07T00:48:54.920786172Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:54.923547 containerd[1905]: time="2026-03-07T00:48:54.923414158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 7 00:48:54.926572 containerd[1905]: time="2026-03-07T00:48:54.926521915Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:54.931701 containerd[1905]: time="2026-03-07T00:48:54.931655852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:54.932315 containerd[1905]: time="2026-03-07T00:48:54.932096437Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.479389926s" Mar 7 00:48:54.932315 containerd[1905]: time="2026-03-07T00:48:54.932130742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference 
\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 7 00:48:54.934448 containerd[1905]: time="2026-03-07T00:48:54.934414204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:48:54.939635 containerd[1905]: time="2026-03-07T00:48:54.939601047Z" level=info msg="CreateContainer within sandbox \"815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 00:48:54.956869 containerd[1905]: time="2026-03-07T00:48:54.956822870Z" level=info msg="Container ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:54.973976 containerd[1905]: time="2026-03-07T00:48:54.973835061Z" level=info msg="CreateContainer within sandbox \"815927f6edc1847dff1a8d072205ede3bbc72c685655d706179be4b3406457f2\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55\"" Mar 7 00:48:54.975540 containerd[1905]: time="2026-03-07T00:48:54.975519173Z" level=info msg="StartContainer for \"ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55\"" Mar 7 00:48:54.976828 containerd[1905]: time="2026-03-07T00:48:54.976750603Z" level=info msg="connecting to shim ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55" address="unix:///run/containerd/s/e833cd47c8cd72055125b0d46490ac32e3476946e35c64b0ffacb0ab7f0d3135" protocol=ttrpc version=3 Mar 7 00:48:54.994355 systemd[1]: Started cri-containerd-ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55.scope - libcontainer container ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55. 
Mar 7 00:48:55.030821 containerd[1905]: time="2026-03-07T00:48:55.030781122Z" level=info msg="StartContainer for \"ee2f4cb65e424239a2559335fb307320f42423241e46d00fb5442e1d15cbcd55\" returns successfully" Mar 7 00:48:55.098397 systemd-networkd[1488]: cali8be2cb503b9: Gained IPv6LL Mar 7 00:48:55.134023 containerd[1905]: time="2026-03-07T00:48:55.133977481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-c7nk2,Uid:4f6c5938-63cd-4a96-b97e-7bc139a51dd3,Namespace:calico-system,Attempt:0,}" Mar 7 00:48:55.134762 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1903252330.mount: Deactivated successfully. Mar 7 00:48:55.235013 systemd-networkd[1488]: cali4b05252fac5: Link UP Mar 7 00:48:55.235146 systemd-networkd[1488]: cali4b05252fac5: Gained carrier Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.172 [INFO][5586] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0 calico-apiserver-5fd9487db- calico-system 4f6c5938-63cd-4a96-b97e-7bc139a51dd3 908 0 2026-03-07 00:48:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5fd9487db projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-801efb9c04 calico-apiserver-5fd9487db-c7nk2 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali4b05252fac5 [] [] }} ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.172 [INFO][5586] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.193 [INFO][5597] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" HandleID="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.199 [INFO][5597] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" HandleID="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273490), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-801efb9c04", "pod":"calico-apiserver-5fd9487db-c7nk2", "timestamp":"2026-03-07 00:48:55.19366218 +0000 UTC"}, Hostname:"ci-4459.2.3-n-801efb9c04", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e2f20)} Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.199 [INFO][5597] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.199 [INFO][5597] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.199 [INFO][5597] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-801efb9c04' Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.202 [INFO][5597] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.205 [INFO][5597] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.209 [INFO][5597] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.210 [INFO][5597] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.212 [INFO][5597] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.212 [INFO][5597] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.213 [INFO][5597] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.219 [INFO][5597] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.229 [INFO][5597] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.36.8/26] block=192.168.36.0/26 handle="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.229 [INFO][5597] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.8/26] handle="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" host="ci-4459.2.3-n-801efb9c04" Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.230 [INFO][5597] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:48:55.249888 containerd[1905]: 2026-03-07 00:48:55.230 [INFO][5597] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.8/26] IPv6=[] ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" HandleID="k8s-pod-network.326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Workload="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.232 [INFO][5586] cni-plugin/k8s.go 418: Populated endpoint ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0", GenerateName:"calico-apiserver-5fd9487db-", Namespace:"calico-system", SelfLink:"", UID:"4f6c5938-63cd-4a96-b97e-7bc139a51dd3", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5fd9487db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"", Pod:"calico-apiserver-5fd9487db-c7nk2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4b05252fac5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.232 [INFO][5586] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.8/32] ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.232 [INFO][5586] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4b05252fac5 ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.234 [INFO][5586] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" 
WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.234 [INFO][5586] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0", GenerateName:"calico-apiserver-5fd9487db-", Namespace:"calico-system", SelfLink:"", UID:"4f6c5938-63cd-4a96-b97e-7bc139a51dd3", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 48, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5fd9487db", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-801efb9c04", ContainerID:"326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f", Pod:"calico-apiserver-5fd9487db-c7nk2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4b05252fac5", MAC:"66:01:7a:e6:41:92", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:48:55.250836 containerd[1905]: 2026-03-07 00:48:55.247 [INFO][5586] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" Namespace="calico-system" Pod="calico-apiserver-5fd9487db-c7nk2" WorkloadEndpoint="ci--4459.2.3--n--801efb9c04-k8s-calico--apiserver--5fd9487db--c7nk2-eth0" Mar 7 00:48:55.288437 containerd[1905]: time="2026-03-07T00:48:55.288394004Z" level=info msg="connecting to shim 326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f" address="unix:///run/containerd/s/79aec1d2031c911a3ec726f8f8712e7ab2a67bc69f86c4b31b5ca4b3d0d26c18" namespace=k8s.io protocol=ttrpc version=3 Mar 7 00:48:55.308333 systemd[1]: Started cri-containerd-326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f.scope - libcontainer container 326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f. Mar 7 00:48:55.377394 containerd[1905]: time="2026-03-07T00:48:55.377014791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5fd9487db-c7nk2,Uid:4f6c5938-63cd-4a96-b97e-7bc139a51dd3,Namespace:calico-system,Attempt:0,} returns sandbox id \"326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f\"" Mar 7 00:48:55.379181 kubelet[3434]: I0307 00:48:55.378964 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-n5q4d" podStartSLOduration=45.897943477 podStartE2EDuration="48.378945536s" podCreationTimestamp="2026-03-07 00:48:07 +0000 UTC" firstStartedPulling="2026-03-07 00:48:52.452011652 +0000 UTC m=+69.439323800" lastFinishedPulling="2026-03-07 00:48:54.933013551 +0000 UTC m=+71.920325859" observedRunningTime="2026-03-07 00:48:55.350766973 +0000 UTC m=+72.338079137" watchObservedRunningTime="2026-03-07 00:48:55.378945536 +0000 UTC m=+72.366257732" Mar 7 00:48:55.482400 systemd-networkd[1488]: 
calib978f9006d8: Gained IPv6LL Mar 7 00:48:56.954553 systemd-networkd[1488]: cali4b05252fac5: Gained IPv6LL Mar 7 00:48:58.708967 containerd[1905]: time="2026-03-07T00:48:58.708891047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:58.711616 containerd[1905]: time="2026-03-07T00:48:58.711579549Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 7 00:48:58.757123 containerd[1905]: time="2026-03-07T00:48:58.757065816Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:58.821217 containerd[1905]: time="2026-03-07T00:48:58.820820166Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:48:58.821526 containerd[1905]: time="2026-03-07T00:48:58.821409397Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.886872852s" Mar 7 00:48:58.821526 containerd[1905]: time="2026-03-07T00:48:58.821438638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:48:58.823651 containerd[1905]: time="2026-03-07T00:48:58.823431537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 00:48:58.870872 containerd[1905]: time="2026-03-07T00:48:58.870833020Z" 
level=info msg="CreateContainer within sandbox \"6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:48:59.060048 containerd[1905]: time="2026-03-07T00:48:59.059434315Z" level=info msg="Container b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:48:59.173710 containerd[1905]: time="2026-03-07T00:48:59.173647552Z" level=info msg="CreateContainer within sandbox \"6797d636d3adee5a20541ea9ba302cf16ac721b1a94197946f6ca3cd64d4662d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5\"" Mar 7 00:48:59.174604 containerd[1905]: time="2026-03-07T00:48:59.174539970Z" level=info msg="StartContainer for \"b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5\"" Mar 7 00:48:59.175945 containerd[1905]: time="2026-03-07T00:48:59.175919990Z" level=info msg="connecting to shim b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5" address="unix:///run/containerd/s/03f557bcb8a1ec65e520c1657853896aada9ed926e98a647fe71c76da90c4b77" protocol=ttrpc version=3 Mar 7 00:48:59.195340 systemd[1]: Started cri-containerd-b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5.scope - libcontainer container b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5. 
Mar 7 00:48:59.266361 containerd[1905]: time="2026-03-07T00:48:59.266236114Z" level=info msg="StartContainer for \"b0b2544f1b3ca43cf67f5b8a2f00e182325df22f1dcd06582323dd3523f738b5\" returns successfully" Mar 7 00:48:59.356451 kubelet[3434]: I0307 00:48:59.356194 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5fd9487db-vv4jn" podStartSLOduration=47.110690801 podStartE2EDuration="53.35616528s" podCreationTimestamp="2026-03-07 00:48:06 +0000 UTC" firstStartedPulling="2026-03-07 00:48:52.577074833 +0000 UTC m=+69.564386981" lastFinishedPulling="2026-03-07 00:48:58.822549304 +0000 UTC m=+75.809861460" observedRunningTime="2026-03-07 00:48:59.35531504 +0000 UTC m=+76.342627260" watchObservedRunningTime="2026-03-07 00:48:59.35616528 +0000 UTC m=+76.343477428" Mar 7 00:49:00.345212 kubelet[3434]: I0307 00:49:00.344760 3434 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:49:01.564243 update_engine[1887]: I20260307 00:49:01.563652 1887 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 7 00:49:01.564243 update_engine[1887]: I20260307 00:49:01.563707 1887 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 7 00:49:01.564243 update_engine[1887]: I20260307 00:49:01.563911 1887 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 7 00:49:01.565887 update_engine[1887]: I20260307 00:49:01.565855 1887 omaha_request_params.cc:62] Current group set to stable Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566677 1887 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566702 1887 update_attempter.cc:643] Scheduling an action processor start. 
Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566721 1887 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566754 1887 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566812 1887 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566817 1887 omaha_request_action.cc:272] Request: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: Mar 7 00:49:01.566859 update_engine[1887]: I20260307 00:49:01.566821 1887 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:49:01.569427 update_engine[1887]: I20260307 00:49:01.569329 1887 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:49:01.570300 update_engine[1887]: I20260307 00:49:01.570271 1887 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 7 00:49:01.571259 locksmithd[1984]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 7 00:49:01.643125 update_engine[1887]: E20260307 00:49:01.643056 1887 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:49:01.643289 update_engine[1887]: I20260307 00:49:01.643177 1887 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 7 00:49:01.718829 containerd[1905]: time="2026-03-07T00:49:01.718767417Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:01.722659 containerd[1905]: time="2026-03-07T00:49:01.722621643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 7 00:49:01.767462 containerd[1905]: time="2026-03-07T00:49:01.767372986Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:01.814862 containerd[1905]: time="2026-03-07T00:49:01.814715595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:01.815514 containerd[1905]: time="2026-03-07T00:49:01.815484800Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 2.992019438s" Mar 7 00:49:01.815700 containerd[1905]: time="2026-03-07T00:49:01.815606013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference 
\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 7 00:49:01.818474 containerd[1905]: time="2026-03-07T00:49:01.818370413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 00:49:01.864090 containerd[1905]: time="2026-03-07T00:49:01.864044991Z" level=info msg="CreateContainer within sandbox \"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 00:49:02.069274 containerd[1905]: time="2026-03-07T00:49:02.067648910Z" level=info msg="Container 1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:49:02.211871 containerd[1905]: time="2026-03-07T00:49:02.211831018Z" level=info msg="CreateContainer within sandbox \"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876\"" Mar 7 00:49:02.212841 containerd[1905]: time="2026-03-07T00:49:02.212715220Z" level=info msg="StartContainer for \"1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876\"" Mar 7 00:49:02.213972 containerd[1905]: time="2026-03-07T00:49:02.213936586Z" level=info msg="connecting to shim 1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876" address="unix:///run/containerd/s/2e9892e575af5815ea6e96e7583c3145c3d587596275d2d2a22fab2103f051dd" protocol=ttrpc version=3 Mar 7 00:49:02.237348 systemd[1]: Started cri-containerd-1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876.scope - libcontainer container 1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876. 
Mar 7 00:49:02.294152 containerd[1905]: time="2026-03-07T00:49:02.294049356Z" level=info msg="StartContainer for \"1aa061741e82dfd3e82cc7270891330be9cff46f74e0d023e56e405072e42876\" returns successfully" Mar 7 00:49:11.570716 update_engine[1887]: I20260307 00:49:11.570642 1887 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:49:11.571649 update_engine[1887]: I20260307 00:49:11.571180 1887 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:49:11.571649 update_engine[1887]: I20260307 00:49:11.571595 1887 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 00:49:11.604013 update_engine[1887]: E20260307 00:49:11.603827 1887 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:49:11.604013 update_engine[1887]: I20260307 00:49:11.603963 1887 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 7 00:49:11.812771 containerd[1905]: time="2026-03-07T00:49:11.812713619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:11.856901 containerd[1905]: time="2026-03-07T00:49:11.856587474Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 7 00:49:11.859658 containerd[1905]: time="2026-03-07T00:49:11.859493757Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:11.920826 containerd[1905]: time="2026-03-07T00:49:11.920755448Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:11.921346 containerd[1905]: time="2026-03-07T00:49:11.921302180Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 10.102646204s" Mar 7 00:49:11.921346 containerd[1905]: time="2026-03-07T00:49:11.921335965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 7 00:49:11.922696 containerd[1905]: time="2026-03-07T00:49:11.922649526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:49:12.016692 containerd[1905]: time="2026-03-07T00:49:12.016650736Z" level=info msg="CreateContainer within sandbox \"18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 00:49:12.165816 containerd[1905]: time="2026-03-07T00:49:12.165722316Z" level=info msg="Container 8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:49:12.368583 containerd[1905]: time="2026-03-07T00:49:12.368483442Z" level=info msg="CreateContainer within sandbox \"18991b326b0ef4b73eeb62388f41c33794c8d7067c67bc6e205af430f65a13b7\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac\"" Mar 7 00:49:12.369104 containerd[1905]: time="2026-03-07T00:49:12.369078232Z" level=info msg="StartContainer for \"8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac\"" Mar 7 00:49:12.414197 containerd[1905]: time="2026-03-07T00:49:12.414136729Z" level=info msg="connecting to shim 8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac" 
address="unix:///run/containerd/s/80fdce625638fa95177b00d6980eec283f19ae91023ab66a6ccca0ea339a1e23" protocol=ttrpc version=3 Mar 7 00:49:12.450336 systemd[1]: Started cri-containerd-8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac.scope - libcontainer container 8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac. Mar 7 00:49:12.512767 containerd[1905]: time="2026-03-07T00:49:12.512708804Z" level=info msg="StartContainer for \"8ea4d449b0a552655260f2684dbda539f9d3f265fb1ca4fb16103edb9ecfb2ac\" returns successfully" Mar 7 00:49:12.920233 containerd[1905]: time="2026-03-07T00:49:12.919995730Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:12.922685 containerd[1905]: time="2026-03-07T00:49:12.922605614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 7 00:49:12.924205 containerd[1905]: time="2026-03-07T00:49:12.924146729Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 1.001470274s" Mar 7 00:49:12.924403 containerd[1905]: time="2026-03-07T00:49:12.924303511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:49:12.925864 containerd[1905]: time="2026-03-07T00:49:12.925733437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 00:49:13.013884 containerd[1905]: time="2026-03-07T00:49:13.013838185Z" level=info msg="CreateContainer within sandbox 
\"326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:49:13.158454 containerd[1905]: time="2026-03-07T00:49:13.158405032Z" level=info msg="Container c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:49:13.321828 containerd[1905]: time="2026-03-07T00:49:13.321779422Z" level=info msg="CreateContainer within sandbox \"326c26ba1cb3164c2cfb1e9a3f4bf096550a5b7a7701d0f7e9c954e94168152f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073\"" Mar 7 00:49:13.322602 containerd[1905]: time="2026-03-07T00:49:13.322577092Z" level=info msg="StartContainer for \"c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073\"" Mar 7 00:49:13.324328 containerd[1905]: time="2026-03-07T00:49:13.324277237Z" level=info msg="connecting to shim c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073" address="unix:///run/containerd/s/79aec1d2031c911a3ec726f8f8712e7ab2a67bc69f86c4b31b5ca4b3d0d26c18" protocol=ttrpc version=3 Mar 7 00:49:13.347375 systemd[1]: Started cri-containerd-c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073.scope - libcontainer container c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073. 
Mar 7 00:49:13.421670 containerd[1905]: time="2026-03-07T00:49:13.421624338Z" level=info msg="StartContainer for \"c1ea5095edc88a018e4cc9aa198576cb93a37e1c3d7d41323f17f787a583c073\" returns successfully" Mar 7 00:49:13.476111 kubelet[3434]: I0307 00:49:13.475985 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7667f774f9-5sqxh" podStartSLOduration=47.037073984 podStartE2EDuration="1m4.475969709s" podCreationTimestamp="2026-03-07 00:48:09 +0000 UTC" firstStartedPulling="2026-03-07 00:48:54.483209029 +0000 UTC m=+71.470521177" lastFinishedPulling="2026-03-07 00:49:11.922104754 +0000 UTC m=+88.909416902" observedRunningTime="2026-03-07 00:49:13.403852028 +0000 UTC m=+90.391164176" watchObservedRunningTime="2026-03-07 00:49:13.475969709 +0000 UTC m=+90.463281857" Mar 7 00:49:15.169577 containerd[1905]: time="2026-03-07T00:49:15.169052719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:15.214639 containerd[1905]: time="2026-03-07T00:49:15.214596689Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 7 00:49:15.260439 containerd[1905]: time="2026-03-07T00:49:15.260367541Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:15.267234 containerd[1905]: time="2026-03-07T00:49:15.266341449Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:49:15.267820 containerd[1905]: time="2026-03-07T00:49:15.267789728Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with 
image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 2.342026442s" Mar 7 00:49:15.268087 containerd[1905]: time="2026-03-07T00:49:15.268070475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 7 00:49:16.120281 containerd[1905]: time="2026-03-07T00:49:16.119034149Z" level=info msg="CreateContainer within sandbox \"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 00:49:16.140878 kubelet[3434]: I0307 00:49:16.140810 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5fd9487db-c7nk2" podStartSLOduration=52.596198 podStartE2EDuration="1m10.140785779s" podCreationTimestamp="2026-03-07 00:48:06 +0000 UTC" firstStartedPulling="2026-03-07 00:48:55.380638303 +0000 UTC m=+72.367950451" lastFinishedPulling="2026-03-07 00:49:12.925226074 +0000 UTC m=+89.912538230" observedRunningTime="2026-03-07 00:49:14.413059519 +0000 UTC m=+91.400371683" watchObservedRunningTime="2026-03-07 00:49:16.140785779 +0000 UTC m=+93.128097927" Mar 7 00:49:16.281196 containerd[1905]: time="2026-03-07T00:49:16.280367132Z" level=info msg="Container 14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a: CDI devices from CRI Config.CDIDevices: []" Mar 7 00:49:16.284612 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3256396751.mount: Deactivated successfully. 
Mar 7 00:49:16.297292 containerd[1905]: time="2026-03-07T00:49:16.297249345Z" level=info msg="CreateContainer within sandbox \"6220fd9bf1afdb89d835df2cc2e52a11db653a9a666b9dedfd88f2e293f7b0b5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a\"" Mar 7 00:49:16.298008 containerd[1905]: time="2026-03-07T00:49:16.297833031Z" level=info msg="StartContainer for \"14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a\"" Mar 7 00:49:16.299991 containerd[1905]: time="2026-03-07T00:49:16.299898902Z" level=info msg="connecting to shim 14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a" address="unix:///run/containerd/s/2e9892e575af5815ea6e96e7583c3145c3d587596275d2d2a22fab2103f051dd" protocol=ttrpc version=3 Mar 7 00:49:16.320330 systemd[1]: Started cri-containerd-14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a.scope - libcontainer container 14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a. 
Mar 7 00:49:16.373600 containerd[1905]: time="2026-03-07T00:49:16.373492040Z" level=info msg="StartContainer for \"14afe99713a4e580caf930e485da79558ddcff889ce90a5485f25e3e9555b27a\" returns successfully" Mar 7 00:49:16.423505 kubelet[3434]: I0307 00:49:16.423449 3434 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-s56zd" podStartSLOduration=46.585477239 podStartE2EDuration="1m8.423434267s" podCreationTimestamp="2026-03-07 00:48:08 +0000 UTC" firstStartedPulling="2026-03-07 00:48:53.430938454 +0000 UTC m=+70.418250610" lastFinishedPulling="2026-03-07 00:49:15.26889549 +0000 UTC m=+92.256207638" observedRunningTime="2026-03-07 00:49:16.42231924 +0000 UTC m=+93.409631388" watchObservedRunningTime="2026-03-07 00:49:16.423434267 +0000 UTC m=+93.410746415" Mar 7 00:49:17.218051 kubelet[3434]: I0307 00:49:17.218007 3434 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 00:49:17.220496 kubelet[3434]: I0307 00:49:17.220471 3434 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 00:49:21.565373 update_engine[1887]: I20260307 00:49:21.565290 1887 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:49:21.565870 update_engine[1887]: I20260307 00:49:21.565393 1887 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:49:21.565870 update_engine[1887]: I20260307 00:49:21.565773 1887 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 7 00:49:21.578760 update_engine[1887]: E20260307 00:49:21.578708 1887 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:49:21.578864 update_engine[1887]: I20260307 00:49:21.578814 1887 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 7 00:49:31.060134 systemd[1]: Started sshd@7-10.200.20.26:22-10.200.16.10:34962.service - OpenSSH per-connection server daemon (10.200.16.10:34962). Mar 7 00:49:31.485081 sshd[6084]: Accepted publickey for core from 10.200.16.10 port 34962 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:31.489261 sshd-session[6084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:31.496247 systemd-logind[1883]: New session 10 of user core. Mar 7 00:49:31.502316 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 00:49:31.562815 update_engine[1887]: I20260307 00:49:31.562278 1887 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:49:31.562815 update_engine[1887]: I20260307 00:49:31.562374 1887 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:49:31.562815 update_engine[1887]: I20260307 00:49:31.562762 1887 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 00:49:31.663419 update_engine[1887]: E20260307 00:49:31.663211 1887 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663733 1887 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663753 1887 omaha_request_action.cc:617] Omaha request response: Mar 7 00:49:31.664788 update_engine[1887]: E20260307 00:49:31.663836 1887 omaha_request_action.cc:636] Omaha request network transfer failed. 
Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663851 1887 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663856 1887 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663861 1887 update_attempter.cc:306] Processing Done. Mar 7 00:49:31.664788 update_engine[1887]: E20260307 00:49:31.663873 1887 update_attempter.cc:619] Update failed. Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663877 1887 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663882 1887 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663885 1887 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663950 1887 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663970 1887 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 7 00:49:31.664788 update_engine[1887]: I20260307 00:49:31.663975 1887 omaha_request_action.cc:272] Request: Mar 7 00:49:31.664788 update_engine[1887]: Mar 7 00:49:31.664788 update_engine[1887]: Mar 7 00:49:31.664788 update_engine[1887]: Mar 7 00:49:31.665286 update_engine[1887]: Mar 7 00:49:31.665286 update_engine[1887]: Mar 7 00:49:31.665286 update_engine[1887]: Mar 7 00:49:31.665286 update_engine[1887]: I20260307 00:49:31.663978 1887 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:49:31.665286 update_engine[1887]: I20260307 00:49:31.663996 1887 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:49:31.665286 update_engine[1887]: I20260307 00:49:31.664280 1887 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 7 00:49:31.665612 locksmithd[1984]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 7 00:49:31.766629 update_engine[1887]: E20260307 00:49:31.766345 1887 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766425 1887 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766431 1887 omaha_request_action.cc:617] Omaha request response: Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766438 1887 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766443 1887 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766446 1887 update_attempter.cc:306] Processing Done. Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766450 1887 update_attempter.cc:310] Error event sent. Mar 7 00:49:31.766629 update_engine[1887]: I20260307 00:49:31.766459 1887 update_check_scheduler.cc:74] Next update check in 40m6s Mar 7 00:49:31.766842 locksmithd[1984]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 7 00:49:31.800969 sshd[6087]: Connection closed by 10.200.16.10 port 34962 Mar 7 00:49:31.801807 sshd-session[6084]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:31.807743 systemd[1]: sshd@7-10.200.20.26:22-10.200.16.10:34962.service: Deactivated successfully. Mar 7 00:49:31.807998 systemd-logind[1883]: Session 10 logged out. Waiting for processes to exit. Mar 7 00:49:31.809900 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 00:49:31.814010 systemd-logind[1883]: Removed session 10. 
Mar 7 00:49:34.322734 kubelet[3434]: I0307 00:49:34.322640 3434 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:49:36.892064 systemd[1]: Started sshd@8-10.200.20.26:22-10.200.16.10:34970.service - OpenSSH per-connection server daemon (10.200.16.10:34970). Mar 7 00:49:37.315250 sshd[6110]: Accepted publickey for core from 10.200.16.10 port 34970 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:37.316704 sshd-session[6110]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:37.320473 systemd-logind[1883]: New session 11 of user core. Mar 7 00:49:37.328451 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 00:49:37.608814 sshd[6113]: Connection closed by 10.200.16.10 port 34970 Mar 7 00:49:37.608648 sshd-session[6110]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:37.612425 systemd[1]: sshd@8-10.200.20.26:22-10.200.16.10:34970.service: Deactivated successfully. Mar 7 00:49:37.613919 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 00:49:37.614596 systemd-logind[1883]: Session 11 logged out. Waiting for processes to exit. Mar 7 00:49:37.615799 systemd-logind[1883]: Removed session 11. Mar 7 00:49:42.697026 systemd[1]: Started sshd@9-10.200.20.26:22-10.200.16.10:45436.service - OpenSSH per-connection server daemon (10.200.16.10:45436). Mar 7 00:49:43.117249 sshd[6127]: Accepted publickey for core from 10.200.16.10 port 45436 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:43.119910 sshd-session[6127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:43.124235 systemd-logind[1883]: New session 12 of user core. Mar 7 00:49:43.130351 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 7 00:49:43.426026 sshd[6132]: Connection closed by 10.200.16.10 port 45436 Mar 7 00:49:43.426962 sshd-session[6127]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:43.430980 systemd[1]: sshd@9-10.200.20.26:22-10.200.16.10:45436.service: Deactivated successfully. Mar 7 00:49:43.432778 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 00:49:43.433556 systemd-logind[1883]: Session 12 logged out. Waiting for processes to exit. Mar 7 00:49:43.435549 systemd-logind[1883]: Removed session 12. Mar 7 00:49:48.516633 systemd[1]: Started sshd@10-10.200.20.26:22-10.200.16.10:45446.service - OpenSSH per-connection server daemon (10.200.16.10:45446). Mar 7 00:49:48.944122 sshd[6211]: Accepted publickey for core from 10.200.16.10 port 45446 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:48.945345 sshd-session[6211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:48.949203 systemd-logind[1883]: New session 13 of user core. Mar 7 00:49:48.954311 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 00:49:49.227912 sshd[6214]: Connection closed by 10.200.16.10 port 45446 Mar 7 00:49:49.227596 sshd-session[6211]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:49.232327 systemd-logind[1883]: Session 13 logged out. Waiting for processes to exit. Mar 7 00:49:49.232930 systemd[1]: sshd@10-10.200.20.26:22-10.200.16.10:45446.service: Deactivated successfully. Mar 7 00:49:49.234490 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 00:49:49.235785 systemd-logind[1883]: Removed session 13. Mar 7 00:49:49.319629 systemd[1]: Started sshd@11-10.200.20.26:22-10.200.16.10:45448.service - OpenSSH per-connection server daemon (10.200.16.10:45448). 
Mar 7 00:49:49.735644 sshd[6227]: Accepted publickey for core from 10.200.16.10 port 45448 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:49.736738 sshd-session[6227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:49.740806 systemd-logind[1883]: New session 14 of user core. Mar 7 00:49:49.748328 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 00:49:50.071815 sshd[6230]: Connection closed by 10.200.16.10 port 45448 Mar 7 00:49:50.071658 sshd-session[6227]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:50.075806 systemd[1]: sshd@11-10.200.20.26:22-10.200.16.10:45448.service: Deactivated successfully. Mar 7 00:49:50.079000 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 00:49:50.079966 systemd-logind[1883]: Session 14 logged out. Waiting for processes to exit. Mar 7 00:49:50.081564 systemd-logind[1883]: Removed session 14. Mar 7 00:49:50.161343 systemd[1]: Started sshd@12-10.200.20.26:22-10.200.16.10:58822.service - OpenSSH per-connection server daemon (10.200.16.10:58822). Mar 7 00:49:50.574442 sshd[6239]: Accepted publickey for core from 10.200.16.10 port 58822 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:50.575559 sshd-session[6239]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:50.579712 systemd-logind[1883]: New session 15 of user core. Mar 7 00:49:50.585430 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 7 00:49:50.852540 sshd[6242]: Connection closed by 10.200.16.10 port 58822 Mar 7 00:49:50.853086 sshd-session[6239]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:50.857421 systemd[1]: sshd@12-10.200.20.26:22-10.200.16.10:58822.service: Deactivated successfully. Mar 7 00:49:50.859326 systemd[1]: session-15.scope: Deactivated successfully. Mar 7 00:49:50.860309 systemd-logind[1883]: Session 15 logged out. 
Waiting for processes to exit. Mar 7 00:49:50.861968 systemd-logind[1883]: Removed session 15. Mar 7 00:49:55.941891 systemd[1]: Started sshd@13-10.200.20.26:22-10.200.16.10:58828.service - OpenSSH per-connection server daemon (10.200.16.10:58828). Mar 7 00:49:56.365146 sshd[6256]: Accepted publickey for core from 10.200.16.10 port 58828 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:56.366671 sshd-session[6256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:56.373338 systemd-logind[1883]: New session 16 of user core. Mar 7 00:49:56.377344 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 00:49:56.646315 sshd[6280]: Connection closed by 10.200.16.10 port 58828 Mar 7 00:49:56.646988 sshd-session[6256]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:56.652015 systemd[1]: sshd@13-10.200.20.26:22-10.200.16.10:58828.service: Deactivated successfully. Mar 7 00:49:56.654511 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 00:49:56.659350 systemd-logind[1883]: Session 16 logged out. Waiting for processes to exit. Mar 7 00:49:56.660456 systemd-logind[1883]: Removed session 16. Mar 7 00:49:56.734089 systemd[1]: Started sshd@14-10.200.20.26:22-10.200.16.10:58840.service - OpenSSH per-connection server daemon (10.200.16.10:58840). Mar 7 00:49:57.151492 sshd[6291]: Accepted publickey for core from 10.200.16.10 port 58840 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:57.152615 sshd-session[6291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:57.156634 systemd-logind[1883]: New session 17 of user core. Mar 7 00:49:57.163342 systemd[1]: Started session-17.scope - Session 17 of User core. 
Mar 7 00:49:57.536228 sshd[6294]: Connection closed by 10.200.16.10 port 58840 Mar 7 00:49:57.560951 sshd-session[6291]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:57.566241 systemd[1]: sshd@14-10.200.20.26:22-10.200.16.10:58840.service: Deactivated successfully. Mar 7 00:49:57.570379 systemd[1]: session-17.scope: Deactivated successfully. Mar 7 00:49:57.571352 systemd-logind[1883]: Session 17 logged out. Waiting for processes to exit. Mar 7 00:49:57.573276 systemd-logind[1883]: Removed session 17. Mar 7 00:49:57.622956 systemd[1]: Started sshd@15-10.200.20.26:22-10.200.16.10:58844.service - OpenSSH per-connection server daemon (10.200.16.10:58844). Mar 7 00:49:58.045532 sshd[6304]: Accepted publickey for core from 10.200.16.10 port 58844 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:58.046908 sshd-session[6304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:58.051084 systemd-logind[1883]: New session 18 of user core. Mar 7 00:49:58.058328 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 7 00:49:58.822559 sshd[6307]: Connection closed by 10.200.16.10 port 58844 Mar 7 00:49:58.822931 sshd-session[6304]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:58.826506 systemd[1]: sshd@15-10.200.20.26:22-10.200.16.10:58844.service: Deactivated successfully. Mar 7 00:49:58.829571 systemd[1]: session-18.scope: Deactivated successfully. Mar 7 00:49:58.831557 systemd-logind[1883]: Session 18 logged out. Waiting for processes to exit. Mar 7 00:49:58.832916 systemd-logind[1883]: Removed session 18. Mar 7 00:49:58.917299 systemd[1]: Started sshd@16-10.200.20.26:22-10.200.16.10:58846.service - OpenSSH per-connection server daemon (10.200.16.10:58846). 
Mar 7 00:49:59.334946 sshd[6338]: Accepted publickey for core from 10.200.16.10 port 58846 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:49:59.335765 sshd-session[6338]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:49:59.339431 systemd-logind[1883]: New session 19 of user core. Mar 7 00:49:59.348326 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 7 00:49:59.691992 sshd[6341]: Connection closed by 10.200.16.10 port 58846 Mar 7 00:49:59.691808 sshd-session[6338]: pam_unix(sshd:session): session closed for user core Mar 7 00:49:59.695701 systemd[1]: sshd@16-10.200.20.26:22-10.200.16.10:58846.service: Deactivated successfully. Mar 7 00:49:59.699072 systemd[1]: session-19.scope: Deactivated successfully. Mar 7 00:49:59.701177 systemd-logind[1883]: Session 19 logged out. Waiting for processes to exit. Mar 7 00:49:59.703542 systemd-logind[1883]: Removed session 19. Mar 7 00:49:59.779053 systemd[1]: Started sshd@17-10.200.20.26:22-10.200.16.10:58852.service - OpenSSH per-connection server daemon (10.200.16.10:58852). Mar 7 00:50:00.195146 sshd[6353]: Accepted publickey for core from 10.200.16.10 port 58852 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:00.196328 sshd-session[6353]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:00.200423 systemd-logind[1883]: New session 20 of user core. Mar 7 00:50:00.207326 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 7 00:50:00.523954 sshd[6356]: Connection closed by 10.200.16.10 port 58852 Mar 7 00:50:00.524547 sshd-session[6353]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:00.527745 systemd[1]: sshd@17-10.200.20.26:22-10.200.16.10:58852.service: Deactivated successfully. Mar 7 00:50:00.529427 systemd[1]: session-20.scope: Deactivated successfully. Mar 7 00:50:00.530103 systemd-logind[1883]: Session 20 logged out. 
Waiting for processes to exit. Mar 7 00:50:00.531834 systemd-logind[1883]: Removed session 20. Mar 7 00:50:05.622373 systemd[1]: Started sshd@18-10.200.20.26:22-10.200.16.10:40246.service - OpenSSH per-connection server daemon (10.200.16.10:40246). Mar 7 00:50:06.041858 sshd[6400]: Accepted publickey for core from 10.200.16.10 port 40246 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:06.043019 sshd-session[6400]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:06.046982 systemd-logind[1883]: New session 21 of user core. Mar 7 00:50:06.056326 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 7 00:50:06.321308 sshd[6405]: Connection closed by 10.200.16.10 port 40246 Mar 7 00:50:06.322901 sshd-session[6400]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:06.326278 systemd[1]: sshd@18-10.200.20.26:22-10.200.16.10:40246.service: Deactivated successfully. Mar 7 00:50:06.328287 systemd[1]: session-21.scope: Deactivated successfully. Mar 7 00:50:06.329370 systemd-logind[1883]: Session 21 logged out. Waiting for processes to exit. Mar 7 00:50:06.330724 systemd-logind[1883]: Removed session 21. Mar 7 00:50:11.411803 systemd[1]: Started sshd@19-10.200.20.26:22-10.200.16.10:34384.service - OpenSSH per-connection server daemon (10.200.16.10:34384). Mar 7 00:50:11.829149 sshd[6418]: Accepted publickey for core from 10.200.16.10 port 34384 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:11.830405 sshd-session[6418]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:11.833985 systemd-logind[1883]: New session 22 of user core. Mar 7 00:50:11.842331 systemd[1]: Started session-22.scope - Session 22 of User core. 
Mar 7 00:50:12.109586 sshd[6421]: Connection closed by 10.200.16.10 port 34384 Mar 7 00:50:12.110421 sshd-session[6418]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:12.114096 systemd-logind[1883]: Session 22 logged out. Waiting for processes to exit. Mar 7 00:50:12.114364 systemd[1]: sshd@19-10.200.20.26:22-10.200.16.10:34384.service: Deactivated successfully. Mar 7 00:50:12.116880 systemd[1]: session-22.scope: Deactivated successfully. Mar 7 00:50:12.118706 systemd-logind[1883]: Removed session 22. Mar 7 00:50:17.198326 systemd[1]: Started sshd@20-10.200.20.26:22-10.200.16.10:34392.service - OpenSSH per-connection server daemon (10.200.16.10:34392). Mar 7 00:50:17.624902 sshd[6523]: Accepted publickey for core from 10.200.16.10 port 34392 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:17.625686 sshd-session[6523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:17.629806 systemd-logind[1883]: New session 23 of user core. Mar 7 00:50:17.633484 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 7 00:50:17.898396 sshd[6526]: Connection closed by 10.200.16.10 port 34392 Mar 7 00:50:17.898935 sshd-session[6523]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:17.903001 systemd[1]: sshd@20-10.200.20.26:22-10.200.16.10:34392.service: Deactivated successfully. Mar 7 00:50:17.905179 systemd[1]: session-23.scope: Deactivated successfully. Mar 7 00:50:17.906358 systemd-logind[1883]: Session 23 logged out. Waiting for processes to exit. Mar 7 00:50:17.909056 systemd-logind[1883]: Removed session 23. Mar 7 00:50:22.989413 systemd[1]: Started sshd@21-10.200.20.26:22-10.200.16.10:43264.service - OpenSSH per-connection server daemon (10.200.16.10:43264). 
Mar 7 00:50:23.404719 sshd[6559]: Accepted publickey for core from 10.200.16.10 port 43264 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:23.406537 sshd-session[6559]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:23.410915 systemd-logind[1883]: New session 24 of user core. Mar 7 00:50:23.417332 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 7 00:50:23.681920 sshd[6562]: Connection closed by 10.200.16.10 port 43264 Mar 7 00:50:23.681116 sshd-session[6559]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:23.684468 systemd-logind[1883]: Session 24 logged out. Waiting for processes to exit. Mar 7 00:50:23.684610 systemd[1]: sshd@21-10.200.20.26:22-10.200.16.10:43264.service: Deactivated successfully. Mar 7 00:50:23.686556 systemd[1]: session-24.scope: Deactivated successfully. Mar 7 00:50:23.688940 systemd-logind[1883]: Removed session 24. Mar 7 00:50:28.781354 systemd[1]: Started sshd@22-10.200.20.26:22-10.200.16.10:43276.service - OpenSSH per-connection server daemon (10.200.16.10:43276). Mar 7 00:50:29.204678 sshd[6594]: Accepted publickey for core from 10.200.16.10 port 43276 ssh2: RSA SHA256:JE8kgEbSicgM9iPPcpD9A3ndRLJ1370afumEFyydKJ0 Mar 7 00:50:29.205878 sshd-session[6594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:50:29.210354 systemd-logind[1883]: New session 25 of user core. Mar 7 00:50:29.217312 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 7 00:50:29.482200 sshd[6597]: Connection closed by 10.200.16.10 port 43276 Mar 7 00:50:29.482561 sshd-session[6594]: pam_unix(sshd:session): session closed for user core Mar 7 00:50:29.486286 systemd-logind[1883]: Session 25 logged out. Waiting for processes to exit. Mar 7 00:50:29.487055 systemd[1]: sshd@22-10.200.20.26:22-10.200.16.10:43276.service: Deactivated successfully. 
Mar 7 00:50:29.489290 systemd[1]: session-25.scope: Deactivated successfully. Mar 7 00:50:29.490986 systemd-logind[1883]: Removed session 25.