Mar 2 12:55:16.083521 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Mar 2 12:55:16.083540 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Mon Mar 2 10:44:26 -00 2026
Mar 2 12:55:16.083547 kernel: KASLR enabled
Mar 2 12:55:16.083551 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 2 12:55:16.083554 kernel: printk: legacy bootconsole [pl11] enabled
Mar 2 12:55:16.083560 kernel: efi: EFI v2.7 by EDK II
Mar 2 12:55:16.083565 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89d018 RNG=0x3f979998 MEMRESERVE=0x3db83598
Mar 2 12:55:16.083569 kernel: random: crng init done
Mar 2 12:55:16.083573 kernel: secureboot: Secure boot disabled
Mar 2 12:55:16.083577 kernel: ACPI: Early table checksum verification disabled
Mar 2 12:55:16.083581 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Mar 2 12:55:16.083585 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083588 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083592 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 2 12:55:16.083599 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083603 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083607 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083611 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083616 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083621 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083625 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 2 12:55:16.083629 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 2 12:55:16.083633 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 2 12:55:16.083637 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 2 12:55:16.083642 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 2 12:55:16.083646 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Mar 2 12:55:16.083650 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Mar 2 12:55:16.083654 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 2 12:55:16.083659 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 2 12:55:16.083663 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 2 12:55:16.083668 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 2 12:55:16.083672 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 2 12:55:16.083676 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 2 12:55:16.083680 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 2 12:55:16.083685 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 2 12:55:16.083689 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 2 12:55:16.083693 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Mar 2 12:55:16.083697 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Mar 2 12:55:16.083702 kernel: Zone ranges:
Mar 2 12:55:16.083706 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 2 12:55:16.083713 kernel: DMA32 empty
Mar 2 12:55:16.083717 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 2 12:55:16.083721 kernel: Device empty
Mar 2 12:55:16.083726 kernel: Movable zone start for each node
Mar 2 12:55:16.083730 kernel: Early memory node ranges
Mar 2 12:55:16.083734 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 2 12:55:16.083740 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Mar 2 12:55:16.083744 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Mar 2 12:55:16.083748 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Mar 2 12:55:16.083753 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Mar 2 12:55:16.083757 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Mar 2 12:55:16.083761 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 2 12:55:16.083766 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 2 12:55:16.083770 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 2 12:55:16.083774 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Mar 2 12:55:16.083779 kernel: psci: probing for conduit method from ACPI.
Mar 2 12:55:16.083783 kernel: psci: PSCIv1.3 detected in firmware.
Mar 2 12:55:16.083788 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 2 12:55:16.083793 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 2 12:55:16.085834 kernel: psci: SMC Calling Convention v1.4
Mar 2 12:55:16.085845 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 2 12:55:16.085851 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 2 12:55:16.085855 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 2 12:55:16.085860 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 2 12:55:16.085865 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 2 12:55:16.085869 kernel: Detected PIPT I-cache on CPU0
Mar 2 12:55:16.085874 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Mar 2 12:55:16.085878 kernel: CPU features: detected: GIC system register CPU interface
Mar 2 12:55:16.085883 kernel: CPU features: detected: Spectre-v4
Mar 2 12:55:16.085887 kernel: CPU features: detected: Spectre-BHB
Mar 2 12:55:16.085896 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 2 12:55:16.085901 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 2 12:55:16.085905 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Mar 2 12:55:16.085909 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 2 12:55:16.085914 kernel: alternatives: applying boot alternatives
Mar 2 12:55:16.085920 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=75d5e65dbf56ddb5ea243beb025fcfbdb9b2a65e9b1b7d7db3d24aed3f0a168f
Mar 2 12:55:16.085925 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 2 12:55:16.085929 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 2 12:55:16.085934 kernel: Fallback order for Node 0: 0
Mar 2 12:55:16.085938 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Mar 2 12:55:16.085944 kernel: Policy zone: Normal
Mar 2 12:55:16.085949 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 2 12:55:16.085953 kernel: software IO TLB: area num 2.
Mar 2 12:55:16.085957 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Mar 2 12:55:16.085962 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 2 12:55:16.085966 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 2 12:55:16.085972 kernel: rcu: RCU event tracing is enabled.
Mar 2 12:55:16.085976 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 2 12:55:16.085981 kernel: Trampoline variant of Tasks RCU enabled.
Mar 2 12:55:16.085985 kernel: Tracing variant of Tasks RCU enabled.
Mar 2 12:55:16.085990 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 2 12:55:16.085994 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 2 12:55:16.086000 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 2 12:55:16.086004 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 2 12:55:16.086009 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 2 12:55:16.086013 kernel: GICv3: 960 SPIs implemented
Mar 2 12:55:16.086018 kernel: GICv3: 0 Extended SPIs implemented
Mar 2 12:55:16.086022 kernel: Root IRQ handler: gic_handle_irq
Mar 2 12:55:16.086026 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 2 12:55:16.086031 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Mar 2 12:55:16.086035 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 2 12:55:16.086040 kernel: ITS: No ITS available, not enabling LPIs
Mar 2 12:55:16.086044 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 2 12:55:16.086050 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Mar 2 12:55:16.086054 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 2 12:55:16.086059 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Mar 2 12:55:16.086063 kernel: Console: colour dummy device 80x25
Mar 2 12:55:16.086068 kernel: printk: legacy console [tty1] enabled
Mar 2 12:55:16.086073 kernel: ACPI: Core revision 20240827
Mar 2 12:55:16.086078 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Mar 2 12:55:16.086082 kernel: pid_max: default: 32768 minimum: 301
Mar 2 12:55:16.086087 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 2 12:55:16.086091 kernel: landlock: Up and running.
Mar 2 12:55:16.086097 kernel: SELinux: Initializing.
Mar 2 12:55:16.086102 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 2 12:55:16.086106 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 2 12:55:16.086111 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Mar 2 12:55:16.086116 kernel: Hyper-V: Host Build 10.0.26102.1212-1-0
Mar 2 12:55:16.086124 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 2 12:55:16.086130 kernel: rcu: Hierarchical SRCU implementation.
Mar 2 12:55:16.086135 kernel: rcu: Max phase no-delay instances is 400.
Mar 2 12:55:16.086139 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 2 12:55:16.086144 kernel: Remapping and enabling EFI services.
Mar 2 12:55:16.086149 kernel: smp: Bringing up secondary CPUs ...
Mar 2 12:55:16.086154 kernel: Detected PIPT I-cache on CPU1
Mar 2 12:55:16.086159 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 2 12:55:16.086164 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Mar 2 12:55:16.086169 kernel: smp: Brought up 1 node, 2 CPUs
Mar 2 12:55:16.086174 kernel: SMP: Total of 2 processors activated.
Mar 2 12:55:16.086179 kernel: CPU: All CPU(s) started at EL1
Mar 2 12:55:16.086184 kernel: CPU features: detected: 32-bit EL0 Support
Mar 2 12:55:16.086189 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 2 12:55:16.086194 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 2 12:55:16.086199 kernel: CPU features: detected: Common not Private translations
Mar 2 12:55:16.086203 kernel: CPU features: detected: CRC32 instructions
Mar 2 12:55:16.086208 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Mar 2 12:55:16.086213 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 2 12:55:16.086218 kernel: CPU features: detected: LSE atomic instructions
Mar 2 12:55:16.086223 kernel: CPU features: detected: Privileged Access Never
Mar 2 12:55:16.086228 kernel: CPU features: detected: Speculation barrier (SB)
Mar 2 12:55:16.086233 kernel: CPU features: detected: TLB range maintenance instructions
Mar 2 12:55:16.086238 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 2 12:55:16.086243 kernel: CPU features: detected: Scalable Vector Extension
Mar 2 12:55:16.086247 kernel: alternatives: applying system-wide alternatives
Mar 2 12:55:16.086252 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 2 12:55:16.086257 kernel: SVE: maximum available vector length 16 bytes per vector
Mar 2 12:55:16.086262 kernel: SVE: default vector length 16 bytes per vector
Mar 2 12:55:16.086267 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Mar 2 12:55:16.086273 kernel: devtmpfs: initialized
Mar 2 12:55:16.086278 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 2 12:55:16.086282 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 2 12:55:16.086287 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 2 12:55:16.086292 kernel: 0 pages in range for non-PLT usage
Mar 2 12:55:16.086297 kernel: 508400 pages in range for PLT usage
Mar 2 12:55:16.086301 kernel: pinctrl core: initialized pinctrl subsystem
Mar 2 12:55:16.086306 kernel: SMBIOS 3.1.0 present.
Mar 2 12:55:16.086311 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Mar 2 12:55:16.086316 kernel: DMI: Memory slots populated: 2/2
Mar 2 12:55:16.086321 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 2 12:55:16.086326 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 2 12:55:16.086331 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 2 12:55:16.086336 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 2 12:55:16.086340 kernel: audit: initializing netlink subsys (disabled)
Mar 2 12:55:16.086345 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Mar 2 12:55:16.086350 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 2 12:55:16.086355 kernel: cpuidle: using governor menu
Mar 2 12:55:16.086360 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 2 12:55:16.086365 kernel: ASID allocator initialised with 32768 entries
Mar 2 12:55:16.086370 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 2 12:55:16.086374 kernel: Serial: AMBA PL011 UART driver
Mar 2 12:55:16.086379 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 2 12:55:16.086384 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 2 12:55:16.086389 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 2 12:55:16.086394 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 2 12:55:16.086399 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 2 12:55:16.086404 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 2 12:55:16.086409 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 2 12:55:16.086414 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 2 12:55:16.086418 kernel: ACPI: Added _OSI(Module Device)
Mar 2 12:55:16.086423 kernel: ACPI: Added _OSI(Processor Device)
Mar 2 12:55:16.086428 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 2 12:55:16.086433 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 2 12:55:16.086437 kernel: ACPI: Interpreter enabled
Mar 2 12:55:16.086443 kernel: ACPI: Using GIC for interrupt routing
Mar 2 12:55:16.086448 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 2 12:55:16.086453 kernel: printk: legacy console [ttyAMA0] enabled
Mar 2 12:55:16.086458 kernel: printk: legacy bootconsole [pl11] disabled
Mar 2 12:55:16.086462 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 2 12:55:16.086467 kernel: ACPI: CPU0 has been hot-added
Mar 2 12:55:16.086472 kernel: ACPI: CPU1 has been hot-added
Mar 2 12:55:16.086477 kernel: iommu: Default domain type: Translated
Mar 2 12:55:16.086481 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 2 12:55:16.086486 kernel: efivars: Registered efivars operations
Mar 2 12:55:16.086492 kernel: vgaarb: loaded
Mar 2 12:55:16.086497 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 2 12:55:16.086501 kernel: VFS: Disk quotas dquot_6.6.0
Mar 2 12:55:16.086506 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 2 12:55:16.086511 kernel: pnp: PnP ACPI init
Mar 2 12:55:16.086516 kernel: pnp: PnP ACPI: found 0 devices
Mar 2 12:55:16.086520 kernel: NET: Registered PF_INET protocol family
Mar 2 12:55:16.086525 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 2 12:55:16.086530 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 2 12:55:16.086536 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 2 12:55:16.086540 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 2 12:55:16.086545 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 2 12:55:16.086550 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 2 12:55:16.086555 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 2 12:55:16.086559 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 2 12:55:16.086564 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 2 12:55:16.086569 kernel: PCI: CLS 0 bytes, default 64
Mar 2 12:55:16.086574 kernel: kvm [1]: HYP mode not available
Mar 2 12:55:16.086579 kernel: Initialise system trusted keyrings
Mar 2 12:55:16.086584 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 2 12:55:16.086589 kernel: Key type asymmetric registered
Mar 2 12:55:16.086594 kernel: Asymmetric key parser 'x509' registered
Mar 2 12:55:16.086598 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Mar 2 12:55:16.086603 kernel: io scheduler mq-deadline registered
Mar 2 12:55:16.086608 kernel: io scheduler kyber registered
Mar 2 12:55:16.086613 kernel: io scheduler bfq registered
Mar 2 12:55:16.086617 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 2 12:55:16.086623 kernel: thunder_xcv, ver 1.0
Mar 2 12:55:16.086628 kernel: thunder_bgx, ver 1.0
Mar 2 12:55:16.086632 kernel: nicpf, ver 1.0
Mar 2 12:55:16.086637 kernel: nicvf, ver 1.0
Mar 2 12:55:16.086770 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 2 12:55:16.086836 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-02T12:55:15 UTC (1772456115)
Mar 2 12:55:16.086843 kernel: efifb: probing for efifb
Mar 2 12:55:16.086850 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 2 12:55:16.086855 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 2 12:55:16.086859 kernel: efifb: scrolling: redraw
Mar 2 12:55:16.086864 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 2 12:55:16.086869 kernel: Console: switching to colour frame buffer device 128x48
Mar 2 12:55:16.086874 kernel: fb0: EFI VGA frame buffer device
Mar 2 12:55:16.086878 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Mar 2 12:55:16.086883 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 2 12:55:16.086888 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 2 12:55:16.086894 kernel: NET: Registered PF_INET6 protocol family
Mar 2 12:55:16.086899 kernel: watchdog: NMI not fully supported
Mar 2 12:55:16.086903 kernel: watchdog: Hard watchdog permanently disabled
Mar 2 12:55:16.086908 kernel: Segment Routing with IPv6
Mar 2 12:55:16.086913 kernel: In-situ OAM (IOAM) with IPv6
Mar 2 12:55:16.086918 kernel: NET: Registered PF_PACKET protocol family
Mar 2 12:55:16.086922 kernel: Key type dns_resolver registered
Mar 2 12:55:16.086927 kernel: registered taskstats version 1
Mar 2 12:55:16.086932 kernel: Loading compiled-in X.509 certificates
Mar 2 12:55:16.086937 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 03854795d80c6b1eedd5f94f64a67d19428ce88e'
Mar 2 12:55:16.086942 kernel: Demotion targets for Node 0: null
Mar 2 12:55:16.086947 kernel: Key type .fscrypt registered
Mar 2 12:55:16.086952 kernel: Key type fscrypt-provisioning registered
Mar 2 12:55:16.086956 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 2 12:55:16.086961 kernel: ima: Allocated hash algorithm: sha1
Mar 2 12:55:16.086966 kernel: ima: No architecture policies found
Mar 2 12:55:16.086971 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 2 12:55:16.086976 kernel: clk: Disabling unused clocks
Mar 2 12:55:16.086980 kernel: PM: genpd: Disabling unused power domains
Mar 2 12:55:16.086986 kernel: Warning: unable to open an initial console.
Mar 2 12:55:16.086991 kernel: Freeing unused kernel memory: 39552K
Mar 2 12:55:16.086996 kernel: Run /init as init process
Mar 2 12:55:16.087001 kernel: with arguments:
Mar 2 12:55:16.087005 kernel: /init
Mar 2 12:55:16.087010 kernel: with environment:
Mar 2 12:55:16.087015 kernel: HOME=/
Mar 2 12:55:16.087019 kernel: TERM=linux
Mar 2 12:55:16.087025 systemd[1]: Successfully made /usr/ read-only.
Mar 2 12:55:16.087033 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 2 12:55:16.087039 systemd[1]: Detected virtualization microsoft.
Mar 2 12:55:16.087044 systemd[1]: Detected architecture arm64.
Mar 2 12:55:16.087049 systemd[1]: Running in initrd.
Mar 2 12:55:16.087054 systemd[1]: No hostname configured, using default hostname.
Mar 2 12:55:16.087059 systemd[1]: Hostname set to .
Mar 2 12:55:16.087064 systemd[1]: Initializing machine ID from random generator.
Mar 2 12:55:16.087070 systemd[1]: Queued start job for default target initrd.target.
Mar 2 12:55:16.087076 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 2 12:55:16.087081 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 2 12:55:16.087086 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 2 12:55:16.087092 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 2 12:55:16.087097 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 2 12:55:16.087103 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 2 12:55:16.087110 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 2 12:55:16.087115 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 2 12:55:16.087120 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 2 12:55:16.087125 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 2 12:55:16.087131 systemd[1]: Reached target paths.target - Path Units.
Mar 2 12:55:16.087136 systemd[1]: Reached target slices.target - Slice Units.
Mar 2 12:55:16.087141 systemd[1]: Reached target swap.target - Swaps.
Mar 2 12:55:16.087146 systemd[1]: Reached target timers.target - Timer Units.
Mar 2 12:55:16.087152 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 2 12:55:16.087157 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 2 12:55:16.087163 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 2 12:55:16.087168 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 2 12:55:16.087173 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 2 12:55:16.087179 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 2 12:55:16.087184 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 2 12:55:16.087189 systemd[1]: Reached target sockets.target - Socket Units.
Mar 2 12:55:16.087194 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 2 12:55:16.087200 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 2 12:55:16.087206 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 2 12:55:16.087211 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 2 12:55:16.087216 systemd[1]: Starting systemd-fsck-usr.service...
Mar 2 12:55:16.087222 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 2 12:55:16.087227 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 2 12:55:16.087246 systemd-journald[225]: Collecting audit messages is disabled.
Mar 2 12:55:16.087261 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 2 12:55:16.087267 systemd-journald[225]: Journal started
Mar 2 12:55:16.087282 systemd-journald[225]: Runtime Journal (/run/log/journal/03f617d1bc884c0985bd9f1a0640d7ec) is 8M, max 78.3M, 70.3M free.
Mar 2 12:55:16.095893 systemd-modules-load[227]: Inserted module 'overlay'
Mar 2 12:55:16.109051 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 2 12:55:16.121828 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 2 12:55:16.123564 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 2 12:55:16.144417 kernel: Bridge firewalling registered
Mar 2 12:55:16.125288 systemd-modules-load[227]: Inserted module 'br_netfilter'
Mar 2 12:55:16.130280 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 2 12:55:16.135648 systemd[1]: Finished systemd-fsck-usr.service.
Mar 2 12:55:16.139312 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 2 12:55:16.149578 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 2 12:55:16.159873 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 2 12:55:16.187729 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 2 12:55:16.192950 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 2 12:55:16.213462 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 2 12:55:16.231873 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 2 12:55:16.246265 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 2 12:55:16.254332 systemd-tmpfiles[251]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 2 12:55:16.255152 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 2 12:55:16.275915 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 2 12:55:16.285315 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 2 12:55:16.294310 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 2 12:55:16.306197 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 2 12:55:16.318901 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 2 12:55:16.343831 dracut-cmdline[265]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=75d5e65dbf56ddb5ea243beb025fcfbdb9b2a65e9b1b7d7db3d24aed3f0a168f
Mar 2 12:55:16.372343 systemd-resolved[266]: Positive Trust Anchors:
Mar 2 12:55:16.372362 systemd-resolved[266]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 2 12:55:16.372381 systemd-resolved[266]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 2 12:55:16.374259 systemd-resolved[266]: Defaulting to hostname 'linux'.
Mar 2 12:55:16.374954 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 2 12:55:16.381487 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 2 12:55:16.489823 kernel: SCSI subsystem initialized
Mar 2 12:55:16.495818 kernel: Loading iSCSI transport class v2.0-870.
Mar 2 12:55:16.503829 kernel: iscsi: registered transport (tcp)
Mar 2 12:55:16.517557 kernel: iscsi: registered transport (qla4xxx)
Mar 2 12:55:16.517620 kernel: QLogic iSCSI HBA Driver
Mar 2 12:55:16.530990 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 2 12:55:16.556735 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 2 12:55:16.563186 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 2 12:55:16.612986 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 2 12:55:16.618833 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 2 12:55:16.681820 kernel: raid6: neonx8 gen() 18527 MB/s
Mar 2 12:55:16.699808 kernel: raid6: neonx4 gen() 18559 MB/s
Mar 2 12:55:16.718804 kernel: raid6: neonx2 gen() 17063 MB/s
Mar 2 12:55:16.738807 kernel: raid6: neonx1 gen() 15016 MB/s
Mar 2 12:55:16.757806 kernel: raid6: int64x8 gen() 10523 MB/s
Mar 2 12:55:16.776818 kernel: raid6: int64x4 gen() 10602 MB/s
Mar 2 12:55:16.796807 kernel: raid6: int64x2 gen() 8982 MB/s
Mar 2 12:55:16.818283 kernel: raid6: int64x1 gen() 7041 MB/s
Mar 2 12:55:16.818292 kernel: raid6: using algorithm neonx4 gen() 18559 MB/s
Mar 2 12:55:16.839989 kernel: raid6: .... xor() 15144 MB/s, rmw enabled
Mar 2 12:55:16.839997 kernel: raid6: using neon recovery algorithm
Mar 2 12:55:16.847867 kernel: xor: measuring software checksum speed
Mar 2 12:55:16.847876 kernel: 8regs : 28611 MB/sec
Mar 2 12:55:16.850601 kernel: 32regs : 28767 MB/sec
Mar 2 12:55:16.853837 kernel: arm64_neon : 37538 MB/sec
Mar 2 12:55:16.856798 kernel: xor: using function: arm64_neon (37538 MB/sec)
Mar 2 12:55:16.895054 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 2 12:55:16.900203 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 2 12:55:16.906446 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 2 12:55:16.948777 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Mar 2 12:55:16.955379 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 2 12:55:16.968337 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 2 12:55:16.991016 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation
Mar 2 12:55:17.010951 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 2 12:55:17.017221 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 2 12:55:17.066718 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 2 12:55:17.076120 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 2 12:55:17.139824 kernel: hv_vmbus: Vmbus version:5.3 Mar 2 12:55:17.149375 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 2 12:55:17.149433 kernel: hv_vmbus: registering driver hid_hyperv Mar 2 12:55:17.149441 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Mar 2 12:55:17.161499 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 2 12:55:17.161552 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Mar 2 12:55:17.162882 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 2 12:55:17.190619 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 2 12:55:17.190639 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 2 12:55:17.190767 kernel: PTP clock support registered Mar 2 12:55:17.163180 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:55:17.194951 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 2 12:55:17.217863 kernel: hv_utils: Registering HyperV Utility Driver Mar 2 12:55:17.217903 kernel: hv_vmbus: registering driver hv_storvsc Mar 2 12:55:17.217911 kernel: hv_vmbus: registering driver hv_netvsc Mar 2 12:55:17.220820 kernel: scsi host0: storvsc_host_t Mar 2 12:55:17.222650 kernel: hv_vmbus: registering driver hv_utils Mar 2 12:55:17.228220 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 2 12:55:17.228544 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 2 12:55:17.477189 kernel: hv_utils: Heartbeat IC version 3.0 Mar 2 12:55:17.477212 kernel: hv_utils: Shutdown IC version 3.2 Mar 2 12:55:17.477219 kernel: hv_utils: TimeSync IC version 4.0 Mar 2 12:55:17.477234 kernel: scsi host1: storvsc_host_t Mar 2 12:55:17.477394 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Mar 2 12:55:17.453792 systemd-resolved[266]: Clock change detected. Flushing caches. Mar 2 12:55:17.484639 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 2 12:55:17.492355 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 2 12:55:17.492451 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:55:17.512438 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 2 12:55:17.512603 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Mar 2 12:55:17.512673 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 2 12:55:17.523705 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 2 12:55:17.523827 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 2 12:55:17.527051 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 2 12:55:17.550878 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 2 12:55:17.550923 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 2 12:55:17.557022 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 2 12:55:17.557186 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 2 12:55:17.559925 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 2 12:55:17.571374 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 2 12:55:17.586226 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#205 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 2 12:55:17.606168 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#236 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 2 12:55:17.703915 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 2 12:55:17.745737 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 2 12:55:17.759133 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 2 12:55:17.770234 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Mar 2 12:55:17.775362 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 2 12:55:17.781008 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 2 12:55:17.803529 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 2 12:55:17.809161 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 2 12:55:17.818060 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 2 12:55:17.827318 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 2 12:55:17.837709 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 2 12:55:17.862625 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 2 12:55:17.863984 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Mar 2 12:55:17.881168 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 2 12:55:18.749416 kernel: hv_netvsc 7ced8d79-3c63-7ced-8d79-3c637ced8d79 eth0: VF slot 1 added Mar 2 12:55:18.755167 kernel: hv_vmbus: registering driver hv_pci Mar 2 12:55:18.761311 kernel: hv_pci e0ca9d7b-1f4f-405b-911e-dd2b09514da7: PCI VMBus probing: Using version 0x10004 Mar 2 12:55:18.761483 kernel: hv_pci e0ca9d7b-1f4f-405b-911e-dd2b09514da7: PCI host bridge to bus 1f4f:00 Mar 2 12:55:18.769560 kernel: pci_bus 1f4f:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 2 12:55:18.773770 kernel: pci_bus 1f4f:00: No busn resource found for root bus, will use [bus 00-ff] Mar 2 12:55:18.780542 kernel: pci 1f4f:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Mar 2 12:55:18.785186 kernel: pci 1f4f:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 2 12:55:18.790270 kernel: pci 1f4f:00:02.0: enabling Extended Tags Mar 2 12:55:18.805223 kernel: pci 1f4f:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 1f4f:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Mar 2 12:55:18.814746 kernel: pci_bus 1f4f:00: busn_res: [bus 00-ff] end is updated to 00 Mar 2 12:55:18.814874 kernel: pci 1f4f:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Mar 2 12:55:18.874620 kernel: mlx5_core 1f4f:00:02.0: enabling device (0000 -> 0002) Mar 2 12:55:18.885463 kernel: mlx5_core 1f4f:00:02.0: PTM is not supported by PCIe Mar 2 12:55:18.885642 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 2 12:55:18.885658 kernel: mlx5_core 1f4f:00:02.0: firmware version: 16.30.5026 Mar 2 12:55:18.885735 disk-uuid[644]: The operation has completed successfully. 
Mar 2 12:55:19.067705 kernel: hv_netvsc 7ced8d79-3c63-7ced-8d79-3c637ced8d79 eth0: VF registering: eth1 Mar 2 12:55:19.067911 kernel: mlx5_core 1f4f:00:02.0 eth1: joined to eth0 Mar 2 12:55:19.073280 kernel: mlx5_core 1f4f:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 2 12:55:19.083174 kernel: mlx5_core 1f4f:00:02.0 enP8015s1: renamed from eth1 Mar 2 12:55:19.093044 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 2 12:55:19.095324 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 2 12:55:19.103290 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 2 12:55:19.126624 sh[823]: Success Mar 2 12:55:19.150485 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 2 12:55:19.150550 kernel: device-mapper: uevent: version 1.0.3 Mar 2 12:55:19.155510 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 2 12:55:19.167273 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 2 12:55:19.256259 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 2 12:55:19.275466 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 2 12:55:19.284185 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 2 12:55:19.309195 kernel: BTRFS: device fsid da6bd89d-75a6-483e-9a3e-89df5ed9b6c4 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (841) Mar 2 12:55:19.319980 kernel: BTRFS info (device dm-0): first mount of filesystem da6bd89d-75a6-483e-9a3e-89df5ed9b6c4 Mar 2 12:55:19.320022 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 2 12:55:19.399830 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 2 12:55:19.399901 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 2 12:55:19.412188 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 2 12:55:19.415965 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 2 12:55:19.423545 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 2 12:55:19.424261 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 2 12:55:19.447209 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 2 12:55:19.477185 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (864) Mar 2 12:55:19.487616 kernel: BTRFS info (device sda6): first mount of filesystem fce10a79-d373-45d9-9854-55ae8d2c9f36 Mar 2 12:55:19.487673 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 2 12:55:19.501675 kernel: BTRFS info (device sda6): turning on async discard Mar 2 12:55:19.501731 kernel: BTRFS info (device sda6): enabling free space tree Mar 2 12:55:19.510205 kernel: BTRFS info (device sda6): last unmount of filesystem fce10a79-d373-45d9-9854-55ae8d2c9f36 Mar 2 12:55:19.512243 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 2 12:55:19.519488 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Mar 2 12:55:19.580725 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 2 12:55:19.594687 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 2 12:55:19.627868 systemd-networkd[1010]: lo: Link UP Mar 2 12:55:19.630960 systemd-networkd[1010]: lo: Gained carrier Mar 2 12:55:19.631783 systemd-networkd[1010]: Enumeration completed Mar 2 12:55:19.631892 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 2 12:55:19.632738 systemd-networkd[1010]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:55:19.632741 systemd-networkd[1010]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 2 12:55:19.637922 systemd[1]: Reached target network.target - Network. Mar 2 12:55:19.712176 kernel: mlx5_core 1f4f:00:02.0 enP8015s1: Link up Mar 2 12:55:19.747175 kernel: hv_netvsc 7ced8d79-3c63-7ced-8d79-3c637ced8d79 eth0: Data path switched to VF: enP8015s1 Mar 2 12:55:19.747612 systemd-networkd[1010]: enP8015s1: Link UP Mar 2 12:55:19.747666 systemd-networkd[1010]: eth0: Link UP Mar 2 12:55:19.747755 systemd-networkd[1010]: eth0: Gained carrier Mar 2 12:55:19.747769 systemd-networkd[1010]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:55:19.755299 systemd-networkd[1010]: enP8015s1: Gained carrier Mar 2 12:55:19.777201 systemd-networkd[1010]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 2 12:55:19.888573 ignition[925]: Ignition 2.22.0 Mar 2 12:55:19.888590 ignition[925]: Stage: fetch-offline Mar 2 12:55:19.892184 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 2 12:55:19.888683 ignition[925]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:19.901107 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 2 12:55:19.888690 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:19.888756 ignition[925]: parsed url from cmdline: "" Mar 2 12:55:19.888758 ignition[925]: no config URL provided Mar 2 12:55:19.888761 ignition[925]: reading system config file "/usr/lib/ignition/user.ign" Mar 2 12:55:19.888766 ignition[925]: no config at "/usr/lib/ignition/user.ign" Mar 2 12:55:19.888769 ignition[925]: failed to fetch config: resource requires networking Mar 2 12:55:19.889224 ignition[925]: Ignition finished successfully Mar 2 12:55:19.931356 ignition[1019]: Ignition 2.22.0 Mar 2 12:55:19.931361 ignition[1019]: Stage: fetch Mar 2 12:55:19.931603 ignition[1019]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:19.931610 ignition[1019]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:19.931688 ignition[1019]: parsed url from cmdline: "" Mar 2 12:55:19.931691 ignition[1019]: no config URL provided Mar 2 12:55:19.931696 ignition[1019]: reading system config file "/usr/lib/ignition/user.ign" Mar 2 12:55:19.931701 ignition[1019]: no config at "/usr/lib/ignition/user.ign" Mar 2 12:55:19.931716 ignition[1019]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 2 12:55:20.045278 ignition[1019]: GET result: OK Mar 2 12:55:20.045347 ignition[1019]: config has been read from IMDS userdata Mar 2 12:55:20.045368 ignition[1019]: parsing config with SHA512: 2ebf302075941e7a07fe00f053526481567b44d0565b7400e56ecb5321b5d367e14ac2242f0e4de6162e6b0f3f7991ee6e9f07dd2929455906bc1f7fa24ce652 Mar 2 12:55:20.048342 unknown[1019]: fetched base config from "system" Mar 2 12:55:20.048616 ignition[1019]: fetch: fetch complete Mar 2 12:55:20.048347 unknown[1019]: fetched base config from "system" Mar 2 12:55:20.048620 ignition[1019]: fetch: fetch passed Mar 2 12:55:20.048350 unknown[1019]: fetched user config from "azure" Mar 2 12:55:20.048660 ignition[1019]: Ignition finished 
successfully Mar 2 12:55:20.053188 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 2 12:55:20.062464 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 2 12:55:20.103410 ignition[1026]: Ignition 2.22.0 Mar 2 12:55:20.103420 ignition[1026]: Stage: kargs Mar 2 12:55:20.106811 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 2 12:55:20.103595 ignition[1026]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:20.114438 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 2 12:55:20.103601 ignition[1026]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:20.104039 ignition[1026]: kargs: kargs passed Mar 2 12:55:20.104076 ignition[1026]: Ignition finished successfully Mar 2 12:55:20.142394 ignition[1032]: Ignition 2.22.0 Mar 2 12:55:20.142402 ignition[1032]: Stage: disks Mar 2 12:55:20.146318 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 2 12:55:20.142584 ignition[1032]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:20.153226 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 2 12:55:20.142591 ignition[1032]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:20.161798 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 2 12:55:20.143272 ignition[1032]: disks: disks passed Mar 2 12:55:20.171504 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 2 12:55:20.143315 ignition[1032]: Ignition finished successfully Mar 2 12:55:20.179590 systemd[1]: Reached target sysinit.target - System Initialization. Mar 2 12:55:20.188140 systemd[1]: Reached target basic.target - Basic System. Mar 2 12:55:20.197081 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 2 12:55:20.245528 systemd-fsck[1040]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Mar 2 12:55:20.254472 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 2 12:55:20.266746 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 2 12:55:20.380169 kernel: EXT4-fs (sda9): mounted filesystem 6408ffd3-d563-490c-803b-1f4582ee0319 r/w with ordered data mode. Quota mode: none. Mar 2 12:55:20.381420 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 2 12:55:20.385434 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 2 12:55:20.398582 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 2 12:55:20.417879 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 2 12:55:20.427500 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 2 12:55:20.438188 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 2 12:55:20.438226 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 2 12:55:20.460477 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 2 12:55:20.474224 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1054) Mar 2 12:55:20.474603 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 2 12:55:20.494956 kernel: BTRFS info (device sda6): first mount of filesystem fce10a79-d373-45d9-9854-55ae8d2c9f36 Mar 2 12:55:20.494976 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 2 12:55:20.510043 kernel: BTRFS info (device sda6): turning on async discard Mar 2 12:55:20.510089 kernel: BTRFS info (device sda6): enabling free space tree Mar 2 12:55:20.506397 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 2 12:55:20.607504 coreos-metadata[1056]: Mar 02 12:55:20.607 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 2 12:55:20.613884 coreos-metadata[1056]: Mar 02 12:55:20.613 INFO Fetch successful Mar 2 12:55:20.613884 coreos-metadata[1056]: Mar 02 12:55:20.613 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 2 12:55:20.626871 coreos-metadata[1056]: Mar 02 12:55:20.626 INFO Fetch successful Mar 2 12:55:20.631471 coreos-metadata[1056]: Mar 02 12:55:20.630 INFO wrote hostname ci-4459.2.101-5c781fe851 to /sysroot/etc/hostname Mar 2 12:55:20.638735 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 2 12:55:20.672046 initrd-setup-root[1084]: cut: /sysroot/etc/passwd: No such file or directory Mar 2 12:55:20.687195 initrd-setup-root[1091]: cut: /sysroot/etc/group: No such file or directory Mar 2 12:55:20.693826 initrd-setup-root[1098]: cut: /sysroot/etc/shadow: No such file or directory Mar 2 12:55:20.705024 initrd-setup-root[1105]: cut: /sysroot/etc/gshadow: No such file or directory Mar 2 12:55:20.990053 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 2 12:55:20.995922 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 2 12:55:21.012861 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 2 12:55:21.030325 kernel: BTRFS info (device sda6): last unmount of filesystem fce10a79-d373-45d9-9854-55ae8d2c9f36 Mar 2 12:55:21.030325 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 2 12:55:21.057210 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 2 12:55:21.069326 ignition[1172]: INFO : Ignition 2.22.0 Mar 2 12:55:21.069326 ignition[1172]: INFO : Stage: mount Mar 2 12:55:21.076253 ignition[1172]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:21.076253 ignition[1172]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:21.076253 ignition[1172]: INFO : mount: mount passed Mar 2 12:55:21.076253 ignition[1172]: INFO : Ignition finished successfully Mar 2 12:55:21.074445 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 2 12:55:21.081191 systemd-networkd[1010]: eth0: Gained IPv6LL Mar 2 12:55:21.082699 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 2 12:55:21.382417 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 2 12:55:21.408165 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1184) Mar 2 12:55:21.418737 kernel: BTRFS info (device sda6): first mount of filesystem fce10a79-d373-45d9-9854-55ae8d2c9f36 Mar 2 12:55:21.418792 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 2 12:55:21.427866 kernel: BTRFS info (device sda6): turning on async discard Mar 2 12:55:21.427910 kernel: BTRFS info (device sda6): enabling free space tree Mar 2 12:55:21.429421 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 2 12:55:21.461627 ignition[1201]: INFO : Ignition 2.22.0 Mar 2 12:55:21.461627 ignition[1201]: INFO : Stage: files Mar 2 12:55:21.467417 ignition[1201]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:21.467417 ignition[1201]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:21.467417 ignition[1201]: DEBUG : files: compiled without relabeling support, skipping Mar 2 12:55:21.480523 ignition[1201]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 2 12:55:21.480523 ignition[1201]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 2 12:55:21.504194 ignition[1201]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 2 12:55:21.510304 ignition[1201]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 2 12:55:21.510304 ignition[1201]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 2 12:55:21.504585 unknown[1201]: wrote ssh authorized keys file for user: core Mar 2 12:55:21.525410 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 2 12:55:21.525410 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 2 12:55:21.590698 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 2 12:55:21.913341 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 2 
12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 2 12:55:21.921325 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 2 12:55:21.979423 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-arm64.raw: attempt #1 Mar 2 12:55:22.374939 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 2 12:55:22.634256 ignition[1201]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 2 12:55:22.634256 ignition[1201]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 2 12:55:22.648079 ignition[1201]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 2 12:55:22.656392 ignition[1201]: INFO : files: files passed Mar 2 12:55:22.656392 ignition[1201]: INFO : Ignition finished successfully Mar 2 12:55:22.657212 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 2 12:55:22.668823 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 2 12:55:22.692884 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 2 12:55:22.738115 initrd-setup-root-after-ignition[1231]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:55:22.738115 initrd-setup-root-after-ignition[1231]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:55:22.705422 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 2 12:55:22.770214 initrd-setup-root-after-ignition[1235]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:55:22.705496 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 2 12:55:22.733917 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 2 12:55:22.743292 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 2 12:55:22.753913 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 2 12:55:22.798257 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 2 12:55:22.798368 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 2 12:55:22.807085 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 2 12:55:22.816318 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 2 12:55:22.824029 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 2 12:55:22.824745 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 2 12:55:22.861861 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 2 12:55:22.868281 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 2 12:55:22.892717 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 2 12:55:22.897639 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 2 12:55:22.906742 systemd[1]: Stopped target timers.target - Timer Units. Mar 2 12:55:22.914908 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 2 12:55:22.915024 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 2 12:55:22.926993 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 2 12:55:22.931302 systemd[1]: Stopped target basic.target - Basic System. Mar 2 12:55:22.939761 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 2 12:55:22.948287 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 2 12:55:22.956701 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 2 12:55:22.965345 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 2 12:55:22.974398 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 2 12:55:22.983021 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 2 12:55:22.992418 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 2 12:55:23.000836 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 2 12:55:23.009620 systemd[1]: Stopped target swap.target - Swaps. Mar 2 12:55:23.017332 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 2 12:55:23.017451 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 2 12:55:23.028489 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 2 12:55:23.037288 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 2 12:55:23.046627 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 2 12:55:23.046691 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 2 12:55:23.056113 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Mar 2 12:55:23.056217 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 2 12:55:23.071230 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 2 12:55:23.071311 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 2 12:55:23.076974 systemd[1]: ignition-files.service: Deactivated successfully. Mar 2 12:55:23.077042 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 2 12:55:23.087556 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 2 12:55:23.087621 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 2 12:55:23.098372 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 2 12:55:23.149593 ignition[1255]: INFO : Ignition 2.22.0 Mar 2 12:55:23.149593 ignition[1255]: INFO : Stage: umount Mar 2 12:55:23.149593 ignition[1255]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:55:23.149593 ignition[1255]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 2 12:55:23.149593 ignition[1255]: INFO : umount: umount passed Mar 2 12:55:23.149593 ignition[1255]: INFO : Ignition finished successfully Mar 2 12:55:23.133340 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 2 12:55:23.144511 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 2 12:55:23.144662 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 2 12:55:23.153349 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 2 12:55:23.153433 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 2 12:55:23.165483 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 2 12:55:23.167139 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 2 12:55:23.179797 systemd[1]: ignition-disks.service: Deactivated successfully. 
Mar 2 12:55:23.179889 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 2 12:55:23.197616 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 2 12:55:23.197683 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 2 12:55:23.205628 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 2 12:55:23.205679 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 2 12:55:23.213998 systemd[1]: Stopped target network.target - Network.
Mar 2 12:55:23.222115 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 2 12:55:23.222198 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 2 12:55:23.231025 systemd[1]: Stopped target paths.target - Path Units.
Mar 2 12:55:23.239205 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 2 12:55:23.243165 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 2 12:55:23.253862 systemd[1]: Stopped target slices.target - Slice Units.
Mar 2 12:55:23.261112 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 2 12:55:23.268893 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 2 12:55:23.268938 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 2 12:55:23.277121 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 2 12:55:23.277166 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 2 12:55:23.285168 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 2 12:55:23.285220 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 2 12:55:23.292799 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 2 12:55:23.292825 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 2 12:55:23.300746 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 2 12:55:23.309057 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 2 12:55:23.320495 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 2 12:55:23.320976 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 2 12:55:23.321060 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 2 12:55:23.334684 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 2 12:55:23.334904 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 2 12:55:23.334991 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 2 12:55:23.352662 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 2 12:55:23.352874 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 2 12:55:23.352959 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 2 12:55:23.362455 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 2 12:55:23.369233 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 2 12:55:23.369284 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 2 12:55:23.385275 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 2 12:55:23.397985 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 2 12:55:23.398054 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 2 12:55:23.536628 kernel: hv_netvsc 7ced8d79-3c63-7ced-8d79-3c637ced8d79 eth0: Data path switched from VF: enP8015s1
Mar 2 12:55:23.407340 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 2 12:55:23.407387 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 2 12:55:23.416422 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 2 12:55:23.416455 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 2 12:55:23.421477 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 2 12:55:23.421508 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 2 12:55:23.434767 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 2 12:55:23.443004 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 2 12:55:23.443057 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 2 12:55:23.474433 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 2 12:55:23.474553 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 2 12:55:23.484748 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 2 12:55:23.484786 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 2 12:55:23.498260 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 2 12:55:23.498296 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 2 12:55:23.508758 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 2 12:55:23.508805 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 2 12:55:23.526750 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 2 12:55:23.526834 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 2 12:55:23.536682 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 2 12:55:23.536728 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 2 12:55:23.550233 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 2 12:55:23.565352 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 2 12:55:23.565414 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 2 12:55:23.581293 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 2 12:55:23.581337 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 2 12:55:23.597429 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 2 12:55:23.597478 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 2 12:55:23.611581 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 2 12:55:23.611628 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 2 12:55:23.617023 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 2 12:55:23.617058 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 2 12:55:23.754798 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 2 12:55:23.628128 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 2 12:55:23.628185 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 2 12:55:23.628208 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 2 12:55:23.628231 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 2 12:55:23.628507 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 2 12:55:23.630168 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 2 12:55:23.635815 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 2 12:55:23.635888 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 2 12:55:23.644258 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 2 12:55:23.644321 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 2 12:55:23.653823 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 2 12:55:23.661785 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 2 12:55:23.661867 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 2 12:55:23.670564 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 2 12:55:23.698080 systemd[1]: Switching root.
Mar 2 12:55:23.820257 systemd-journald[225]: Journal stopped
Mar 2 12:55:25.916162 kernel: SELinux: policy capability network_peer_controls=1
Mar 2 12:55:25.916180 kernel: SELinux: policy capability open_perms=1
Mar 2 12:55:25.916188 kernel: SELinux: policy capability extended_socket_class=1
Mar 2 12:55:25.916193 kernel: SELinux: policy capability always_check_network=0
Mar 2 12:55:25.916198 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 2 12:55:25.916205 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 2 12:55:25.916211 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 2 12:55:25.916216 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 2 12:55:25.916221 kernel: SELinux: policy capability userspace_initial_context=0
Mar 2 12:55:25.916227 kernel: audit: type=1403 audit(1772456124.278:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 2 12:55:25.916233 systemd[1]: Successfully loaded SELinux policy in 95.553ms.
Mar 2 12:55:25.916241 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.417ms.
Mar 2 12:55:25.916248 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 2 12:55:25.916254 systemd[1]: Detected virtualization microsoft.
Mar 2 12:55:25.916260 systemd[1]: Detected architecture arm64.
Mar 2 12:55:25.916268 systemd[1]: Detected first boot.
Mar 2 12:55:25.916275 systemd[1]: Hostname set to .
Mar 2 12:55:25.916281 systemd[1]: Initializing machine ID from random generator.
Mar 2 12:55:25.916287 zram_generator::config[1298]: No configuration found.
Mar 2 12:55:25.916293 kernel: NET: Registered PF_VSOCK protocol family
Mar 2 12:55:25.916299 systemd[1]: Populated /etc with preset unit settings.
Mar 2 12:55:25.916305 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 2 12:55:25.916311 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 2 12:55:25.916318 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 2 12:55:25.916324 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 2 12:55:25.916330 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 2 12:55:25.916336 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 2 12:55:25.916342 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 2 12:55:25.916348 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 2 12:55:25.916354 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 2 12:55:25.916361 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 2 12:55:25.916367 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 2 12:55:25.916373 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 2 12:55:25.916379 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 2 12:55:25.916385 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 2 12:55:25.916391 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 2 12:55:25.916398 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 2 12:55:25.916404 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 2 12:55:25.916411 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 2 12:55:25.916417 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 2 12:55:25.916425 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 2 12:55:25.916431 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 2 12:55:25.916437 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 2 12:55:25.916443 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 2 12:55:25.916449 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 2 12:55:25.916455 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 2 12:55:25.916462 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 2 12:55:25.916468 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 2 12:55:25.916475 systemd[1]: Reached target slices.target - Slice Units.
Mar 2 12:55:25.916481 systemd[1]: Reached target swap.target - Swaps.
Mar 2 12:55:25.916487 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 2 12:55:25.916493 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 2 12:55:25.916500 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 2 12:55:25.916506 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 2 12:55:25.916513 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 2 12:55:25.916519 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 2 12:55:25.916525 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 2 12:55:25.916532 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 2 12:55:25.916538 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 2 12:55:25.916546 systemd[1]: Mounting media.mount - External Media Directory...
Mar 2 12:55:25.916552 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 2 12:55:25.916558 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 2 12:55:25.916564 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 2 12:55:25.916571 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 2 12:55:25.916577 systemd[1]: Reached target machines.target - Containers.
Mar 2 12:55:25.916583 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 2 12:55:25.916590 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 2 12:55:25.916597 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 2 12:55:25.916603 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 2 12:55:25.916609 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 2 12:55:25.916615 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 2 12:55:25.916622 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 2 12:55:25.916628 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 2 12:55:25.916634 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 2 12:55:25.916641 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 2 12:55:25.916647 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 2 12:55:25.916654 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 2 12:55:25.916661 kernel: ACPI: bus type drm_connector registered
Mar 2 12:55:25.916667 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 2 12:55:25.916674 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 2 12:55:25.916680 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 2 12:55:25.916687 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 2 12:55:25.916693 kernel: loop: module loaded
Mar 2 12:55:25.916698 kernel: fuse: init (API version 7.41)
Mar 2 12:55:25.916706 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 2 12:55:25.916724 systemd-journald[1402]: Collecting audit messages is disabled.
Mar 2 12:55:25.916739 systemd-journald[1402]: Journal started
Mar 2 12:55:25.916754 systemd-journald[1402]: Runtime Journal (/run/log/journal/7ac54e9826e34c9c8bc0f6d4ced9afc1) is 8M, max 78.3M, 70.3M free.
Mar 2 12:55:25.255042 systemd[1]: Queued start job for default target multi-user.target.
Mar 2 12:55:25.259691 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 2 12:55:25.260071 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 2 12:55:25.260363 systemd[1]: systemd-journald.service: Consumed 2.413s CPU time.
Mar 2 12:55:25.927183 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 2 12:55:25.945531 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 2 12:55:25.959539 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 2 12:55:25.972559 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 2 12:55:25.979882 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 2 12:55:25.979945 systemd[1]: Stopped verity-setup.service.
Mar 2 12:55:25.995885 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 2 12:55:25.998558 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 2 12:55:26.006532 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 2 12:55:26.011131 systemd[1]: Mounted media.mount - External Media Directory.
Mar 2 12:55:26.015709 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 2 12:55:26.020224 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 2 12:55:26.024753 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 2 12:55:26.030176 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 2 12:55:26.036682 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 2 12:55:26.041989 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 2 12:55:26.042134 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 2 12:55:26.047700 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 2 12:55:26.047842 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 2 12:55:26.052528 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 2 12:55:26.052652 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 2 12:55:26.057122 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 2 12:55:26.057282 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 2 12:55:26.063576 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 2 12:55:26.063692 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 2 12:55:26.068659 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 2 12:55:26.068792 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 2 12:55:26.073579 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 2 12:55:26.078808 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 2 12:55:26.084141 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 2 12:55:26.089500 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 2 12:55:26.095381 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 2 12:55:26.109480 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 2 12:55:26.115181 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 2 12:55:26.122599 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 2 12:55:26.127655 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 2 12:55:26.127684 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 2 12:55:26.132655 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 2 12:55:26.139155 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 2 12:55:26.143499 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 2 12:55:26.150318 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 2 12:55:26.168300 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 2 12:55:26.173132 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 2 12:55:26.175238 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 2 12:55:26.179894 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 2 12:55:26.189406 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 2 12:55:26.197441 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 2 12:55:26.205894 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 2 12:55:26.213710 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 2 12:55:26.220294 systemd-journald[1402]: Time spent on flushing to /var/log/journal/7ac54e9826e34c9c8bc0f6d4ced9afc1 is 18.749ms for 933 entries.
Mar 2 12:55:26.220294 systemd-journald[1402]: System Journal (/var/log/journal/7ac54e9826e34c9c8bc0f6d4ced9afc1) is 8M, max 2.6G, 2.6G free.
Mar 2 12:55:26.291346 systemd-journald[1402]: Received client request to flush runtime journal.
Mar 2 12:55:26.291493 kernel: loop0: detected capacity change from 0 to 100632
Mar 2 12:55:26.226514 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 2 12:55:26.237293 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 2 12:55:26.248825 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 2 12:55:26.260108 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 2 12:55:26.271942 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 2 12:55:26.295212 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 2 12:55:26.318277 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 2 12:55:26.321628 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 2 12:55:26.327705 systemd-tmpfiles[1439]: ACLs are not supported, ignoring.
Mar 2 12:55:26.327971 systemd-tmpfiles[1439]: ACLs are not supported, ignoring.
Mar 2 12:55:26.331611 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 2 12:55:26.340413 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 2 12:55:26.394347 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 2 12:55:26.422175 kernel: loop1: detected capacity change from 0 to 27936
Mar 2 12:55:26.435715 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 2 12:55:26.444488 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 2 12:55:26.465018 systemd-tmpfiles[1459]: ACLs are not supported, ignoring.
Mar 2 12:55:26.465032 systemd-tmpfiles[1459]: ACLs are not supported, ignoring.
Mar 2 12:55:26.467445 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 2 12:55:26.546340 kernel: loop2: detected capacity change from 0 to 209336
Mar 2 12:55:26.584209 kernel: loop3: detected capacity change from 0 to 119840
Mar 2 12:55:26.599381 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 2 12:55:26.607310 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 2 12:55:26.635498 systemd-udevd[1466]: Using default interface naming scheme 'v255'.
Mar 2 12:55:26.688178 kernel: loop4: detected capacity change from 0 to 100632
Mar 2 12:55:26.693893 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 2 12:55:26.713389 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 2 12:55:26.725164 kernel: loop5: detected capacity change from 0 to 27936
Mar 2 12:55:26.752217 kernel: loop6: detected capacity change from 0 to 209336
Mar 2 12:55:26.748406 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 2 12:55:26.780175 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 2 12:55:26.783162 kernel: loop7: detected capacity change from 0 to 119840
Mar 2 12:55:26.802636 (sd-merge)[1468]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 2 12:55:26.807006 (sd-merge)[1468]: Merged extensions into '/usr'.
Mar 2 12:55:26.827993 systemd[1]: Reload requested from client PID 1437 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 2 12:55:26.828168 systemd[1]: Reloading...
Mar 2 12:55:26.897698 kernel: mousedev: PS/2 mouse device common for all mice
Mar 2 12:55:26.897771 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#244 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 2 12:55:26.928233 zram_generator::config[1534]: No configuration found.
Mar 2 12:55:27.003232 kernel: hv_vmbus: registering driver hv_balloon
Mar 2 12:55:27.007873 kernel: hv_vmbus: registering driver hyperv_fb
Mar 2 12:55:27.007905 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 2 12:55:27.016128 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 2 12:55:27.016237 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 2 12:55:27.016251 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 2 12:55:27.034962 kernel: Console: switching to colour dummy device 80x25
Mar 2 12:55:27.050850 kernel: Console: switching to colour frame buffer device 128x48
Mar 2 12:55:27.057996 systemd-networkd[1497]: lo: Link UP
Mar 2 12:55:27.060866 systemd-networkd[1497]: lo: Gained carrier
Mar 2 12:55:27.064056 systemd-networkd[1497]: Enumeration completed
Mar 2 12:55:27.067489 systemd-networkd[1497]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 2 12:55:27.067590 systemd-networkd[1497]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 2 12:55:27.079183 kernel: MACsec IEEE 802.1AE
Mar 2 12:55:27.141182 kernel: mlx5_core 1f4f:00:02.0 enP8015s1: Link up
Mar 2 12:55:27.170291 kernel: hv_netvsc 7ced8d79-3c63-7ced-8d79-3c637ced8d79 eth0: Data path switched to VF: enP8015s1
Mar 2 12:55:27.170637 systemd-networkd[1497]: enP8015s1: Link UP
Mar 2 12:55:27.170757 systemd-networkd[1497]: eth0: Link UP
Mar 2 12:55:27.170760 systemd-networkd[1497]: eth0: Gained carrier
Mar 2 12:55:27.170781 systemd-networkd[1497]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 2 12:55:27.181363 systemd-networkd[1497]: enP8015s1: Gained carrier
Mar 2 12:55:27.188223 systemd-networkd[1497]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 2 12:55:27.260607 systemd[1]: Reloading finished in 432 ms.
Mar 2 12:55:27.280278 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 2 12:55:27.285378 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 2 12:55:27.290620 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 2 12:55:27.324538 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 2 12:55:27.343355 systemd[1]: Starting ensure-sysext.service...
Mar 2 12:55:27.347538 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 2 12:55:27.357269 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 2 12:55:27.366010 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 2 12:55:27.380205 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 2 12:55:27.391290 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 2 12:55:27.403441 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 2 12:55:27.404306 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 2 12:55:27.404624 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 2 12:55:27.405125 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 2 12:55:27.405678 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 2 12:55:27.405924 systemd-tmpfiles[1684]: ACLs are not supported, ignoring.
Mar 2 12:55:27.406023 systemd-tmpfiles[1684]: ACLs are not supported, ignoring.
Mar 2 12:55:27.406141 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 2 12:55:27.408845 systemd-tmpfiles[1684]: Detected autofs mount point /boot during canonicalization of boot.
Mar 2 12:55:27.408948 systemd-tmpfiles[1684]: Skipping /boot
Mar 2 12:55:27.413912 systemd-tmpfiles[1684]: Detected autofs mount point /boot during canonicalization of boot.
Mar 2 12:55:27.414005 systemd-tmpfiles[1684]: Skipping /boot
Mar 2 12:55:27.418396 systemd[1]: Reload requested from client PID 1679 ('systemctl') (unit ensure-sysext.service)...
Mar 2 12:55:27.418412 systemd[1]: Reloading...
Mar 2 12:55:27.485210 zram_generator::config[1716]: No configuration found.
Mar 2 12:55:27.642755 systemd[1]: Reloading finished in 224 ms.
Mar 2 12:55:27.661938 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 2 12:55:27.667744 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 2 12:55:27.673825 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 2 12:55:27.685275 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 2 12:55:27.694906 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 2 12:55:27.703435 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 2 12:55:27.721575 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 2 12:55:27.727950 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 2 12:55:27.735517 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 2 12:55:27.741335 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 2 12:55:27.748411 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 2 12:55:27.756401 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 2 12:55:27.760670 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 2 12:55:27.760762 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 2 12:55:27.765738 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 2 12:55:27.765867 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 2 12:55:27.765919 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 2 12:55:27.770318 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 2 12:55:27.776353 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 2 12:55:27.776499 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 2 12:55:27.782519 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 2 12:55:27.782645 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 2 12:55:27.789021 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Mar 2 12:55:27.789176 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 2 12:55:27.799096 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 2 12:55:27.801113 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 2 12:55:27.811354 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 2 12:55:27.821931 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 2 12:55:27.828039 augenrules[1815]: No rules Mar 2 12:55:27.830359 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 2 12:55:27.836748 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 2 12:55:27.836867 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 2 12:55:27.836972 systemd[1]: Reached target time-set.target - System Time Set. Mar 2 12:55:27.845113 systemd[1]: audit-rules.service: Deactivated successfully. Mar 2 12:55:27.845319 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 2 12:55:27.850388 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 2 12:55:27.850812 systemd-resolved[1790]: Positive Trust Anchors: Mar 2 12:55:27.851226 systemd-resolved[1790]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 2 12:55:27.851321 systemd-resolved[1790]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 2 12:55:27.856143 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 2 12:55:27.856294 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 2 12:55:27.861579 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 2 12:55:27.861709 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 2 12:55:27.863407 systemd-resolved[1790]: Using system hostname 'ci-4459.2.101-5c781fe851'. Mar 2 12:55:27.866643 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 2 12:55:27.872725 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 2 12:55:27.872867 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 2 12:55:27.880593 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 2 12:55:27.880750 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 2 12:55:27.888283 systemd[1]: Finished ensure-sysext.service. Mar 2 12:55:27.895943 systemd[1]: Reached target network.target - Network. Mar 2 12:55:27.899644 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Mar 2 12:55:27.904629 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 2 12:55:27.904687 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 2 12:55:27.938311 ldconfig[1432]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 2 12:55:27.952247 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 2 12:55:27.960290 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 2 12:55:27.979772 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 2 12:55:27.999448 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 2 12:55:28.005292 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 2 12:55:28.005340 systemd[1]: Reached target sysinit.target - System Initialization. Mar 2 12:55:28.009529 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 2 12:55:28.014395 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 2 12:55:28.019907 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 2 12:55:28.024107 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 2 12:55:28.029017 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 2 12:55:28.034597 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). 
Mar 2 12:55:28.034623 systemd[1]: Reached target paths.target - Path Units. Mar 2 12:55:28.038578 systemd[1]: Reached target timers.target - Timer Units. Mar 2 12:55:28.043792 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 2 12:55:28.050185 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 2 12:55:28.055346 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 2 12:55:28.060492 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 2 12:55:28.068255 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 2 12:55:28.083788 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 2 12:55:28.088632 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 2 12:55:28.094011 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 2 12:55:28.098789 systemd[1]: Reached target sockets.target - Socket Units. Mar 2 12:55:28.102627 systemd[1]: Reached target basic.target - Basic System. Mar 2 12:55:28.106306 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 2 12:55:28.106329 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 2 12:55:28.108545 systemd[1]: Starting chronyd.service - NTP client/server... Mar 2 12:55:28.120261 systemd[1]: Starting containerd.service - containerd container runtime... Mar 2 12:55:28.133779 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 2 12:55:28.142631 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 2 12:55:28.156361 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Mar 2 12:55:28.162823 chronyd[1835]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Mar 2 12:55:28.163887 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 2 12:55:28.169931 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 2 12:55:28.172285 jq[1843]: false Mar 2 12:55:28.173480 chronyd[1835]: Timezone right/UTC failed leap second check, ignoring Mar 2 12:55:28.174575 chronyd[1835]: Loaded seccomp filter (level 2) Mar 2 12:55:28.177442 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 2 12:55:28.189294 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Mar 2 12:55:28.194623 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Mar 2 12:55:28.202605 KVP[1845]: KVP starting; pid is:1845 Mar 2 12:55:28.204081 extend-filesystems[1844]: Found /dev/sda6 Mar 2 12:55:28.204447 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 2 12:55:28.214173 KVP[1845]: KVP LIC Version: 3.1 Mar 2 12:55:28.215194 kernel: hv_utils: KVP IC version 4.0 Mar 2 12:55:28.217141 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 2 12:55:28.223181 extend-filesystems[1844]: Found /dev/sda9 Mar 2 12:55:28.228396 extend-filesystems[1844]: Checking size of /dev/sda9 Mar 2 12:55:28.227470 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 2 12:55:28.240013 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 2 12:55:28.257444 extend-filesystems[1844]: Old size kept for /dev/sda9 Mar 2 12:55:28.250992 systemd[1]: Starting systemd-logind.service - User Login Management... 
Mar 2 12:55:28.257839 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 2 12:55:28.258372 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 2 12:55:28.263388 systemd[1]: Starting update-engine.service - Update Engine... Mar 2 12:55:28.276299 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 2 12:55:28.282828 systemd[1]: Started chronyd.service - NTP client/server. Mar 2 12:55:28.290733 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 2 12:55:28.298201 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 2 12:55:28.300835 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 2 12:55:28.301112 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 2 12:55:28.313187 update_engine[1863]: I20260302 12:55:28.313100 1863 main.cc:92] Flatcar Update Engine starting Mar 2 12:55:28.313758 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 2 12:55:28.321017 systemd[1]: motdgen.service: Deactivated successfully. Mar 2 12:55:28.321225 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 2 12:55:28.328731 jq[1872]: true Mar 2 12:55:28.329317 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 2 12:55:28.329562 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 2 12:55:28.359984 jq[1884]: true Mar 2 12:55:28.363937 systemd-logind[1862]: New seat seat0. Mar 2 12:55:28.366926 systemd-logind[1862]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 2 12:55:28.367084 systemd[1]: Started systemd-logind.service - User Login Management. 
Mar 2 12:55:28.372498 (ntainerd)[1885]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 2 12:55:28.408305 dbus-daemon[1838]: [system] SELinux support is enabled Mar 2 12:55:28.408499 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 2 12:55:28.418516 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 2 12:55:28.420905 dbus-daemon[1838]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 2 12:55:28.418548 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 2 12:55:28.425819 update_engine[1863]: I20260302 12:55:28.425757 1863 update_check_scheduler.cc:74] Next update check in 2m50s Mar 2 12:55:28.426494 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 2 12:55:28.426519 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 2 12:55:28.433377 systemd[1]: Started update-engine.service - Update Engine. Mar 2 12:55:28.457595 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 2 12:55:28.473495 tar[1882]: linux-arm64/LICENSE Mar 2 12:55:28.473495 tar[1882]: linux-arm64/helm Mar 2 12:55:28.481223 coreos-metadata[1837]: Mar 02 12:55:28.479 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 2 12:55:28.488276 bash[1931]: Updated "/home/core/.ssh/authorized_keys" Mar 2 12:55:28.483248 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Mar 2 12:55:28.488481 coreos-metadata[1837]: Mar 02 12:55:28.483 INFO Fetch successful Mar 2 12:55:28.488481 coreos-metadata[1837]: Mar 02 12:55:28.483 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Mar 2 12:55:28.500288 coreos-metadata[1837]: Mar 02 12:55:28.496 INFO Fetch successful Mar 2 12:55:28.500288 coreos-metadata[1837]: Mar 02 12:55:28.496 INFO Fetching http://168.63.129.16/machine/0ba6a6a3-ff15-4b21-9045-9027fb04cf08/611bf47a%2D97ec%2D4bb0%2Dbb45%2D89014d74d762.%5Fci%2D4459.2.101%2D5c781fe851?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Mar 2 12:55:28.502506 coreos-metadata[1837]: Mar 02 12:55:28.502 INFO Fetch successful Mar 2 12:55:28.506499 coreos-metadata[1837]: Mar 02 12:55:28.506 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Mar 2 12:55:28.515827 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 2 12:55:28.524831 coreos-metadata[1837]: Mar 02 12:55:28.523 INFO Fetch successful Mar 2 12:55:28.587859 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 2 12:55:28.594722 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 2 12:55:28.633253 systemd-networkd[1497]: eth0: Gained IPv6LL Mar 2 12:55:28.637964 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 2 12:55:28.644543 locksmithd[1941]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 2 12:55:28.646814 systemd[1]: Reached target network-online.target - Network is Online. Mar 2 12:55:28.655733 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:55:28.664048 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Mar 2 12:55:28.701180 containerd[1885]: time="2026-03-02T12:55:28Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 2 12:55:28.703449 containerd[1885]: time="2026-03-02T12:55:28.703054108Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 2 12:55:28.723343 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 2 12:55:28.732675 containerd[1885]: time="2026-03-02T12:55:28.730566860Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.328µs" Mar 2 12:55:28.733868 sshd_keygen[1878]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 2 12:55:28.734104 containerd[1885]: time="2026-03-02T12:55:28.733128596Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 2 12:55:28.734185 containerd[1885]: time="2026-03-02T12:55:28.734169732Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 2 12:55:28.734394 containerd[1885]: time="2026-03-02T12:55:28.734375132Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 2 12:55:28.734845 containerd[1885]: time="2026-03-02T12:55:28.734800172Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 2 12:55:28.734929 containerd[1885]: time="2026-03-02T12:55:28.734915164Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 2 12:55:28.735305 containerd[1885]: time="2026-03-02T12:55:28.735282428Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 2 12:55:28.736687 
containerd[1885]: time="2026-03-02T12:55:28.736137700Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 2 12:55:28.736848 containerd[1885]: time="2026-03-02T12:55:28.736825212Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 2 12:55:28.737106 containerd[1885]: time="2026-03-02T12:55:28.737085180Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 2 12:55:28.737201 containerd[1885]: time="2026-03-02T12:55:28.737185436Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 2 12:55:28.737246 containerd[1885]: time="2026-03-02T12:55:28.737234820Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 2 12:55:28.737491 containerd[1885]: time="2026-03-02T12:55:28.737469908Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 2 12:55:28.738067 containerd[1885]: time="2026-03-02T12:55:28.738047132Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 2 12:55:28.738225 containerd[1885]: time="2026-03-02T12:55:28.738197916Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 2 12:55:28.738690 containerd[1885]: time="2026-03-02T12:55:28.738662676Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 2 12:55:28.738791 containerd[1885]: 
time="2026-03-02T12:55:28.738778140Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 2 12:55:28.739007 containerd[1885]: time="2026-03-02T12:55:28.738989692Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 2 12:55:28.739497 containerd[1885]: time="2026-03-02T12:55:28.739476324Z" level=info msg="metadata content store policy set" policy=shared Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759668820Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759749348Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759760580Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759780668Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759789148Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759795924Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759805260Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759812708Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759820620Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759830836Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759837460Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759846652Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.759992476Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 2 12:55:28.761492 containerd[1885]: time="2026-03-02T12:55:28.760010164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760021164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760029052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760037132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760043988Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760051116Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760057532Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760064556Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760071060Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760077260Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760121964Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760133644Z" level=info msg="Start snapshots syncer" Mar 2 12:55:28.761809 containerd[1885]: time="2026-03-02T12:55:28.760175388Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 2 12:55:28.761951 containerd[1885]: time="2026-03-02T12:55:28.760383556Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 2 12:55:28.761951 containerd[1885]: time="2026-03-02T12:55:28.760421540Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760461788Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760557492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760573076Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760579852Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760588404Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760595972Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760602388Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760609372Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760626476Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760634252Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760641476Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760670700Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760687548Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 2 12:55:28.762027 containerd[1885]: time="2026-03-02T12:55:28.760693092Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760699220Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760703924Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760709700Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760716020Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760729828Z" level=info msg="runtime interface created" Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760733180Z" level=info msg="created NRI interface" Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760741436Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760750572Z" level=info msg="Connect containerd service" Mar 2 12:55:28.762235 containerd[1885]: time="2026-03-02T12:55:28.760764572Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 2 12:55:28.767172 containerd[1885]: 
time="2026-03-02T12:55:28.766591292Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 2 12:55:28.769970 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 2 12:55:28.786729 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 2 12:55:28.798410 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 2 12:55:28.829658 systemd[1]: issuegen.service: Deactivated successfully. Mar 2 12:55:28.831321 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 2 12:55:28.837784 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 2 12:55:28.847898 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 2 12:55:28.852093 tar[1882]: linux-arm64/README.md Mar 2 12:55:28.856681 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 2 12:55:28.866394 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 2 12:55:28.875843 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 2 12:55:28.884960 systemd[1]: Reached target getty.target - Login Prompts. Mar 2 12:55:28.893891 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 2 12:55:28.931061 containerd[1885]: time="2026-03-02T12:55:28.930995644Z" level=info msg="Start subscribing containerd event" Mar 2 12:55:28.931315 containerd[1885]: time="2026-03-02T12:55:28.931185532Z" level=info msg="Start recovering state" Mar 2 12:55:28.931429 containerd[1885]: time="2026-03-02T12:55:28.931415900Z" level=info msg="Start event monitor" Mar 2 12:55:28.931511 containerd[1885]: time="2026-03-02T12:55:28.931484260Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931492812Z" level=info msg="Start cni network conf syncer for default" Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931556788Z" level=info msg="Start streaming server" Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931566428Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931573700Z" level=info msg="runtime interface starting up..." Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931545892Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931577844Z" level=info msg="starting plugins..." Mar 2 12:55:28.931633 containerd[1885]: time="2026-03-02T12:55:28.931609396Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 2 12:55:28.931844 containerd[1885]: time="2026-03-02T12:55:28.931706876Z" level=info msg="containerd successfully booted in 0.231716s" Mar 2 12:55:28.932302 systemd[1]: Started containerd.service - containerd container runtime. Mar 2 12:55:29.363483 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:55:29.368374 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 2 12:55:29.374041 systemd[1]: Startup finished in 1.649s (kernel) + 8.334s (initrd) + 5.189s (userspace) = 15.172s. Mar 2 12:55:29.377548 (kubelet)[2029]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:55:29.516422 login[2017]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:29.519054 login[2018]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:29.529661 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Mar 2 12:55:29.531975 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 2 12:55:29.535446 systemd-logind[1862]: New session 1 of user core. Mar 2 12:55:29.540105 systemd-logind[1862]: New session 2 of user core. Mar 2 12:55:29.559670 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 2 12:55:29.564533 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 2 12:55:29.585494 (systemd)[2042]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 2 12:55:29.589342 systemd-logind[1862]: New session c1 of user core. Mar 2 12:55:29.591302 waagent[2013]: 2026-03-02T12:55:29.591242Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Mar 2 12:55:29.601600 waagent[2013]: 2026-03-02T12:55:29.601546Z INFO Daemon Daemon OS: flatcar 4459.2.101 Mar 2 12:55:29.605257 waagent[2013]: 2026-03-02T12:55:29.605212Z INFO Daemon Daemon Python: 3.11.13 Mar 2 12:55:29.608681 waagent[2013]: 2026-03-02T12:55:29.608636Z INFO Daemon Daemon Run daemon Mar 2 12:55:29.618322 waagent[2013]: 2026-03-02T12:55:29.612008Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.101' Mar 2 12:55:29.620472 waagent[2013]: 2026-03-02T12:55:29.620422Z INFO Daemon Daemon Using waagent for provisioning Mar 2 12:55:29.624544 waagent[2013]: 2026-03-02T12:55:29.624505Z INFO Daemon Daemon Activate resource disk Mar 2 12:55:29.628906 waagent[2013]: 2026-03-02T12:55:29.628021Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 2 12:55:29.638543 waagent[2013]: 2026-03-02T12:55:29.638239Z INFO Daemon Daemon Found device: None Mar 2 12:55:29.643488 waagent[2013]: 2026-03-02T12:55:29.643444Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 2 12:55:29.651306 waagent[2013]: 2026-03-02T12:55:29.651240Z ERROR Daemon Daemon Event: 
name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 2 12:55:29.661184 waagent[2013]: 2026-03-02T12:55:29.660245Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 2 12:55:29.665552 waagent[2013]: 2026-03-02T12:55:29.665172Z INFO Daemon Daemon Running default provisioning handler Mar 2 12:55:29.675108 waagent[2013]: 2026-03-02T12:55:29.675060Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Mar 2 12:55:29.693372 waagent[2013]: 2026-03-02T12:55:29.685957Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 2 12:55:29.693995 waagent[2013]: 2026-03-02T12:55:29.693490Z INFO Daemon Daemon cloud-init is enabled: False Mar 2 12:55:29.697471 waagent[2013]: 2026-03-02T12:55:29.697435Z INFO Daemon Daemon Copying ovf-env.xml Mar 2 12:55:29.729734 waagent[2013]: 2026-03-02T12:55:29.728212Z INFO Daemon Daemon Successfully mounted dvd Mar 2 12:55:29.744772 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 2 12:55:29.747848 waagent[2013]: 2026-03-02T12:55:29.747789Z INFO Daemon Daemon Detect protocol endpoint Mar 2 12:55:29.755480 waagent[2013]: 2026-03-02T12:55:29.751423Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 2 12:55:29.755731 waagent[2013]: 2026-03-02T12:55:29.755698Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Mar 2 12:55:29.763156 waagent[2013]: 2026-03-02T12:55:29.761910Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 2 12:55:29.765844 waagent[2013]: 2026-03-02T12:55:29.765808Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 2 12:55:29.769482 waagent[2013]: 2026-03-02T12:55:29.769445Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 2 12:55:29.794635 waagent[2013]: 2026-03-02T12:55:29.794590Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 2 12:55:29.800183 waagent[2013]: 2026-03-02T12:55:29.800136Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 2 12:55:29.805077 waagent[2013]: 2026-03-02T12:55:29.805023Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 2 12:55:29.806298 systemd[2042]: Queued start job for default target default.target. Mar 2 12:55:29.817331 systemd[2042]: Created slice app.slice - User Application Slice. Mar 2 12:55:29.817487 systemd[2042]: Reached target paths.target - Paths. Mar 2 12:55:29.817571 systemd[2042]: Reached target timers.target - Timers. Mar 2 12:55:29.818955 systemd[2042]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 2 12:55:29.838396 systemd[2042]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 2 12:55:29.838496 systemd[2042]: Reached target sockets.target - Sockets. Mar 2 12:55:29.838536 systemd[2042]: Reached target basic.target - Basic System. Mar 2 12:55:29.838561 systemd[2042]: Reached target default.target - Main User Target. Mar 2 12:55:29.838583 systemd[2042]: Startup finished in 239ms. Mar 2 12:55:29.838617 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 2 12:55:29.843314 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 2 12:55:29.843931 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 2 12:55:29.863365 waagent[2013]: 2026-03-02T12:55:29.863287Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 2 12:55:29.871294 waagent[2013]: 2026-03-02T12:55:29.871076Z INFO Daemon Daemon Forcing an update of the goal state. Mar 2 12:55:29.883164 waagent[2013]: 2026-03-02T12:55:29.883040Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 2 12:55:29.903829 waagent[2013]: 2026-03-02T12:55:29.903520Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 2 12:55:29.908904 waagent[2013]: 2026-03-02T12:55:29.908855Z INFO Daemon Mar 2 12:55:29.913314 waagent[2013]: 2026-03-02T12:55:29.912221Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 867aa659-dfe4-4dc8-a950-347a11e67dc2 eTag: 7531962740110987454 source: Fabric] Mar 2 12:55:29.922501 waagent[2013]: 2026-03-02T12:55:29.922328Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Mar 2 12:55:29.929172 waagent[2013]: 2026-03-02T12:55:29.927853Z INFO Daemon Mar 2 12:55:29.936177 waagent[2013]: 2026-03-02T12:55:29.931355Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 2 12:55:29.942826 waagent[2013]: 2026-03-02T12:55:29.942782Z INFO Daemon Daemon Downloading artifacts profile blob Mar 2 12:55:29.977291 kubelet[2029]: E0302 12:55:29.977240 2029 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:55:29.979586 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:55:29.979697 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:55:29.981227 systemd[1]: kubelet.service: Consumed 540ms CPU time, 261M memory peak. 
Mar 2 12:55:30.066633 waagent[2013]: 2026-03-02T12:55:30.066557Z INFO Daemon Downloaded certificate {'thumbprint': '6FCC00A88F0A065F8DF3B2248D9D3F6D8D43B40F', 'hasPrivateKey': True} Mar 2 12:55:30.074150 waagent[2013]: 2026-03-02T12:55:30.074104Z INFO Daemon Fetch goal state completed Mar 2 12:55:30.112015 waagent[2013]: 2026-03-02T12:55:30.111971Z INFO Daemon Daemon Starting provisioning Mar 2 12:55:30.116095 waagent[2013]: 2026-03-02T12:55:30.116056Z INFO Daemon Daemon Handle ovf-env.xml. Mar 2 12:55:30.120034 waagent[2013]: 2026-03-02T12:55:30.120004Z INFO Daemon Daemon Set hostname [ci-4459.2.101-5c781fe851] Mar 2 12:55:30.130122 waagent[2013]: 2026-03-02T12:55:30.130074Z INFO Daemon Daemon Publish hostname [ci-4459.2.101-5c781fe851] Mar 2 12:55:30.135612 waagent[2013]: 2026-03-02T12:55:30.135567Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 2 12:55:30.140150 waagent[2013]: 2026-03-02T12:55:30.140112Z INFO Daemon Daemon Primary interface is [eth0] Mar 2 12:55:30.149885 systemd-networkd[1497]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:55:30.149894 systemd-networkd[1497]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 2 12:55:30.149924 systemd-networkd[1497]: eth0: DHCP lease lost Mar 2 12:55:30.150824 waagent[2013]: 2026-03-02T12:55:30.150762Z INFO Daemon Daemon Create user account if not exists Mar 2 12:55:30.154890 waagent[2013]: 2026-03-02T12:55:30.154851Z INFO Daemon Daemon User core already exists, skip useradd Mar 2 12:55:30.159007 waagent[2013]: 2026-03-02T12:55:30.158971Z INFO Daemon Daemon Configure sudoer Mar 2 12:55:30.168778 waagent[2013]: 2026-03-02T12:55:30.168723Z INFO Daemon Daemon Configure sshd Mar 2 12:55:30.176659 waagent[2013]: 2026-03-02T12:55:30.176609Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. 
It also configures SSH client probing to keep connections alive. Mar 2 12:55:30.186123 waagent[2013]: 2026-03-02T12:55:30.186082Z INFO Daemon Daemon Deploy ssh public key. Mar 2 12:55:30.186208 systemd-networkd[1497]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 2 12:55:31.297003 waagent[2013]: 2026-03-02T12:55:31.296952Z INFO Daemon Daemon Provisioning complete Mar 2 12:55:31.311694 waagent[2013]: 2026-03-02T12:55:31.311647Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 2 12:55:31.316465 waagent[2013]: 2026-03-02T12:55:31.316427Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Mar 2 12:55:31.323746 waagent[2013]: 2026-03-02T12:55:31.323712Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Mar 2 12:55:31.426204 waagent[2094]: 2026-03-02T12:55:31.425519Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Mar 2 12:55:31.426204 waagent[2094]: 2026-03-02T12:55:31.425665Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.101 Mar 2 12:55:31.426204 waagent[2094]: 2026-03-02T12:55:31.425702Z INFO ExtHandler ExtHandler Python: 3.11.13 Mar 2 12:55:31.426204 waagent[2094]: 2026-03-02T12:55:31.425737Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Mar 2 12:55:31.443502 waagent[2094]: 2026-03-02T12:55:31.443432Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.101; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Mar 2 12:55:31.443651 waagent[2094]: 2026-03-02T12:55:31.443621Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 2 12:55:31.443692 waagent[2094]: 2026-03-02T12:55:31.443675Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 2 12:55:31.453821 waagent[2094]: 2026-03-02T12:55:31.453754Z INFO ExtHandler Fetched a new incarnation for the 
WireServer goal state [incarnation 1] Mar 2 12:55:31.461275 waagent[2094]: 2026-03-02T12:55:31.461238Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 2 12:55:31.461707 waagent[2094]: 2026-03-02T12:55:31.461671Z INFO ExtHandler Mar 2 12:55:31.461760 waagent[2094]: 2026-03-02T12:55:31.461741Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 2b3962df-7686-4bc2-ad39-1bf16dba773e eTag: 7531962740110987454 source: Fabric] Mar 2 12:55:31.461982 waagent[2094]: 2026-03-02T12:55:31.461956Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 2 12:55:31.462416 waagent[2094]: 2026-03-02T12:55:31.462385Z INFO ExtHandler Mar 2 12:55:31.462455 waagent[2094]: 2026-03-02T12:55:31.462439Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 2 12:55:31.465988 waagent[2094]: 2026-03-02T12:55:31.465958Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 2 12:55:31.523521 waagent[2094]: 2026-03-02T12:55:31.523438Z INFO ExtHandler Downloaded certificate {'thumbprint': '6FCC00A88F0A065F8DF3B2248D9D3F6D8D43B40F', 'hasPrivateKey': True} Mar 2 12:55:31.523916 waagent[2094]: 2026-03-02T12:55:31.523879Z INFO ExtHandler Fetch goal state completed Mar 2 12:55:31.537251 waagent[2094]: 2026-03-02T12:55:31.537199Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.4 27 Jan 2026 (Library: OpenSSL 3.4.4 27 Jan 2026) Mar 2 12:55:31.540685 waagent[2094]: 2026-03-02T12:55:31.540634Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2094 Mar 2 12:55:31.540803 waagent[2094]: 2026-03-02T12:55:31.540766Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 2 12:55:31.541049 waagent[2094]: 2026-03-02T12:55:31.541021Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Mar 2 12:55:31.542258 waagent[2094]: 2026-03-02T12:55:31.542219Z INFO ExtHandler 
ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.101', '', 'Flatcar Container Linux by Kinvolk'] Mar 2 12:55:31.542586 waagent[2094]: 2026-03-02T12:55:31.542553Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.101', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 2 12:55:31.542710 waagent[2094]: 2026-03-02T12:55:31.542685Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 2 12:55:31.543141 waagent[2094]: 2026-03-02T12:55:31.543110Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 2 12:55:31.558531 waagent[2094]: 2026-03-02T12:55:31.558445Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 2 12:55:31.558645 waagent[2094]: 2026-03-02T12:55:31.558616Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 2 12:55:31.563216 waagent[2094]: 2026-03-02T12:55:31.563115Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 2 12:55:31.567631 systemd[1]: Reload requested from client PID 2109 ('systemctl') (unit waagent.service)... Mar 2 12:55:31.567847 systemd[1]: Reloading... Mar 2 12:55:31.638168 zram_generator::config[2151]: No configuration found. Mar 2 12:55:31.789224 systemd[1]: Reloading finished in 221 ms. Mar 2 12:55:31.815464 waagent[2094]: 2026-03-02T12:55:31.815343Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 2 12:55:31.815544 waagent[2094]: 2026-03-02T12:55:31.815486Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 2 12:55:31.891858 waagent[2094]: 2026-03-02T12:55:31.891777Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Mar 2 12:55:31.892128 waagent[2094]: 2026-03-02T12:55:31.892095Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 2 12:55:31.892783 waagent[2094]: 2026-03-02T12:55:31.892741Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 2 12:55:31.893040 waagent[2094]: 2026-03-02T12:55:31.893005Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 2 12:55:31.893817 waagent[2094]: 2026-03-02T12:55:31.893244Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 2 12:55:31.893817 waagent[2094]: 2026-03-02T12:55:31.893316Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 2 12:55:31.893817 waagent[2094]: 2026-03-02T12:55:31.893469Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 2 12:55:31.893817 waagent[2094]: 2026-03-02T12:55:31.893599Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 2 12:55:31.893817 waagent[2094]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 2 12:55:31.893817 waagent[2094]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 2 12:55:31.893817 waagent[2094]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 2 12:55:31.893817 waagent[2094]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 2 12:55:31.893817 waagent[2094]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 2 12:55:31.893817 waagent[2094]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 2 12:55:31.894093 waagent[2094]: 2026-03-02T12:55:31.894056Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 2 12:55:31.894144 waagent[2094]: 2026-03-02T12:55:31.894101Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Mar 2 12:55:31.894607 waagent[2094]: 2026-03-02T12:55:31.894571Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 2 12:55:31.894647 waagent[2094]: 2026-03-02T12:55:31.894609Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 2 12:55:31.895011 waagent[2094]: 2026-03-02T12:55:31.894984Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 2 12:55:31.895192 waagent[2094]: 2026-03-02T12:55:31.895135Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 2 12:55:31.895320 waagent[2094]: 2026-03-02T12:55:31.895296Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 2 12:55:31.897412 waagent[2094]: 2026-03-02T12:55:31.897375Z INFO EnvHandler ExtHandler Configure routes Mar 2 12:55:31.897587 waagent[2094]: 2026-03-02T12:55:31.897569Z INFO EnvHandler ExtHandler Gateway:None Mar 2 12:55:31.897677 waagent[2094]: 2026-03-02T12:55:31.897660Z INFO EnvHandler ExtHandler Routes:None Mar 2 12:55:31.903182 waagent[2094]: 2026-03-02T12:55:31.902862Z INFO ExtHandler ExtHandler Mar 2 12:55:31.903182 waagent[2094]: 2026-03-02T12:55:31.902923Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 8560090c-f70f-4a31-9e7f-5fecf6ac13b7 correlation 152ae919-754b-49f2-a7a1-8465b4b61c6d created: 2026-03-02T12:54:54.501785Z] Mar 2 12:55:31.903362 waagent[2094]: 2026-03-02T12:55:31.903325Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 2 12:55:31.905018 waagent[2094]: 2026-03-02T12:55:31.903841Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Mar 2 12:55:31.913696 waagent[2094]: 2026-03-02T12:55:31.913648Z INFO MonitorHandler ExtHandler Network interfaces: Mar 2 12:55:31.913696 waagent[2094]: Executing ['ip', '-a', '-o', 'link']: Mar 2 12:55:31.913696 waagent[2094]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 2 12:55:31.913696 waagent[2094]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:79:3c:63 brd ff:ff:ff:ff:ff:ff Mar 2 12:55:31.913696 waagent[2094]: 3: enP8015s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:79:3c:63 brd ff:ff:ff:ff:ff:ff\ altname enP8015p0s2 Mar 2 12:55:31.913696 waagent[2094]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 2 12:55:31.913696 waagent[2094]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 2 12:55:31.913696 waagent[2094]: 2: eth0 inet 10.200.20.16/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 2 12:55:31.913696 waagent[2094]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 2 12:55:31.913696 waagent[2094]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 2 12:55:31.913696 waagent[2094]: 2: eth0 inet6 fe80::7eed:8dff:fe79:3c63/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 2 12:55:31.941267 waagent[2094]: 2026-03-02T12:55:31.941220Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Mar 2 12:55:31.941267 waagent[2094]: Try `iptables -h' or 'iptables --help' for more information.) 
Mar 2 12:55:31.941753 waagent[2094]: 2026-03-02T12:55:31.941719Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 35BE9D2B-5B0F-42AF-ACF1-2660E405B321;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Mar 2 12:55:32.329903 waagent[2094]: 2026-03-02T12:55:32.329123Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 2 12:55:32.329903 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.329903 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.329903 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.329903 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.329903 waagent[2094]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.329903 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.329903 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 2 12:55:32.329903 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 2 12:55:32.329903 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 2 12:55:32.331626 waagent[2094]: 2026-03-02T12:55:32.331589Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 2 12:55:32.331626 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.331626 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.331626 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.331626 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.331626 waagent[2094]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 2 12:55:32.331626 waagent[2094]: pkts bytes target prot opt in out source destination Mar 2 12:55:32.331626 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 
168.63.129.16 tcp dpt:53 Mar 2 12:55:32.331626 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 2 12:55:32.331626 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 2 12:55:32.332034 waagent[2094]: 2026-03-02T12:55:32.332010Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 2 12:55:40.230455 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 2 12:55:40.231733 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:55:40.349343 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:55:40.353406 (kubelet)[2244]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:55:40.457693 kubelet[2244]: E0302 12:55:40.457627 2244 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:55:40.460831 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:55:40.461072 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:55:40.461565 systemd[1]: kubelet.service: Consumed 113ms CPU time, 107M memory peak. Mar 2 12:55:40.505598 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 2 12:55:40.506904 systemd[1]: Started sshd@0-10.200.20.16:22-10.200.16.10:40980.service - OpenSSH per-connection server daemon (10.200.16.10:40980). 
Mar 2 12:55:41.254259 sshd[2252]: Accepted publickey for core from 10.200.16.10 port 40980 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:41.255344 sshd-session[2252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:41.258891 systemd-logind[1862]: New session 3 of user core. Mar 2 12:55:41.277319 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 2 12:55:41.578002 systemd[1]: Started sshd@1-10.200.20.16:22-10.200.16.10:40982.service - OpenSSH per-connection server daemon (10.200.16.10:40982). Mar 2 12:55:42.000263 sshd[2258]: Accepted publickey for core from 10.200.16.10 port 40982 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:42.000957 sshd-session[2258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:42.004834 systemd-logind[1862]: New session 4 of user core. Mar 2 12:55:42.010292 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 2 12:55:42.235672 sshd[2261]: Connection closed by 10.200.16.10 port 40982 Mar 2 12:55:42.236276 sshd-session[2258]: pam_unix(sshd:session): session closed for user core Mar 2 12:55:42.239432 systemd[1]: sshd@1-10.200.20.16:22-10.200.16.10:40982.service: Deactivated successfully. Mar 2 12:55:42.240896 systemd[1]: session-4.scope: Deactivated successfully. Mar 2 12:55:42.241531 systemd-logind[1862]: Session 4 logged out. Waiting for processes to exit. Mar 2 12:55:42.242703 systemd-logind[1862]: Removed session 4. Mar 2 12:55:42.327947 systemd[1]: Started sshd@2-10.200.20.16:22-10.200.16.10:40996.service - OpenSSH per-connection server daemon (10.200.16.10:40996). 
Mar 2 12:55:42.748026 sshd[2267]: Accepted publickey for core from 10.200.16.10 port 40996 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:42.749271 sshd-session[2267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:42.752838 systemd-logind[1862]: New session 5 of user core. Mar 2 12:55:42.761303 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 2 12:55:42.979080 sshd[2270]: Connection closed by 10.200.16.10 port 40996 Mar 2 12:55:42.979667 sshd-session[2267]: pam_unix(sshd:session): session closed for user core Mar 2 12:55:42.983390 systemd[1]: sshd@2-10.200.20.16:22-10.200.16.10:40996.service: Deactivated successfully. Mar 2 12:55:42.985134 systemd[1]: session-5.scope: Deactivated successfully. Mar 2 12:55:42.985995 systemd-logind[1862]: Session 5 logged out. Waiting for processes to exit. Mar 2 12:55:42.987639 systemd-logind[1862]: Removed session 5. Mar 2 12:55:43.069398 systemd[1]: Started sshd@3-10.200.20.16:22-10.200.16.10:41006.service - OpenSSH per-connection server daemon (10.200.16.10:41006). Mar 2 12:55:43.503325 sshd[2276]: Accepted publickey for core from 10.200.16.10 port 41006 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:43.504237 sshd-session[2276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:43.507839 systemd-logind[1862]: New session 6 of user core. Mar 2 12:55:43.513278 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 2 12:55:43.739511 sshd[2279]: Connection closed by 10.200.16.10 port 41006 Mar 2 12:55:43.740371 sshd-session[2276]: pam_unix(sshd:session): session closed for user core Mar 2 12:55:43.744400 systemd-logind[1862]: Session 6 logged out. Waiting for processes to exit. Mar 2 12:55:43.744664 systemd[1]: sshd@3-10.200.20.16:22-10.200.16.10:41006.service: Deactivated successfully. Mar 2 12:55:43.745987 systemd[1]: session-6.scope: Deactivated successfully. 
Mar 2 12:55:43.748406 systemd-logind[1862]: Removed session 6. Mar 2 12:55:43.826955 systemd[1]: Started sshd@4-10.200.20.16:22-10.200.16.10:41014.service - OpenSSH per-connection server daemon (10.200.16.10:41014). Mar 2 12:55:44.247685 sshd[2285]: Accepted publickey for core from 10.200.16.10 port 41014 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:44.248797 sshd-session[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:44.252388 systemd-logind[1862]: New session 7 of user core. Mar 2 12:55:44.262504 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 2 12:55:44.431498 sudo[2289]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 2 12:55:44.431706 sudo[2289]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 2 12:55:44.442868 sudo[2289]: pam_unix(sudo:session): session closed for user root Mar 2 12:55:44.520525 sshd[2288]: Connection closed by 10.200.16.10 port 41014 Mar 2 12:55:44.521204 sshd-session[2285]: pam_unix(sshd:session): session closed for user core Mar 2 12:55:44.525094 systemd[1]: sshd@4-10.200.20.16:22-10.200.16.10:41014.service: Deactivated successfully. Mar 2 12:55:44.526749 systemd[1]: session-7.scope: Deactivated successfully. Mar 2 12:55:44.527661 systemd-logind[1862]: Session 7 logged out. Waiting for processes to exit. Mar 2 12:55:44.528958 systemd-logind[1862]: Removed session 7. Mar 2 12:55:44.620076 systemd[1]: Started sshd@5-10.200.20.16:22-10.200.16.10:41024.service - OpenSSH per-connection server daemon (10.200.16.10:41024). Mar 2 12:55:45.040839 sshd[2295]: Accepted publickey for core from 10.200.16.10 port 41024 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:55:45.041993 sshd-session[2295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:55:45.045393 systemd-logind[1862]: New session 8 of user core. 
Mar 2 12:55:45.055530 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 2 12:55:45.198414 sudo[2300]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 2 12:55:45.198624 sudo[2300]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 2 12:55:45.206008 sudo[2300]: pam_unix(sudo:session): session closed for user root
Mar 2 12:55:45.210112 sudo[2299]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 2 12:55:45.210658 sudo[2299]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 2 12:55:45.218757 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 2 12:55:45.247826 augenrules[2322]: No rules
Mar 2 12:55:45.249101 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 2 12:55:45.249344 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 2 12:55:45.250584 sudo[2299]: pam_unix(sudo:session): session closed for user root
Mar 2 12:55:45.327432 sshd[2298]: Connection closed by 10.200.16.10 port 41024
Mar 2 12:55:45.327901 sshd-session[2295]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:45.331901 systemd[1]: sshd@5-10.200.20.16:22-10.200.16.10:41024.service: Deactivated successfully.
Mar 2 12:55:45.333328 systemd[1]: session-8.scope: Deactivated successfully.
Mar 2 12:55:45.333920 systemd-logind[1862]: Session 8 logged out. Waiting for processes to exit.
Mar 2 12:55:45.335007 systemd-logind[1862]: Removed session 8.
Mar 2 12:55:45.419854 systemd[1]: Started sshd@6-10.200.20.16:22-10.200.16.10:41028.service - OpenSSH per-connection server daemon (10.200.16.10:41028).
Mar 2 12:55:45.842727 sshd[2331]: Accepted publickey for core from 10.200.16.10 port 41028 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:55:45.843795 sshd-session[2331]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:45.847635 systemd-logind[1862]: New session 9 of user core.
Mar 2 12:55:45.854295 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 2 12:55:46.001255 sudo[2335]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 2 12:55:46.001801 sudo[2335]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 2 12:55:48.397371 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 2 12:55:48.409436 (dockerd)[2353]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 2 12:55:49.700157 dockerd[2353]: time="2026-03-02T12:55:49.698214340Z" level=info msg="Starting up"
Mar 2 12:55:49.701364 dockerd[2353]: time="2026-03-02T12:55:49.701339796Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 2 12:55:49.709554 dockerd[2353]: time="2026-03-02T12:55:49.709509452Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 2 12:55:49.743444 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3392432967-merged.mount: Deactivated successfully.
Mar 2 12:55:49.810122 systemd[1]: var-lib-docker-metacopy\x2dcheck1814259522-merged.mount: Deactivated successfully.
Mar 2 12:55:49.827826 dockerd[2353]: time="2026-03-02T12:55:49.827621244Z" level=info msg="Loading containers: start."
Mar 2 12:55:49.850182 kernel: Initializing XFRM netlink socket
Mar 2 12:55:50.200965 systemd-networkd[1497]: docker0: Link UP
Mar 2 12:55:50.226266 dockerd[2353]: time="2026-03-02T12:55:50.226215012Z" level=info msg="Loading containers: done."
Mar 2 12:55:50.251124 dockerd[2353]: time="2026-03-02T12:55:50.250784780Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 2 12:55:50.251124 dockerd[2353]: time="2026-03-02T12:55:50.250876804Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 2 12:55:50.251124 dockerd[2353]: time="2026-03-02T12:55:50.250966844Z" level=info msg="Initializing buildkit"
Mar 2 12:55:50.303130 dockerd[2353]: time="2026-03-02T12:55:50.303085972Z" level=info msg="Completed buildkit initialization"
Mar 2 12:55:50.308519 dockerd[2353]: time="2026-03-02T12:55:50.308460388Z" level=info msg="Daemon has completed initialization"
Mar 2 12:55:50.308645 dockerd[2353]: time="2026-03-02T12:55:50.308533556Z" level=info msg="API listen on /run/docker.sock"
Mar 2 12:55:50.308865 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 2 12:55:50.648476 containerd[1885]: time="2026-03-02T12:55:50.648329092Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 2 12:55:50.711322 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 2 12:55:50.715161 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 2 12:55:50.742219 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2528745498-merged.mount: Deactivated successfully.
Mar 2 12:55:50.831456 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:55:50.841625 (kubelet)[2569]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 2 12:55:50.912731 kubelet[2569]: E0302 12:55:50.912581 2569 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 2 12:55:50.915124 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 2 12:55:50.915258 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 2 12:55:50.916262 systemd[1]: kubelet.service: Consumed 113ms CPU time, 104.4M memory peak.
Mar 2 12:55:51.977808 chronyd[1835]: Selected source PHC0
Mar 2 12:55:52.018310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2833776545.mount: Deactivated successfully.
Mar 2 12:55:53.340857 containerd[1885]: time="2026-03-02T12:55:53.340794458Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:53.344505 containerd[1885]: time="2026-03-02T12:55:53.344472498Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=27390174"
Mar 2 12:55:53.347407 containerd[1885]: time="2026-03-02T12:55:53.347378066Z" level=info msg="ImageCreate event name:\"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:53.352599 containerd[1885]: time="2026-03-02T12:55:53.352561306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:53.353418 containerd[1885]: time="2026-03-02T12:55:53.353200681Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"27386773\" in 2.704836061s"
Mar 2 12:55:53.353418 containerd[1885]: time="2026-03-02T12:55:53.353231633Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\""
Mar 2 12:55:53.353816 containerd[1885]: time="2026-03-02T12:55:53.353793857Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 2 12:55:54.835178 containerd[1885]: time="2026-03-02T12:55:54.834987041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:54.838886 containerd[1885]: time="2026-03-02T12:55:54.838845625Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=23552106"
Mar 2 12:55:54.841774 containerd[1885]: time="2026-03-02T12:55:54.841733721Z" level=info msg="ImageCreate event name:\"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:54.846464 containerd[1885]: time="2026-03-02T12:55:54.846437809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:54.847642 containerd[1885]: time="2026-03-02T12:55:54.847618065Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"25136510\" in 1.493801448s"
Mar 2 12:55:54.847837 containerd[1885]: time="2026-03-02T12:55:54.847721777Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\""
Mar 2 12:55:54.848295 containerd[1885]: time="2026-03-02T12:55:54.848139961Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 2 12:55:56.662329 containerd[1885]: time="2026-03-02T12:55:56.661671817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:56.665172 containerd[1885]: time="2026-03-02T12:55:56.665135385Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=18301305"
Mar 2 12:55:56.668465 containerd[1885]: time="2026-03-02T12:55:56.668442857Z" level=info msg="ImageCreate event name:\"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:56.672594 containerd[1885]: time="2026-03-02T12:55:56.672559057Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:56.673535 containerd[1885]: time="2026-03-02T12:55:56.673504713Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"19885727\" in 1.825110936s"
Mar 2 12:55:56.673535 containerd[1885]: time="2026-03-02T12:55:56.673535481Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\""
Mar 2 12:55:56.674162 containerd[1885]: time="2026-03-02T12:55:56.674099961Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 2 12:55:57.728623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1443310978.mount: Deactivated successfully.
Mar 2 12:55:58.022265 containerd[1885]: time="2026-03-02T12:55:58.021848841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:58.025604 containerd[1885]: time="2026-03-02T12:55:58.025571465Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=28148870"
Mar 2 12:55:58.028555 containerd[1885]: time="2026-03-02T12:55:58.028515473Z" level=info msg="ImageCreate event name:\"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:58.033189 containerd[1885]: time="2026-03-02T12:55:58.032649505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:58.033287 containerd[1885]: time="2026-03-02T12:55:58.033136657Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"28147889\" in 1.359009112s"
Mar 2 12:55:58.033349 containerd[1885]: time="2026-03-02T12:55:58.033334129Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\""
Mar 2 12:55:58.033803 containerd[1885]: time="2026-03-02T12:55:58.033784785Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 2 12:55:58.691327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2132512222.mount: Deactivated successfully.
Mar 2 12:55:59.746672 containerd[1885]: time="2026-03-02T12:55:59.746615985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:59.750238 containerd[1885]: time="2026-03-02T12:55:59.750203489Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117"
Mar 2 12:55:59.754175 containerd[1885]: time="2026-03-02T12:55:59.753881209Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:59.758791 containerd[1885]: time="2026-03-02T12:55:59.758739657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:55:59.759992 containerd[1885]: time="2026-03-02T12:55:59.759294985Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.72542068s"
Mar 2 12:55:59.759992 containerd[1885]: time="2026-03-02T12:55:59.759328689Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Mar 2 12:55:59.760284 containerd[1885]: time="2026-03-02T12:55:59.760267713Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 2 12:56:00.311674 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3262449274.mount: Deactivated successfully.
Mar 2 12:56:00.334067 containerd[1885]: time="2026-03-02T12:56:00.333604377Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 2 12:56:00.336489 containerd[1885]: time="2026-03-02T12:56:00.336287465Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Mar 2 12:56:00.340111 containerd[1885]: time="2026-03-02T12:56:00.340081839Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 2 12:56:00.344713 containerd[1885]: time="2026-03-02T12:56:00.344672135Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 2 12:56:00.345271 containerd[1885]: time="2026-03-02T12:56:00.345141701Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 584.709947ms"
Mar 2 12:56:00.345271 containerd[1885]: time="2026-03-02T12:56:00.345193042Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 2 12:56:00.345978 containerd[1885]: time="2026-03-02T12:56:00.345924741Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 2 12:56:01.010429 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 2 12:56:01.012970 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 2 12:56:01.035672 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2880871407.mount: Deactivated successfully.
Mar 2 12:56:01.722081 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:56:01.725523 (kubelet)[2717]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 2 12:56:01.754290 kubelet[2717]: E0302 12:56:01.754241 2717 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 2 12:56:01.756714 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 2 12:56:01.756830 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 2 12:56:01.757386 systemd[1]: kubelet.service: Consumed 114ms CPU time, 107M memory peak.
Mar 2 12:56:03.695833 containerd[1885]: time="2026-03-02T12:56:03.695776182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:03.699945 containerd[1885]: time="2026-03-02T12:56:03.699912782Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=21885780"
Mar 2 12:56:03.703619 containerd[1885]: time="2026-03-02T12:56:03.703591294Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:03.708224 containerd[1885]: time="2026-03-02T12:56:03.708197462Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:03.709309 containerd[1885]: time="2026-03-02T12:56:03.709278662Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"21882972\" in 3.363331071s"
Mar 2 12:56:03.709329 containerd[1885]: time="2026-03-02T12:56:03.709315142Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\""
Mar 2 12:56:06.377878 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:56:06.377991 systemd[1]: kubelet.service: Consumed 114ms CPU time, 107M memory peak.
Mar 2 12:56:06.380445 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 2 12:56:06.401338 systemd[1]: Reload requested from client PID 2811 ('systemctl') (unit session-9.scope)...
Mar 2 12:56:06.401477 systemd[1]: Reloading...
Mar 2 12:56:06.499279 zram_generator::config[2864]: No configuration found.
Mar 2 12:56:06.650570 systemd[1]: Reloading finished in 248 ms.
Mar 2 12:56:06.697493 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 2 12:56:06.697563 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 2 12:56:06.697833 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:56:06.697890 systemd[1]: kubelet.service: Consumed 77ms CPU time, 95.2M memory peak.
Mar 2 12:56:06.699240 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 2 12:56:06.961812 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:56:06.969601 (kubelet)[2926]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 2 12:56:06.993654 kubelet[2926]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 2 12:56:06.995166 kubelet[2926]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 2 12:56:06.995166 kubelet[2926]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 2 12:56:06.995166 kubelet[2926]: I0302 12:56:06.994065 2926 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 2 12:56:07.910972 kubelet[2926]: I0302 12:56:07.910126 2926 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 2 12:56:07.910972 kubelet[2926]: I0302 12:56:07.910184 2926 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 2 12:56:07.910972 kubelet[2926]: I0302 12:56:07.910362 2926 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 2 12:56:07.931331 kubelet[2926]: E0302 12:56:07.931288 2926 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.16:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 2 12:56:07.932215 kubelet[2926]: I0302 12:56:07.932191 2926 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 2 12:56:07.938469 kubelet[2926]: I0302 12:56:07.938445 2926 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 2 12:56:07.942422 kubelet[2926]: I0302 12:56:07.942395 2926 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 2 12:56:07.944288 kubelet[2926]: I0302 12:56:07.944247 2926 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 2 12:56:07.944412 kubelet[2926]: I0302 12:56:07.944292 2926 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.101-5c781fe851","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 2 12:56:07.944491 kubelet[2926]: I0302 12:56:07.944416 2926 topology_manager.go:138] "Creating topology manager with none policy"
Mar 2 12:56:07.944491 kubelet[2926]: I0302 12:56:07.944424 2926 container_manager_linux.go:303] "Creating device plugin manager"
Mar 2 12:56:07.945324 kubelet[2926]: I0302 12:56:07.945304 2926 state_mem.go:36] "Initialized new in-memory state store"
Mar 2 12:56:07.947898 kubelet[2926]: I0302 12:56:07.947874 2926 kubelet.go:480] "Attempting to sync node with API server"
Mar 2 12:56:07.947942 kubelet[2926]: I0302 12:56:07.947908 2926 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 2 12:56:07.947942 kubelet[2926]: I0302 12:56:07.947928 2926 kubelet.go:386] "Adding apiserver pod source"
Mar 2 12:56:07.947942 kubelet[2926]: I0302 12:56:07.947941 2926 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 2 12:56:07.949760 kubelet[2926]: I0302 12:56:07.949739 2926 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 2 12:56:07.950170 kubelet[2926]: I0302 12:56:07.950137 2926 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 2 12:56:07.950232 kubelet[2926]: W0302 12:56:07.950219 2926 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 2 12:56:07.952399 kubelet[2926]: I0302 12:56:07.952133 2926 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 2 12:56:07.952399 kubelet[2926]: I0302 12:56:07.952180 2926 server.go:1289] "Started kubelet"
Mar 2 12:56:07.956438 kubelet[2926]: I0302 12:56:07.956413 2926 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 2 12:56:07.960180 kubelet[2926]: E0302 12:56:07.958357 2926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.16:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.16:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.101-5c781fe851.1899077e6ebd0956 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.101-5c781fe851,UID:ci-4459.2.101-5c781fe851,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.101-5c781fe851,},FirstTimestamp:2026-03-02 12:56:07.952157014 +0000 UTC m=+0.978792561,LastTimestamp:2026-03-02 12:56:07.952157014 +0000 UTC m=+0.978792561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.101-5c781fe851,}"
Mar 2 12:56:07.960180 kubelet[2926]: E0302 12:56:07.959321 2926 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.16:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.101-5c781fe851&limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 2 12:56:07.960180 kubelet[2926]: E0302 12:56:07.959385 2926 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.16:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 2 12:56:07.960427 kubelet[2926]: I0302 12:56:07.960398 2926 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 2 12:56:07.960599 kubelet[2926]: I0302 12:56:07.960567 2926 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 2 12:56:07.960874 kubelet[2926]: I0302 12:56:07.960856 2926 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 2 12:56:07.963011 kubelet[2926]: I0302 12:56:07.962987 2926 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 2 12:56:07.963211 kubelet[2926]: I0302 12:56:07.963191 2926 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 2 12:56:07.963398 kubelet[2926]: E0302 12:56:07.963378 2926 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.101-5c781fe851\" not found"
Mar 2 12:56:07.964029 kubelet[2926]: I0302 12:56:07.964011 2926 server.go:317] "Adding debug handlers to kubelet server"
Mar 2 12:56:07.964931 kubelet[2926]: E0302 12:56:07.964905 2926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.101-5c781fe851?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="200ms"
Mar 2 12:56:07.965419 kubelet[2926]: I0302 12:56:07.965405 2926 factory.go:223] Registration of the systemd container factory successfully
Mar 2 12:56:07.965563 kubelet[2926]: I0302 12:56:07.965549 2926 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 2 12:56:07.966415 kubelet[2926]: I0302 12:56:07.966404 2926 reconciler.go:26] "Reconciler: start to sync state"
Mar 2 12:56:07.966547 kubelet[2926]: I0302 12:56:07.966526 2926 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 2 12:56:07.966911 kubelet[2926]: E0302 12:56:07.966886 2926 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.16:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 2 12:56:07.967067 kubelet[2926]: I0302 12:56:07.967052 2926 factory.go:223] Registration of the containerd container factory successfully
Mar 2 12:56:07.970242 kubelet[2926]: E0302 12:56:07.970218 2926 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 2 12:56:07.985089 kubelet[2926]: I0302 12:56:07.985071 2926 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 2 12:56:07.985236 kubelet[2926]: I0302 12:56:07.985224 2926 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 2 12:56:07.985298 kubelet[2926]: I0302 12:56:07.985291 2926 state_mem.go:36] "Initialized new in-memory state store"
Mar 2 12:56:08.064363 kubelet[2926]: E0302 12:56:08.064318 2926 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.101-5c781fe851\" not found"
Mar 2 12:56:08.115431 kubelet[2926]: I0302 12:56:08.115175 2926 policy_none.go:49] "None policy: Start"
Mar 2 12:56:08.115431 kubelet[2926]: I0302 12:56:08.115207 2926 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 2 12:56:08.115431 kubelet[2926]: I0302 12:56:08.115218 2926 state_mem.go:35] "Initializing new in-memory state store"
Mar 2 12:56:08.118283 kubelet[2926]: I0302 12:56:08.118221 2926 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 2 12:56:08.121886 kubelet[2926]: I0302 12:56:08.121865 2926 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 2 12:56:08.124803 kubelet[2926]: I0302 12:56:08.121956 2926 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 2 12:56:08.124803 kubelet[2926]: I0302 12:56:08.121979 2926 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 2 12:56:08.124803 kubelet[2926]: I0302 12:56:08.121985 2926 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 2 12:56:08.124803 kubelet[2926]: E0302 12:56:08.122022 2926 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 2 12:56:08.124803 kubelet[2926]: E0302 12:56:08.123389 2926 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.16:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 2 12:56:08.129900 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 2 12:56:08.139025 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 2 12:56:08.142111 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 2 12:56:08.149998 kubelet[2926]: E0302 12:56:08.149970 2926 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 2 12:56:08.150206 kubelet[2926]: I0302 12:56:08.150190 2926 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 2 12:56:08.150249 kubelet[2926]: I0302 12:56:08.150208 2926 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 2 12:56:08.150857 kubelet[2926]: I0302 12:56:08.150713 2926 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 2 12:56:08.152036 kubelet[2926]: E0302 12:56:08.151998 2926 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 2 12:56:08.152115 kubelet[2926]: E0302 12:56:08.152041 2926 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.101-5c781fe851\" not found" Mar 2 12:56:08.166204 kubelet[2926]: E0302 12:56:08.165681 2926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.101-5c781fe851?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="400ms" Mar 2 12:56:08.237009 systemd[1]: Created slice kubepods-burstable-podff83b43189496ea97b64fae57099098c.slice - libcontainer container kubepods-burstable-podff83b43189496ea97b64fae57099098c.slice. 
Mar 2 12:56:08.243922 kubelet[2926]: E0302 12:56:08.243848 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.248697 systemd[1]: Created slice kubepods-burstable-pod119af1514cab0a50bc76f2fc28e3f1bf.slice - libcontainer container kubepods-burstable-pod119af1514cab0a50bc76f2fc28e3f1bf.slice. Mar 2 12:56:08.250247 kubelet[2926]: E0302 12:56:08.250223 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.251625 kubelet[2926]: I0302 12:56:08.251604 2926 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.251979 kubelet[2926]: E0302 12:56:08.251955 2926 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.259634 systemd[1]: Created slice kubepods-burstable-pod66b520b4ed22b5df46fca945135a03a4.slice - libcontainer container kubepods-burstable-pod66b520b4ed22b5df46fca945135a03a4.slice. 
Mar 2 12:56:08.261333 kubelet[2926]: E0302 12:56:08.261241 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268560 kubelet[2926]: I0302 12:56:08.268528 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-ca-certs\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268560 kubelet[2926]: I0302 12:56:08.268564 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268665 kubelet[2926]: I0302 12:56:08.268580 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268665 kubelet[2926]: I0302 12:56:08.268590 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66b520b4ed22b5df46fca945135a03a4-kubeconfig\") pod \"kube-scheduler-ci-4459.2.101-5c781fe851\" (UID: \"66b520b4ed22b5df46fca945135a03a4\") " pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268665 kubelet[2926]: I0302 12:56:08.268601 2926 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-k8s-certs\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268665 kubelet[2926]: I0302 12:56:08.268610 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268665 kubelet[2926]: I0302 12:56:08.268620 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-ca-certs\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268750 kubelet[2926]: I0302 12:56:08.268636 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.268750 kubelet[2926]: I0302 12:56:08.268646 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: 
\"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:08.454486 kubelet[2926]: I0302 12:56:08.453896 2926 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.454486 kubelet[2926]: E0302 12:56:08.454212 2926 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.545829 containerd[1885]: time="2026-03-02T12:56:08.545789280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.101-5c781fe851,Uid:ff83b43189496ea97b64fae57099098c,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:08.551292 containerd[1885]: time="2026-03-02T12:56:08.551263030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.101-5c781fe851,Uid:119af1514cab0a50bc76f2fc28e3f1bf,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:08.564507 containerd[1885]: time="2026-03-02T12:56:08.564330198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.101-5c781fe851,Uid:66b520b4ed22b5df46fca945135a03a4,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:08.566058 kubelet[2926]: E0302 12:56:08.566029 2926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.101-5c781fe851?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="800ms" Mar 2 12:56:08.624363 containerd[1885]: time="2026-03-02T12:56:08.624280332Z" level=info msg="connecting to shim 72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe" address="unix:///run/containerd/s/93918d970e9be180dca323beae8ca02d7c4731042f28b4628606da46155f9a3b" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:08.625612 containerd[1885]: 
time="2026-03-02T12:56:08.625576376Z" level=info msg="connecting to shim cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909" address="unix:///run/containerd/s/61c68223fadbc05255c5fac17cff877cdda1bec24bf27788d0cafd802af986a8" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:08.647456 systemd[1]: Started cri-containerd-cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909.scope - libcontainer container cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909. Mar 2 12:56:08.654629 systemd[1]: Started cri-containerd-72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe.scope - libcontainer container 72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe. Mar 2 12:56:08.657362 containerd[1885]: time="2026-03-02T12:56:08.657246726Z" level=info msg="connecting to shim 7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2" address="unix:///run/containerd/s/a0c6e3cf4e00561abf334e2c87063f6b333e4c423564306f4c2ab2a38a33bd22" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:08.687430 systemd[1]: Started cri-containerd-7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2.scope - libcontainer container 7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2. 
Mar 2 12:56:08.712655 containerd[1885]: time="2026-03-02T12:56:08.712177009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.101-5c781fe851,Uid:ff83b43189496ea97b64fae57099098c,Namespace:kube-system,Attempt:0,} returns sandbox id \"cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909\"" Mar 2 12:56:08.718349 containerd[1885]: time="2026-03-02T12:56:08.718302393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.101-5c781fe851,Uid:119af1514cab0a50bc76f2fc28e3f1bf,Namespace:kube-system,Attempt:0,} returns sandbox id \"72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe\"" Mar 2 12:56:08.728044 containerd[1885]: time="2026-03-02T12:56:08.727349260Z" level=info msg="CreateContainer within sandbox \"cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 2 12:56:08.732876 containerd[1885]: time="2026-03-02T12:56:08.732844380Z" level=info msg="CreateContainer within sandbox \"72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 2 12:56:08.751187 containerd[1885]: time="2026-03-02T12:56:08.751123985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.101-5c781fe851,Uid:66b520b4ed22b5df46fca945135a03a4,Namespace:kube-system,Attempt:0,} returns sandbox id \"7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2\"" Mar 2 12:56:08.761301 containerd[1885]: time="2026-03-02T12:56:08.761250660Z" level=info msg="Container 4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:08.762105 containerd[1885]: time="2026-03-02T12:56:08.762076333Z" level=info msg="CreateContainer within sandbox \"7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2\" for container 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 2 12:56:08.794213 containerd[1885]: time="2026-03-02T12:56:08.793707670Z" level=info msg="Container 80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:08.805538 containerd[1885]: time="2026-03-02T12:56:08.805487099Z" level=info msg="CreateContainer within sandbox \"cad4703e000c01ddb0b4ca99573f70345af62bc53b5765be745c653510df5909\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338\"" Mar 2 12:56:08.806486 containerd[1885]: time="2026-03-02T12:56:08.806295850Z" level=info msg="StartContainer for \"4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338\"" Mar 2 12:56:08.807163 containerd[1885]: time="2026-03-02T12:56:08.807129739Z" level=info msg="connecting to shim 4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338" address="unix:///run/containerd/s/61c68223fadbc05255c5fac17cff877cdda1bec24bf27788d0cafd802af986a8" protocol=ttrpc version=3 Mar 2 12:56:08.810436 containerd[1885]: time="2026-03-02T12:56:08.810260424Z" level=info msg="Container e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:08.825068 containerd[1885]: time="2026-03-02T12:56:08.825018495Z" level=info msg="CreateContainer within sandbox \"72ff9b22ba37df63bbad9ce1acea2c7390e3ded574221c45e0ac41636fad99fe\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd\"" Mar 2 12:56:08.825448 systemd[1]: Started cri-containerd-4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338.scope - libcontainer container 4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338. 
Mar 2 12:56:08.826567 containerd[1885]: time="2026-03-02T12:56:08.826506523Z" level=info msg="StartContainer for \"80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd\"" Mar 2 12:56:08.828505 containerd[1885]: time="2026-03-02T12:56:08.828473293Z" level=info msg="connecting to shim 80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd" address="unix:///run/containerd/s/93918d970e9be180dca323beae8ca02d7c4731042f28b4628606da46155f9a3b" protocol=ttrpc version=3 Mar 2 12:56:08.835765 containerd[1885]: time="2026-03-02T12:56:08.835662948Z" level=info msg="CreateContainer within sandbox \"7d131850ebfbb28b6c5a84c1ad70d1d866906e7870e0eb7ef1a31443e0245dd2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf\"" Mar 2 12:56:08.836845 containerd[1885]: time="2026-03-02T12:56:08.836802852Z" level=info msg="StartContainer for \"e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf\"" Mar 2 12:56:08.838197 containerd[1885]: time="2026-03-02T12:56:08.838166977Z" level=info msg="connecting to shim e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf" address="unix:///run/containerd/s/a0c6e3cf4e00561abf334e2c87063f6b333e4c423564306f4c2ab2a38a33bd22" protocol=ttrpc version=3 Mar 2 12:56:08.856913 kubelet[2926]: I0302 12:56:08.856883 2926 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.857404 systemd[1]: Started cri-containerd-80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd.scope - libcontainer container 80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd. 
Mar 2 12:56:08.858212 kubelet[2926]: E0302 12:56:08.858181 2926 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:08.863203 systemd[1]: Started cri-containerd-e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf.scope - libcontainer container e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf. Mar 2 12:56:08.891935 containerd[1885]: time="2026-03-02T12:56:08.891884490Z" level=info msg="StartContainer for \"4add8066b4591f09b49d21794b3cb9c677e9ebfff30f1deab9fbccd73bdbc338\" returns successfully" Mar 2 12:56:08.926637 containerd[1885]: time="2026-03-02T12:56:08.926593577Z" level=info msg="StartContainer for \"80dffe8beb77f523e52de0bf22c6742c9924596d419340e779fd54eebbebb4dd\" returns successfully" Mar 2 12:56:08.930618 containerd[1885]: time="2026-03-02T12:56:08.930520314Z" level=info msg="StartContainer for \"e8e1f69fa757c61955eec8cf294112ba0b226942c1fda6cd24f27b406c2192bf\" returns successfully" Mar 2 12:56:09.132415 kubelet[2926]: E0302 12:56:09.130756 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:09.134524 kubelet[2926]: E0302 12:56:09.134406 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:09.136314 kubelet[2926]: E0302 12:56:09.136301 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:09.660239 kubelet[2926]: I0302 12:56:09.660201 2926 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.101-5c781fe851" Mar 2 
12:56:10.141173 kubelet[2926]: E0302 12:56:10.140987 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:10.142620 kubelet[2926]: E0302 12:56:10.142532 2926 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:10.176254 kubelet[2926]: E0302 12:56:10.176205 2926 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.2.101-5c781fe851\" not found" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:10.255984 kubelet[2926]: I0302 12:56:10.255843 2926 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:10.255984 kubelet[2926]: E0302 12:56:10.255881 2926 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459.2.101-5c781fe851\": node \"ci-4459.2.101-5c781fe851\" not found" Mar 2 12:56:10.264074 kubelet[2926]: I0302 12:56:10.263855 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.327622 kubelet[2926]: E0302 12:56:10.327583 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.101-5c781fe851\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.328005 kubelet[2926]: I0302 12:56:10.327788 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.329542 kubelet[2926]: E0302 12:56:10.329520 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.101-5c781fe851\" is forbidden: no PriorityClass with name system-node-critical 
was found" pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.329638 kubelet[2926]: I0302 12:56:10.329627 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.333341 kubelet[2926]: E0302 12:56:10.333313 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.101-5c781fe851\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:10.959904 kubelet[2926]: I0302 12:56:10.959865 2926 apiserver.go:52] "Watching apiserver" Mar 2 12:56:10.967197 kubelet[2926]: I0302 12:56:10.967160 2926 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 2 12:56:11.141552 kubelet[2926]: I0302 12:56:11.141500 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:11.159518 kubelet[2926]: I0302 12:56:11.159478 2926 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:12.544112 systemd[1]: Reload requested from client PID 3203 ('systemctl') (unit session-9.scope)... Mar 2 12:56:12.544227 systemd[1]: Reloading... Mar 2 12:56:12.623181 zram_generator::config[3265]: No configuration found. Mar 2 12:56:12.789108 systemd[1]: Reloading finished in 244 ms. Mar 2 12:56:12.812958 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:56:12.834965 systemd[1]: kubelet.service: Deactivated successfully. Mar 2 12:56:12.835210 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:56:12.835276 systemd[1]: kubelet.service: Consumed 1.255s CPU time, 125.4M memory peak. Mar 2 12:56:12.837659 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 2 12:56:12.939034 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:56:12.947603 (kubelet)[3314]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 2 12:56:12.981041 kubelet[3314]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 2 12:56:12.982177 kubelet[3314]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 2 12:56:12.982177 kubelet[3314]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 2 12:56:12.982177 kubelet[3314]: I0302 12:56:12.981473 3314 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 2 12:56:12.985939 kubelet[3314]: I0302 12:56:12.985907 3314 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 2 12:56:12.986042 kubelet[3314]: I0302 12:56:12.986032 3314 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 2 12:56:12.986292 kubelet[3314]: I0302 12:56:12.986272 3314 server.go:956] "Client rotation is on, will bootstrap in background" Mar 2 12:56:12.987220 kubelet[3314]: I0302 12:56:12.987200 3314 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 2 12:56:12.988796 kubelet[3314]: I0302 12:56:12.988769 3314 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 2 12:56:12.994923 kubelet[3314]: I0302 12:56:12.994908 
3314 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 2 12:56:12.997416 kubelet[3314]: I0302 12:56:12.997398 3314 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 2 12:56:12.997696 kubelet[3314]: I0302 12:56:12.997668 3314 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 2 12:56:12.997864 kubelet[3314]: I0302 12:56:12.997753 3314 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.101-5c781fe851","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPer
iod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 2 12:56:12.997982 kubelet[3314]: I0302 12:56:12.997971 3314 topology_manager.go:138] "Creating topology manager with none policy" Mar 2 12:56:12.998027 kubelet[3314]: I0302 12:56:12.998021 3314 container_manager_linux.go:303] "Creating device plugin manager" Mar 2 12:56:12.998110 kubelet[3314]: I0302 12:56:12.998102 3314 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:56:12.998380 kubelet[3314]: I0302 12:56:12.998366 3314 kubelet.go:480] "Attempting to sync node with API server" Mar 2 12:56:12.998447 kubelet[3314]: I0302 12:56:12.998439 3314 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 2 12:56:12.998504 kubelet[3314]: I0302 12:56:12.998497 3314 kubelet.go:386] "Adding apiserver pod source" Mar 2 12:56:12.998557 kubelet[3314]: I0302 12:56:12.998550 3314 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 2 12:56:13.002992 kubelet[3314]: I0302 12:56:13.002975 3314 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 2 12:56:13.003740 kubelet[3314]: I0302 12:56:13.003715 3314 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 2 12:56:13.007736 kubelet[3314]: I0302 12:56:13.007719 3314 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 2 12:56:13.008199 kubelet[3314]: I0302 12:56:13.008189 3314 server.go:1289] "Started kubelet" Mar 2 12:56:13.010349 kubelet[3314]: I0302 12:56:13.010333 3314 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 2 12:56:13.012283 kubelet[3314]: I0302 12:56:13.012257 3314 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 2 12:56:13.013217 kubelet[3314]: I0302 12:56:13.012834 3314 server.go:317] "Adding debug handlers to kubelet server" Mar 2 
12:56:13.015445 kubelet[3314]: I0302 12:56:13.015401 3314 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 2 12:56:13.015813 kubelet[3314]: I0302 12:56:13.015799 3314 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 2 12:56:13.016051 kubelet[3314]: I0302 12:56:13.016035 3314 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 2 12:56:13.017474 kubelet[3314]: I0302 12:56:13.017460 3314 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 2 12:56:13.017661 kubelet[3314]: I0302 12:56:13.017649 3314 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 2 12:56:13.017794 kubelet[3314]: I0302 12:56:13.017785 3314 reconciler.go:26] "Reconciler: start to sync state" Mar 2 12:56:13.019286 kubelet[3314]: I0302 12:56:13.019263 3314 factory.go:223] Registration of the systemd container factory successfully Mar 2 12:56:13.019496 kubelet[3314]: I0302 12:56:13.019479 3314 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 2 12:56:13.020431 kubelet[3314]: E0302 12:56:13.020410 3314 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 2 12:56:13.021224 kubelet[3314]: I0302 12:56:13.020791 3314 factory.go:223] Registration of the containerd container factory successfully Mar 2 12:56:13.033916 kubelet[3314]: I0302 12:56:13.033888 3314 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 2 12:56:13.034907 kubelet[3314]: I0302 12:56:13.034890 3314 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 2 12:56:13.035012 kubelet[3314]: I0302 12:56:13.035003 3314 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 2 12:56:13.035166 kubelet[3314]: I0302 12:56:13.035098 3314 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 2 12:56:13.035166 kubelet[3314]: I0302 12:56:13.035111 3314 kubelet.go:2436] "Starting kubelet main sync loop" Mar 2 12:56:13.035488 kubelet[3314]: E0302 12:56:13.035245 3314 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 2 12:56:13.061191 kubelet[3314]: I0302 12:56:13.061167 3314 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061344 3314 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061370 3314 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061484 3314 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061490 3314 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061505 3314 policy_none.go:49] "None policy: Start" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061512 3314 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061519 3314 state_mem.go:35] "Initializing new in-memory state store" Mar 2 12:56:13.061661 kubelet[3314]: I0302 12:56:13.061596 3314 state_mem.go:75] "Updated machine memory state" Mar 2 12:56:13.067937 kubelet[3314]: E0302 12:56:13.067298 3314 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 2 12:56:13.067937 kubelet[3314]: I0302 12:56:13.067472 
3314 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 2 12:56:13.067937 kubelet[3314]: I0302 12:56:13.067484 3314 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 2 12:56:13.067937 kubelet[3314]: I0302 12:56:13.067923 3314 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 2 12:56:13.071853 kubelet[3314]: E0302 12:56:13.071755 3314 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 2 12:56:13.136848 kubelet[3314]: I0302 12:56:13.136778 3314 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.137256 kubelet[3314]: I0302 12:56:13.136963 3314 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.137256 kubelet[3314]: I0302 12:56:13.136781 3314 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.152031 kubelet[3314]: I0302 12:56:13.151986 3314 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:13.152356 kubelet[3314]: I0302 12:56:13.152198 3314 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:13.152356 kubelet[3314]: E0302 12:56:13.152240 3314 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.101-5c781fe851\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.152731 kubelet[3314]: I0302 12:56:13.152709 3314 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result 
in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:13.178603 kubelet[3314]: I0302 12:56:13.178568 3314 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:13.190012 kubelet[3314]: I0302 12:56:13.189977 3314 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:13.190263 kubelet[3314]: I0302 12:56:13.190089 3314 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.101-5c781fe851" Mar 2 12:56:13.201249 update_engine[1863]: I20260302 12:56:13.201179 1863 update_attempter.cc:509] Updating boot flags... Mar 2 12:56:13.218276 kubelet[3314]: I0302 12:56:13.218235 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218276 kubelet[3314]: I0302 12:56:13.218273 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218276 kubelet[3314]: I0302 12:56:13.218287 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66b520b4ed22b5df46fca945135a03a4-kubeconfig\") pod \"kube-scheduler-ci-4459.2.101-5c781fe851\" (UID: \"66b520b4ed22b5df46fca945135a03a4\") " pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218276 kubelet[3314]: I0302 12:56:13.218299 3314 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-k8s-certs\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218575 kubelet[3314]: I0302 12:56:13.218326 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218575 kubelet[3314]: I0302 12:56:13.218339 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-ca-certs\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218575 kubelet[3314]: I0302 12:56:13.218349 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" (UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218575 kubelet[3314]: I0302 12:56:13.218362 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/119af1514cab0a50bc76f2fc28e3f1bf-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.101-5c781fe851\" 
(UID: \"119af1514cab0a50bc76f2fc28e3f1bf\") " pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" Mar 2 12:56:13.218575 kubelet[3314]: I0302 12:56:13.218372 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ff83b43189496ea97b64fae57099098c-ca-certs\") pod \"kube-apiserver-ci-4459.2.101-5c781fe851\" (UID: \"ff83b43189496ea97b64fae57099098c\") " pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:14.002392 kubelet[3314]: I0302 12:56:14.002318 3314 apiserver.go:52] "Watching apiserver" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.018269 3314 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.053918 3314 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.054266 3314 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.070736 3314 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:14.363592 kubelet[3314]: E0302 12:56:14.070787 3314 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.101-5c781fe851\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.072211 3314 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:56:14.363592 kubelet[3314]: E0302 12:56:14.072249 3314 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4459.2.101-5c781fe851\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" Mar 2 12:56:14.363592 kubelet[3314]: I0302 12:56:14.092313 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.101-5c781fe851" podStartSLOduration=3.092299055 podStartE2EDuration="3.092299055s" podCreationTimestamp="2026-03-02 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:56:14.081868325 +0000 UTC m=+1.129545515" watchObservedRunningTime="2026-03-02 12:56:14.092299055 +0000 UTC m=+1.139976237" Mar 2 12:56:14.363945 kubelet[3314]: I0302 12:56:14.092411 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.101-5c781fe851" podStartSLOduration=1.092408354 podStartE2EDuration="1.092408354s" podCreationTimestamp="2026-03-02 12:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:56:14.092018583 +0000 UTC m=+1.139695765" watchObservedRunningTime="2026-03-02 12:56:14.092408354 +0000 UTC m=+1.140085536" Mar 2 12:56:14.363945 kubelet[3314]: I0302 12:56:14.117497 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.101-5c781fe851" podStartSLOduration=1.117483033 podStartE2EDuration="1.117483033s" podCreationTimestamp="2026-03-02 12:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:56:14.105596311 +0000 UTC m=+1.153273493" watchObservedRunningTime="2026-03-02 12:56:14.117483033 +0000 UTC m=+1.165160223" Mar 2 12:56:15.148123 kernel: hv_balloon: Max. 
dynamic memory size: 4096 MB Mar 2 12:56:17.519288 kubelet[3314]: I0302 12:56:17.519249 3314 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 2 12:56:17.519909 containerd[1885]: time="2026-03-02T12:56:17.519847669Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 2 12:56:17.520200 kubelet[3314]: I0302 12:56:17.519967 3314 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 2 12:56:18.507206 systemd[1]: Created slice kubepods-besteffort-poda0bc5a3e_3c71_486d_9af1_0fbcfa983e00.slice - libcontainer container kubepods-besteffort-poda0bc5a3e_3c71_486d_9af1_0fbcfa983e00.slice. Mar 2 12:56:18.549164 kubelet[3314]: I0302 12:56:18.549124 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a0bc5a3e-3c71-486d-9af1-0fbcfa983e00-kube-proxy\") pod \"kube-proxy-vtk4b\" (UID: \"a0bc5a3e-3c71-486d-9af1-0fbcfa983e00\") " pod="kube-system/kube-proxy-vtk4b" Mar 2 12:56:18.549645 kubelet[3314]: I0302 12:56:18.549550 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a0bc5a3e-3c71-486d-9af1-0fbcfa983e00-xtables-lock\") pod \"kube-proxy-vtk4b\" (UID: \"a0bc5a3e-3c71-486d-9af1-0fbcfa983e00\") " pod="kube-system/kube-proxy-vtk4b" Mar 2 12:56:18.549645 kubelet[3314]: I0302 12:56:18.549574 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctct\" (UniqueName: \"kubernetes.io/projected/a0bc5a3e-3c71-486d-9af1-0fbcfa983e00-kube-api-access-rctct\") pod \"kube-proxy-vtk4b\" (UID: \"a0bc5a3e-3c71-486d-9af1-0fbcfa983e00\") " pod="kube-system/kube-proxy-vtk4b" Mar 2 12:56:18.549645 kubelet[3314]: I0302 12:56:18.549614 3314 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0bc5a3e-3c71-486d-9af1-0fbcfa983e00-lib-modules\") pod \"kube-proxy-vtk4b\" (UID: \"a0bc5a3e-3c71-486d-9af1-0fbcfa983e00\") " pod="kube-system/kube-proxy-vtk4b" Mar 2 12:56:18.716793 systemd[1]: Created slice kubepods-besteffort-pod075a527e_5b5f_4a0d_b999_b2cafe806520.slice - libcontainer container kubepods-besteffort-pod075a527e_5b5f_4a0d_b999_b2cafe806520.slice. Mar 2 12:56:18.750959 kubelet[3314]: I0302 12:56:18.750735 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcl5\" (UniqueName: \"kubernetes.io/projected/075a527e-5b5f-4a0d-b999-b2cafe806520-kube-api-access-6hcl5\") pod \"tigera-operator-7d4578d8d-vscv5\" (UID: \"075a527e-5b5f-4a0d-b999-b2cafe806520\") " pod="tigera-operator/tigera-operator-7d4578d8d-vscv5" Mar 2 12:56:18.750959 kubelet[3314]: I0302 12:56:18.750767 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/075a527e-5b5f-4a0d-b999-b2cafe806520-var-lib-calico\") pod \"tigera-operator-7d4578d8d-vscv5\" (UID: \"075a527e-5b5f-4a0d-b999-b2cafe806520\") " pod="tigera-operator/tigera-operator-7d4578d8d-vscv5" Mar 2 12:56:18.815247 containerd[1885]: time="2026-03-02T12:56:18.814930521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vtk4b,Uid:a0bc5a3e-3c71-486d-9af1-0fbcfa983e00,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:18.859740 containerd[1885]: time="2026-03-02T12:56:18.859623184Z" level=info msg="connecting to shim 5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1" address="unix:///run/containerd/s/997fcf890b98aac21ac883f4e26ea7d2adb4170aa917999448d115296d81ab50" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:18.881314 systemd[1]: Started 
cri-containerd-5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1.scope - libcontainer container 5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1. Mar 2 12:56:18.908194 containerd[1885]: time="2026-03-02T12:56:18.908051429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vtk4b,Uid:a0bc5a3e-3c71-486d-9af1-0fbcfa983e00,Namespace:kube-system,Attempt:0,} returns sandbox id \"5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1\"" Mar 2 12:56:18.918543 containerd[1885]: time="2026-03-02T12:56:18.918490470Z" level=info msg="CreateContainer within sandbox \"5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 2 12:56:18.945440 containerd[1885]: time="2026-03-02T12:56:18.945402589Z" level=info msg="Container 0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:18.965019 containerd[1885]: time="2026-03-02T12:56:18.964929812Z" level=info msg="CreateContainer within sandbox \"5aa94c839954e19586aa1ce240d05f603b1cdfb9fa04671df76c9fc4429825d1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb\"" Mar 2 12:56:18.965765 containerd[1885]: time="2026-03-02T12:56:18.965580841Z" level=info msg="StartContainer for \"0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb\"" Mar 2 12:56:18.967658 containerd[1885]: time="2026-03-02T12:56:18.967351768Z" level=info msg="connecting to shim 0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb" address="unix:///run/containerd/s/997fcf890b98aac21ac883f4e26ea7d2adb4170aa917999448d115296d81ab50" protocol=ttrpc version=3 Mar 2 12:56:18.986365 systemd[1]: Started cri-containerd-0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb.scope - libcontainer container 
0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb. Mar 2 12:56:19.020344 containerd[1885]: time="2026-03-02T12:56:19.020275787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d4578d8d-vscv5,Uid:075a527e-5b5f-4a0d-b999-b2cafe806520,Namespace:tigera-operator,Attempt:0,}" Mar 2 12:56:19.049266 containerd[1885]: time="2026-03-02T12:56:19.049231571Z" level=info msg="StartContainer for \"0ceac07aa2bc519bf4d68e59e67a941dd6d06f4b74b169818c01a0d9fe4dc0cb\" returns successfully" Mar 2 12:56:19.075761 containerd[1885]: time="2026-03-02T12:56:19.075588897Z" level=info msg="connecting to shim 6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b" address="unix:///run/containerd/s/cf59f5e218af7fda023eddb0916c61fbdbd125f4628de01376b7e3b672c0b7dd" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:19.082955 kubelet[3314]: I0302 12:56:19.082863 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vtk4b" podStartSLOduration=1.082846037 podStartE2EDuration="1.082846037s" podCreationTimestamp="2026-03-02 12:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:56:19.082775507 +0000 UTC m=+6.130452689" watchObservedRunningTime="2026-03-02 12:56:19.082846037 +0000 UTC m=+6.130523219" Mar 2 12:56:19.101299 systemd[1]: Started cri-containerd-6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b.scope - libcontainer container 6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b. 
Mar 2 12:56:19.139532 containerd[1885]: time="2026-03-02T12:56:19.139419394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d4578d8d-vscv5,Uid:075a527e-5b5f-4a0d-b999-b2cafe806520,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b\"" Mar 2 12:56:19.141381 containerd[1885]: time="2026-03-02T12:56:19.141234404Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.3\"" Mar 2 12:56:19.660515 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4291356269.mount: Deactivated successfully. Mar 2 12:56:20.695337 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1068903550.mount: Deactivated successfully. Mar 2 12:56:21.306173 containerd[1885]: time="2026-03-02T12:56:21.306097483Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:21.310243 containerd[1885]: time="2026-03-02T12:56:21.310193948Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.3: active requests=0, bytes read=25060789" Mar 2 12:56:21.315101 containerd[1885]: time="2026-03-02T12:56:21.314914936Z" level=info msg="ImageCreate event name:\"sha256:a94b0dfe779f8dc351e02e8988fd60aecb466000f13b6f00042ab83ebb237d87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:21.318837 containerd[1885]: time="2026-03-02T12:56:21.318789242Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3b1a6762e1f3fae8490773b8f06ddd1e6775850febbece4d6002416f39adc670\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:21.319381 containerd[1885]: time="2026-03-02T12:56:21.319084532Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.3\" with image id \"sha256:a94b0dfe779f8dc351e02e8988fd60aecb466000f13b6f00042ab83ebb237d87\", repo tag \"quay.io/tigera/operator:v1.40.3\", repo digest 
\"quay.io/tigera/operator@sha256:3b1a6762e1f3fae8490773b8f06ddd1e6775850febbece4d6002416f39adc670\", size \"25056784\" in 2.177815992s" Mar 2 12:56:21.319381 containerd[1885]: time="2026-03-02T12:56:21.319114397Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.3\" returns image reference \"sha256:a94b0dfe779f8dc351e02e8988fd60aecb466000f13b6f00042ab83ebb237d87\"" Mar 2 12:56:21.327019 containerd[1885]: time="2026-03-02T12:56:21.326982869Z" level=info msg="CreateContainer within sandbox \"6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 2 12:56:21.350015 containerd[1885]: time="2026-03-02T12:56:21.348374446Z" level=info msg="Container d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:21.350372 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3138579039.mount: Deactivated successfully. Mar 2 12:56:21.365531 containerd[1885]: time="2026-03-02T12:56:21.365488305Z" level=info msg="CreateContainer within sandbox \"6c08cb96f48e2be84c7e5c2fc5b5b138a7b364c84fb983c0cc123c1c87f1ee8b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22\"" Mar 2 12:56:21.367478 containerd[1885]: time="2026-03-02T12:56:21.367445383Z" level=info msg="StartContainer for \"d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22\"" Mar 2 12:56:21.368139 containerd[1885]: time="2026-03-02T12:56:21.368106164Z" level=info msg="connecting to shim d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22" address="unix:///run/containerd/s/cf59f5e218af7fda023eddb0916c61fbdbd125f4628de01376b7e3b672c0b7dd" protocol=ttrpc version=3 Mar 2 12:56:21.387293 systemd[1]: Started cri-containerd-d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22.scope - libcontainer container 
d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22. Mar 2 12:56:21.416247 containerd[1885]: time="2026-03-02T12:56:21.416202838Z" level=info msg="StartContainer for \"d3f278282d9e41464c58f22ac18850e3237ab9870fbc2ca8086eacfa8ff98b22\" returns successfully" Mar 2 12:56:22.095436 kubelet[3314]: I0302 12:56:22.095358 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7d4578d8d-vscv5" podStartSLOduration=1.916488152 podStartE2EDuration="4.095342736s" podCreationTimestamp="2026-03-02 12:56:18 +0000 UTC" firstStartedPulling="2026-03-02 12:56:19.140888649 +0000 UTC m=+6.188565831" lastFinishedPulling="2026-03-02 12:56:21.319743233 +0000 UTC m=+8.367420415" observedRunningTime="2026-03-02 12:56:22.083368423 +0000 UTC m=+9.131045605" watchObservedRunningTime="2026-03-02 12:56:22.095342736 +0000 UTC m=+9.143019918" Mar 2 12:56:26.603556 sudo[2335]: pam_unix(sudo:session): session closed for user root Mar 2 12:56:26.683628 sshd[2334]: Connection closed by 10.200.16.10 port 41028 Mar 2 12:56:26.684284 sshd-session[2331]: pam_unix(sshd:session): session closed for user core Mar 2 12:56:26.690643 systemd[1]: sshd@6-10.200.20.16:22-10.200.16.10:41028.service: Deactivated successfully. Mar 2 12:56:26.691534 systemd-logind[1862]: Session 9 logged out. Waiting for processes to exit. Mar 2 12:56:26.694242 systemd[1]: session-9.scope: Deactivated successfully. Mar 2 12:56:26.696293 systemd[1]: session-9.scope: Consumed 4.006s CPU time, 227M memory peak. Mar 2 12:56:26.703181 systemd-logind[1862]: Removed session 9. Mar 2 12:56:30.955591 systemd[1]: Created slice kubepods-besteffort-pod4a955df2_2493_4c75_9918_23d0517c1af8.slice - libcontainer container kubepods-besteffort-pod4a955df2_2493_4c75_9918_23d0517c1af8.slice. 
Mar 2 12:56:31.023043 kubelet[3314]: I0302 12:56:31.022757 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a955df2-2493-4c75-9918-23d0517c1af8-tigera-ca-bundle\") pod \"calico-typha-76dc88c875-g2n92\" (UID: \"4a955df2-2493-4c75-9918-23d0517c1af8\") " pod="calico-system/calico-typha-76dc88c875-g2n92" Mar 2 12:56:31.023043 kubelet[3314]: I0302 12:56:31.022798 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrck5\" (UniqueName: \"kubernetes.io/projected/4a955df2-2493-4c75-9918-23d0517c1af8-kube-api-access-wrck5\") pod \"calico-typha-76dc88c875-g2n92\" (UID: \"4a955df2-2493-4c75-9918-23d0517c1af8\") " pod="calico-system/calico-typha-76dc88c875-g2n92" Mar 2 12:56:31.023043 kubelet[3314]: I0302 12:56:31.022814 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4a955df2-2493-4c75-9918-23d0517c1af8-typha-certs\") pod \"calico-typha-76dc88c875-g2n92\" (UID: \"4a955df2-2493-4c75-9918-23d0517c1af8\") " pod="calico-system/calico-typha-76dc88c875-g2n92" Mar 2 12:56:31.060302 systemd[1]: Created slice kubepods-besteffort-pod9aca326c_c354_4822_ac22_d66820f46931.slice - libcontainer container kubepods-besteffort-pod9aca326c_c354_4822_ac22_d66820f46931.slice. 
Mar 2 12:56:31.124048 kubelet[3314]: I0302 12:56:31.123996 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-policysync\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124048 kubelet[3314]: I0302 12:56:31.124047 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-cni-bin-dir\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124048 kubelet[3314]: I0302 12:56:31.124059 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-var-run-calico\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124258 kubelet[3314]: I0302 12:56:31.124070 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-cni-log-dir\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124258 kubelet[3314]: I0302 12:56:31.124080 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-lib-modules\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124258 kubelet[3314]: I0302 12:56:31.124089 3314 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-sys-fs\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124258 kubelet[3314]: I0302 12:56:31.124100 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9aca326c-c354-4822-ac22-d66820f46931-node-certs\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124258 kubelet[3314]: I0302 12:56:31.124120 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-xtables-lock\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124339 kubelet[3314]: I0302 12:56:31.124135 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-nodeproc\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124339 kubelet[3314]: I0302 12:56:31.124154 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-cni-net-dir\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124339 kubelet[3314]: I0302 12:56:31.124169 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-flexvol-driver-host\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124339 kubelet[3314]: I0302 12:56:31.124178 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-var-lib-calico\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124339 kubelet[3314]: I0302 12:56:31.124212 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aca326c-c354-4822-ac22-d66820f46931-tigera-ca-bundle\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124413 kubelet[3314]: I0302 12:56:31.124258 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxcs\" (UniqueName: \"kubernetes.io/projected/9aca326c-c354-4822-ac22-d66820f46931-kube-api-access-8kxcs\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.124413 kubelet[3314]: I0302 12:56:31.124277 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/9aca326c-c354-4822-ac22-d66820f46931-bpffs\") pod \"calico-node-h894f\" (UID: \"9aca326c-c354-4822-ac22-d66820f46931\") " pod="calico-system/calico-node-h894f" Mar 2 12:56:31.170790 kubelet[3314]: E0302 12:56:31.170441 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:31.226367 kubelet[3314]: I0302 12:56:31.225053 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfg9v\" (UniqueName: \"kubernetes.io/projected/4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f-kube-api-access-gfg9v\") pod \"csi-node-driver-86mlk\" (UID: \"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f\") " pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:31.226639 kubelet[3314]: I0302 12:56:31.226618 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f-varrun\") pod \"csi-node-driver-86mlk\" (UID: \"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f\") " pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:31.226942 kubelet[3314]: I0302 12:56:31.226841 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f-socket-dir\") pod \"csi-node-driver-86mlk\" (UID: \"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f\") " pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:31.227164 kubelet[3314]: I0302 12:56:31.227063 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f-kubelet-dir\") pod \"csi-node-driver-86mlk\" (UID: \"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f\") " pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:31.228005 kubelet[3314]: I0302 12:56:31.227894 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f-registration-dir\") pod \"csi-node-driver-86mlk\" 
(UID: \"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f\") " pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:31.229472 kubelet[3314]: E0302 12:56:31.229367 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.229704 kubelet[3314]: W0302 12:56:31.229640 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.229704 kubelet[3314]: E0302 12:56:31.229677 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.230354 kubelet[3314]: E0302 12:56:31.230257 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.230598 kubelet[3314]: W0302 12:56:31.230532 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.230598 kubelet[3314]: E0302 12:56:31.230551 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.231219 kubelet[3314]: E0302 12:56:31.231108 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.231219 kubelet[3314]: W0302 12:56:31.231122 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.231631 kubelet[3314]: E0302 12:56:31.231542 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.232338 kubelet[3314]: E0302 12:56:31.232325 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.233235 kubelet[3314]: W0302 12:56:31.232399 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.233235 kubelet[3314]: E0302 12:56:31.232413 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.233532 kubelet[3314]: E0302 12:56:31.233492 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.233532 kubelet[3314]: W0302 12:56:31.233512 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.233532 kubelet[3314]: E0302 12:56:31.233522 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.234457 kubelet[3314]: E0302 12:56:31.234270 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.234457 kubelet[3314]: W0302 12:56:31.234283 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.234457 kubelet[3314]: E0302 12:56:31.234293 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.234766 kubelet[3314]: E0302 12:56:31.234732 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.234981 kubelet[3314]: W0302 12:56:31.234911 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.234981 kubelet[3314]: E0302 12:56:31.234928 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.235494 kubelet[3314]: E0302 12:56:31.235394 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.235494 kubelet[3314]: W0302 12:56:31.235414 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.235494 kubelet[3314]: E0302 12:56:31.235428 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.235981 kubelet[3314]: E0302 12:56:31.235904 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.235981 kubelet[3314]: W0302 12:56:31.235916 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.235981 kubelet[3314]: E0302 12:56:31.235926 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.236379 kubelet[3314]: E0302 12:56:31.236315 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.236379 kubelet[3314]: W0302 12:56:31.236327 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.236379 kubelet[3314]: E0302 12:56:31.236336 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.236807 kubelet[3314]: E0302 12:56:31.236734 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.236807 kubelet[3314]: W0302 12:56:31.236744 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.236807 kubelet[3314]: E0302 12:56:31.236754 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.237051 kubelet[3314]: E0302 12:56:31.237030 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.237244 kubelet[3314]: W0302 12:56:31.237110 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.237244 kubelet[3314]: E0302 12:56:31.237132 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.237404 kubelet[3314]: E0302 12:56:31.237382 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.237459 kubelet[3314]: W0302 12:56:31.237450 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.237589 kubelet[3314]: E0302 12:56:31.237519 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.237843 kubelet[3314]: E0302 12:56:31.237832 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.237980 kubelet[3314]: W0302 12:56:31.237876 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.237980 kubelet[3314]: E0302 12:56:31.237886 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.238381 kubelet[3314]: E0302 12:56:31.238260 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.238381 kubelet[3314]: W0302 12:56:31.238274 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.238381 kubelet[3314]: E0302 12:56:31.238283 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.238701 kubelet[3314]: E0302 12:56:31.238646 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.238866 kubelet[3314]: W0302 12:56:31.238773 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.238866 kubelet[3314]: E0302 12:56:31.238793 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.239144 kubelet[3314]: E0302 12:56:31.239086 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.239144 kubelet[3314]: W0302 12:56:31.239098 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.239144 kubelet[3314]: E0302 12:56:31.239107 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.239489 kubelet[3314]: E0302 12:56:31.239427 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.239489 kubelet[3314]: W0302 12:56:31.239439 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.239489 kubelet[3314]: E0302 12:56:31.239449 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.239747 kubelet[3314]: E0302 12:56:31.239713 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.239873 kubelet[3314]: W0302 12:56:31.239724 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.239873 kubelet[3314]: E0302 12:56:31.239803 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.240078 kubelet[3314]: E0302 12:56:31.240067 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.240224 kubelet[3314]: W0302 12:56:31.240139 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.240224 kubelet[3314]: E0302 12:56:31.240179 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.240522 kubelet[3314]: E0302 12:56:31.240430 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.240522 kubelet[3314]: W0302 12:56:31.240441 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.240522 kubelet[3314]: E0302 12:56:31.240451 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.240728 kubelet[3314]: E0302 12:56:31.240664 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.240728 kubelet[3314]: W0302 12:56:31.240674 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.240728 kubelet[3314]: E0302 12:56:31.240683 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.241262 kubelet[3314]: E0302 12:56:31.241232 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.241413 kubelet[3314]: W0302 12:56:31.241336 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.241413 kubelet[3314]: E0302 12:56:31.241353 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.241647 kubelet[3314]: E0302 12:56:31.241625 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.241729 kubelet[3314]: W0302 12:56:31.241706 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.241785 kubelet[3314]: E0302 12:56:31.241775 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.242105 kubelet[3314]: E0302 12:56:31.242093 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.242269 kubelet[3314]: W0302 12:56:31.242197 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.242269 kubelet[3314]: E0302 12:56:31.242213 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.242482 kubelet[3314]: E0302 12:56:31.242472 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.242622 kubelet[3314]: W0302 12:56:31.242540 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.242622 kubelet[3314]: E0302 12:56:31.242554 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.242866 kubelet[3314]: E0302 12:56:31.242796 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.242866 kubelet[3314]: W0302 12:56:31.242807 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.242866 kubelet[3314]: E0302 12:56:31.242815 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.243574 kubelet[3314]: E0302 12:56:31.243096 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.243828 kubelet[3314]: W0302 12:56:31.243662 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.243828 kubelet[3314]: E0302 12:56:31.243684 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.244172 kubelet[3314]: E0302 12:56:31.244103 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.244172 kubelet[3314]: W0302 12:56:31.244114 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.244172 kubelet[3314]: E0302 12:56:31.244124 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.244450 kubelet[3314]: E0302 12:56:31.244393 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.244450 kubelet[3314]: W0302 12:56:31.244404 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.244450 kubelet[3314]: E0302 12:56:31.244416 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.244850 kubelet[3314]: E0302 12:56:31.244788 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.244850 kubelet[3314]: W0302 12:56:31.244800 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.244850 kubelet[3314]: E0302 12:56:31.244810 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.245410 kubelet[3314]: E0302 12:56:31.245248 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.245410 kubelet[3314]: W0302 12:56:31.245347 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.245410 kubelet[3314]: E0302 12:56:31.245361 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.245949 kubelet[3314]: E0302 12:56:31.245785 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.245949 kubelet[3314]: W0302 12:56:31.245797 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.245949 kubelet[3314]: E0302 12:56:31.245807 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.246342 kubelet[3314]: E0302 12:56:31.246292 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.246342 kubelet[3314]: W0302 12:56:31.246310 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.246342 kubelet[3314]: E0302 12:56:31.246321 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.246660 kubelet[3314]: E0302 12:56:31.246648 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.246797 kubelet[3314]: W0302 12:56:31.246729 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.246797 kubelet[3314]: E0302 12:56:31.246744 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.246962 kubelet[3314]: E0302 12:56:31.246953 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.247228 kubelet[3314]: W0302 12:56:31.247077 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.247228 kubelet[3314]: E0302 12:56:31.247096 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.248103 kubelet[3314]: E0302 12:56:31.248000 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.248103 kubelet[3314]: W0302 12:56:31.248013 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.248103 kubelet[3314]: E0302 12:56:31.248023 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.248534 kubelet[3314]: E0302 12:56:31.248514 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.248701 kubelet[3314]: W0302 12:56:31.248642 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.248701 kubelet[3314]: E0302 12:56:31.248660 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.256333 kubelet[3314]: E0302 12:56:31.256258 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.256333 kubelet[3314]: W0302 12:56:31.256276 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.256333 kubelet[3314]: E0302 12:56:31.256290 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.260059 containerd[1885]: time="2026-03-02T12:56:31.259969977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76dc88c875-g2n92,Uid:4a955df2-2493-4c75-9918-23d0517c1af8,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:31.308645 containerd[1885]: time="2026-03-02T12:56:31.308425220Z" level=info msg="connecting to shim 55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84" address="unix:///run/containerd/s/c73ca99b9e1a9ef04e54053e45d71e9031bf5299c3e0cfd434299705a2916604" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:31.327328 systemd[1]: Started cri-containerd-55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84.scope - libcontainer container 55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84. 
Mar 2 12:56:31.329892 kubelet[3314]: E0302 12:56:31.329740 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.329892 kubelet[3314]: W0302 12:56:31.329889 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.330215 kubelet[3314]: E0302 12:56:31.329909 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.330358 kubelet[3314]: E0302 12:56:31.330344 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.330580 kubelet[3314]: W0302 12:56:31.330480 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.330580 kubelet[3314]: E0302 12:56:31.330500 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.331059 kubelet[3314]: E0302 12:56:31.330947 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.331059 kubelet[3314]: W0302 12:56:31.330961 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.331059 kubelet[3314]: E0302 12:56:31.330972 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.331338 kubelet[3314]: E0302 12:56:31.331203 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.331338 kubelet[3314]: W0302 12:56:31.331213 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.331338 kubelet[3314]: E0302 12:56:31.331225 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.331578 kubelet[3314]: E0302 12:56:31.331483 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.331578 kubelet[3314]: W0302 12:56:31.331495 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.331578 kubelet[3314]: E0302 12:56:31.331504 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.331735 kubelet[3314]: E0302 12:56:31.331724 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.331792 kubelet[3314]: W0302 12:56:31.331783 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.331836 kubelet[3314]: E0302 12:56:31.331825 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.332041 kubelet[3314]: E0302 12:56:31.332030 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.332165 kubelet[3314]: W0302 12:56:31.332099 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.332165 kubelet[3314]: E0302 12:56:31.332116 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.332475 kubelet[3314]: E0302 12:56:31.332453 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.332475 kubelet[3314]: W0302 12:56:31.332472 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.332525 kubelet[3314]: E0302 12:56:31.332486 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.332983 kubelet[3314]: E0302 12:56:31.332963 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.332983 kubelet[3314]: W0302 12:56:31.332981 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.333042 kubelet[3314]: E0302 12:56:31.332993 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.333453 kubelet[3314]: E0302 12:56:31.333436 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.333453 kubelet[3314]: W0302 12:56:31.333450 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.333574 kubelet[3314]: E0302 12:56:31.333555 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.334049 kubelet[3314]: E0302 12:56:31.334029 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.334049 kubelet[3314]: W0302 12:56:31.334046 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.334093 kubelet[3314]: E0302 12:56:31.334057 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.334482 kubelet[3314]: E0302 12:56:31.334464 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.334482 kubelet[3314]: W0302 12:56:31.334479 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.334534 kubelet[3314]: E0302 12:56:31.334492 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.335066 kubelet[3314]: E0302 12:56:31.335046 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.335066 kubelet[3314]: W0302 12:56:31.335063 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.335116 kubelet[3314]: E0302 12:56:31.335074 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.335534 kubelet[3314]: E0302 12:56:31.335514 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.335534 kubelet[3314]: W0302 12:56:31.335530 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.335595 kubelet[3314]: E0302 12:56:31.335539 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.335901 kubelet[3314]: E0302 12:56:31.335883 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.335931 kubelet[3314]: W0302 12:56:31.335901 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.335931 kubelet[3314]: E0302 12:56:31.335912 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.336310 kubelet[3314]: E0302 12:56:31.336292 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.336310 kubelet[3314]: W0302 12:56:31.336308 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.336359 kubelet[3314]: E0302 12:56:31.336318 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.336697 kubelet[3314]: E0302 12:56:31.336677 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.336697 kubelet[3314]: W0302 12:56:31.336694 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.336761 kubelet[3314]: E0302 12:56:31.336705 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.336982 kubelet[3314]: E0302 12:56:31.336964 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.337171 kubelet[3314]: W0302 12:56:31.336979 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.337202 kubelet[3314]: E0302 12:56:31.337176 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.337639 kubelet[3314]: E0302 12:56:31.337620 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.337639 kubelet[3314]: W0302 12:56:31.337636 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.337693 kubelet[3314]: E0302 12:56:31.337647 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.338210 kubelet[3314]: E0302 12:56:31.338176 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.338210 kubelet[3314]: W0302 12:56:31.338190 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.338210 kubelet[3314]: E0302 12:56:31.338201 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.339078 kubelet[3314]: E0302 12:56:31.339055 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.339078 kubelet[3314]: W0302 12:56:31.339072 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.339251 kubelet[3314]: E0302 12:56:31.339084 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.339548 kubelet[3314]: E0302 12:56:31.339522 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.339548 kubelet[3314]: W0302 12:56:31.339538 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.339548 kubelet[3314]: E0302 12:56:31.339550 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.339891 kubelet[3314]: E0302 12:56:31.339873 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.339891 kubelet[3314]: W0302 12:56:31.339888 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.339964 kubelet[3314]: E0302 12:56:31.339898 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.340469 kubelet[3314]: E0302 12:56:31.340442 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.340469 kubelet[3314]: W0302 12:56:31.340459 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.340575 kubelet[3314]: E0302 12:56:31.340474 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.341094 kubelet[3314]: E0302 12:56:31.341073 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.341094 kubelet[3314]: W0302 12:56:31.341090 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.341700 kubelet[3314]: E0302 12:56:31.341101 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:31.349302 kubelet[3314]: E0302 12:56:31.349227 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:31.349302 kubelet[3314]: W0302 12:56:31.349249 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:31.349302 kubelet[3314]: E0302 12:56:31.349265 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:31.364140 containerd[1885]: time="2026-03-02T12:56:31.364093075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h894f,Uid:9aca326c-c354-4822-ac22-d66820f46931,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:31.371952 containerd[1885]: time="2026-03-02T12:56:31.371881145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76dc88c875-g2n92,Uid:4a955df2-2493-4c75-9918-23d0517c1af8,Namespace:calico-system,Attempt:0,} returns sandbox id \"55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84\"" Mar 2 12:56:31.376178 containerd[1885]: time="2026-03-02T12:56:31.375169193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.3\"" Mar 2 12:56:31.407758 containerd[1885]: time="2026-03-02T12:56:31.407707917Z" level=info msg="connecting to shim 280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec" address="unix:///run/containerd/s/7cad2b59037163b0fd8ac608128a02922bab91c5f77c42b9590a2e36882238dd" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:56:31.426310 systemd[1]: Started cri-containerd-280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec.scope - libcontainer container 280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec. 
Mar 2 12:56:31.449321 containerd[1885]: time="2026-03-02T12:56:31.449276375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h894f,Uid:9aca326c-c354-4822-ac22-d66820f46931,Namespace:calico-system,Attempt:0,} returns sandbox id \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\"" Mar 2 12:56:32.662646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2918647249.mount: Deactivated successfully. Mar 2 12:56:33.036140 kubelet[3314]: E0302 12:56:33.035729 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:33.623483 containerd[1885]: time="2026-03-02T12:56:33.623434751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:33.626098 containerd[1885]: time="2026-03-02T12:56:33.626073045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.3: active requests=0, bytes read=33841852" Mar 2 12:56:33.629843 containerd[1885]: time="2026-03-02T12:56:33.629172777Z" level=info msg="ImageCreate event name:\"sha256:d28a261c14ff1c1c526940695055ffc414471b39d275a706eac99ccbbd5fdc62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:33.633066 containerd[1885]: time="2026-03-02T12:56:33.633032387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:3e62cf98a20c42a1786397d0192cfb639634ef95c6f463ab92f0439a5c1a4ae5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:33.633489 containerd[1885]: time="2026-03-02T12:56:33.633451727Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.3\" with image id 
\"sha256:d28a261c14ff1c1c526940695055ffc414471b39d275a706eac99ccbbd5fdc62\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:3e62cf98a20c42a1786397d0192cfb639634ef95c6f463ab92f0439a5c1a4ae5\", size \"33841706\" in 2.257132978s" Mar 2 12:56:33.633489 containerd[1885]: time="2026-03-02T12:56:33.633490552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.3\" returns image reference \"sha256:d28a261c14ff1c1c526940695055ffc414471b39d275a706eac99ccbbd5fdc62\"" Mar 2 12:56:33.634633 containerd[1885]: time="2026-03-02T12:56:33.634612082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\"" Mar 2 12:56:33.652936 containerd[1885]: time="2026-03-02T12:56:33.652898830Z" level=info msg="CreateContainer within sandbox \"55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 2 12:56:33.679172 containerd[1885]: time="2026-03-02T12:56:33.678778786Z" level=info msg="Container 9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:33.697742 containerd[1885]: time="2026-03-02T12:56:33.697630198Z" level=info msg="CreateContainer within sandbox \"55ff2ebe19f26a8082a74e562cbef80b403215b6478be25b3e536468ab469e84\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c\"" Mar 2 12:56:33.698355 containerd[1885]: time="2026-03-02T12:56:33.698329419Z" level=info msg="StartContainer for \"9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c\"" Mar 2 12:56:33.700413 containerd[1885]: time="2026-03-02T12:56:33.700351775Z" level=info msg="connecting to shim 9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c" address="unix:///run/containerd/s/c73ca99b9e1a9ef04e54053e45d71e9031bf5299c3e0cfd434299705a2916604" protocol=ttrpc version=3 Mar 2 
12:56:33.719283 systemd[1]: Started cri-containerd-9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c.scope - libcontainer container 9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c. Mar 2 12:56:33.755041 containerd[1885]: time="2026-03-02T12:56:33.755000013Z" level=info msg="StartContainer for \"9dd0501c4aaf735ee252d8c0e0145600450af4cdd1b042ae044602b3ec19ea5c\" returns successfully" Mar 2 12:56:34.135842 kubelet[3314]: E0302 12:56:34.135811 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.135842 kubelet[3314]: W0302 12:56:34.135835 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.135858 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.135986 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136405 kubelet[3314]: W0302 12:56:34.135992 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.136033 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.136141 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136405 kubelet[3314]: W0302 12:56:34.136170 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.136177 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.136285 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136405 kubelet[3314]: W0302 12:56:34.136290 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136405 kubelet[3314]: E0302 12:56:34.136295 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136389 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136653 kubelet[3314]: W0302 12:56:34.136393 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136399 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136474 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136653 kubelet[3314]: W0302 12:56:34.136479 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136483 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136561 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136653 kubelet[3314]: W0302 12:56:34.136566 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136653 kubelet[3314]: E0302 12:56:34.136570 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136671 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136909 kubelet[3314]: W0302 12:56:34.136677 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136682 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136775 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136909 kubelet[3314]: W0302 12:56:34.136780 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136785 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136860 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.136909 kubelet[3314]: W0302 12:56:34.136864 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.136909 kubelet[3314]: E0302 12:56:34.136868 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.136940 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.137122 kubelet[3314]: W0302 12:56:34.136944 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.136948 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.137018 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.137122 kubelet[3314]: W0302 12:56:34.137021 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.137026 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.137103 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.137122 kubelet[3314]: W0302 12:56:34.137107 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.137122 kubelet[3314]: E0302 12:56:34.137112 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.137350 kubelet[3314]: E0302 12:56:34.137213 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.137350 kubelet[3314]: W0302 12:56:34.137218 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.137350 kubelet[3314]: E0302 12:56:34.137223 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.137350 kubelet[3314]: E0302 12:56:34.137303 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.137350 kubelet[3314]: W0302 12:56:34.137307 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.137350 kubelet[3314]: E0302 12:56:34.137311 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.156029 kubelet[3314]: E0302 12:56:34.155998 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.156029 kubelet[3314]: W0302 12:56:34.156020 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.156328 kubelet[3314]: E0302 12:56:34.156043 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.156328 kubelet[3314]: E0302 12:56:34.156201 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.156328 kubelet[3314]: W0302 12:56:34.156208 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.156328 kubelet[3314]: E0302 12:56:34.156215 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.156578 kubelet[3314]: E0302 12:56:34.156546 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.156578 kubelet[3314]: W0302 12:56:34.156561 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.156730 kubelet[3314]: E0302 12:56:34.156638 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.156868 kubelet[3314]: E0302 12:56:34.156831 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.156868 kubelet[3314]: W0302 12:56:34.156843 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.156868 kubelet[3314]: E0302 12:56:34.156853 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.157142 kubelet[3314]: E0302 12:56:34.157111 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.157142 kubelet[3314]: W0302 12:56:34.157122 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.157142 kubelet[3314]: E0302 12:56:34.157132 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.157477 kubelet[3314]: E0302 12:56:34.157442 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.157477 kubelet[3314]: W0302 12:56:34.157455 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.157477 kubelet[3314]: E0302 12:56:34.157466 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.158040 kubelet[3314]: E0302 12:56:34.158027 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.158040 kubelet[3314]: W0302 12:56:34.158058 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.158040 kubelet[3314]: E0302 12:56:34.158070 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.158451 kubelet[3314]: E0302 12:56:34.158386 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.158451 kubelet[3314]: W0302 12:56:34.158400 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.158451 kubelet[3314]: E0302 12:56:34.158410 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.158707 kubelet[3314]: E0302 12:56:34.158660 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.158707 kubelet[3314]: W0302 12:56:34.158671 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.158707 kubelet[3314]: E0302 12:56:34.158680 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.158890 kubelet[3314]: E0302 12:56:34.158875 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.158890 kubelet[3314]: W0302 12:56:34.158886 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.158987 kubelet[3314]: E0302 12:56:34.158896 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.159013 kubelet[3314]: E0302 12:56:34.158996 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.159013 kubelet[3314]: W0302 12:56:34.159001 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.159013 kubelet[3314]: E0302 12:56:34.159007 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.159135 kubelet[3314]: E0302 12:56:34.159096 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.159135 kubelet[3314]: W0302 12:56:34.159102 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.159135 kubelet[3314]: E0302 12:56:34.159107 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.159299 kubelet[3314]: E0302 12:56:34.159283 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.159299 kubelet[3314]: W0302 12:56:34.159293 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.159464 kubelet[3314]: E0302 12:56:34.159302 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.159618 kubelet[3314]: E0302 12:56:34.159606 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.159681 kubelet[3314]: W0302 12:56:34.159671 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.159731 kubelet[3314]: E0302 12:56:34.159723 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.159986 kubelet[3314]: E0302 12:56:34.159939 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.159986 kubelet[3314]: W0302 12:56:34.159951 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.159986 kubelet[3314]: E0302 12:56:34.159961 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.160284 kubelet[3314]: E0302 12:56:34.160270 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.160501 kubelet[3314]: W0302 12:56:34.160347 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.160501 kubelet[3314]: E0302 12:56:34.160360 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.160581 kubelet[3314]: E0302 12:56:34.160565 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.160581 kubelet[3314]: W0302 12:56:34.160578 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.160622 kubelet[3314]: E0302 12:56:34.160587 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:56:34.160871 kubelet[3314]: E0302 12:56:34.160859 3314 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:56:34.160967 kubelet[3314]: W0302 12:56:34.160930 3314 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:56:34.160967 kubelet[3314]: E0302 12:56:34.160945 3314 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:56:34.828608 containerd[1885]: time="2026-03-02T12:56:34.828538458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:34.831416 containerd[1885]: time="2026-03-02T12:56:34.831377814Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3: active requests=0, bytes read=4456989" Mar 2 12:56:34.835135 containerd[1885]: time="2026-03-02T12:56:34.834698024Z" level=info msg="ImageCreate event name:\"sha256:3c477f840adeca332cbee81ef65da50ec7be99ded887a8de75d5cf25b896d6a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:34.838692 containerd[1885]: time="2026-03-02T12:56:34.838637188Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:6cdc6cc2f7cdcbd4bf2d9b6a59c03ed98b5c47f22e467d78b5c06e5fd7bff132\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:34.839139 containerd[1885]: time="2026-03-02T12:56:34.838941837Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" with image id \"sha256:3c477f840adeca332cbee81ef65da50ec7be99ded887a8de75d5cf25b896d6a9\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:6cdc6cc2f7cdcbd4bf2d9b6a59c03ed98b5c47f22e467d78b5c06e5fd7bff132\", size \"5854474\" in 1.204003498s" Mar 2 12:56:34.839139 containerd[1885]: time="2026-03-02T12:56:34.838977062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" returns image reference \"sha256:3c477f840adeca332cbee81ef65da50ec7be99ded887a8de75d5cf25b896d6a9\"" Mar 2 12:56:34.848646 containerd[1885]: time="2026-03-02T12:56:34.848601610Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 2 12:56:34.873831 containerd[1885]: time="2026-03-02T12:56:34.872313686Z" level=info msg="Container 9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:34.891695 containerd[1885]: time="2026-03-02T12:56:34.891647297Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b\"" Mar 2 12:56:34.892638 containerd[1885]: time="2026-03-02T12:56:34.892596157Z" level=info msg="StartContainer for \"9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b\"" Mar 2 12:56:34.894774 containerd[1885]: time="2026-03-02T12:56:34.894709636Z" level=info msg="connecting to shim 9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b" address="unix:///run/containerd/s/7cad2b59037163b0fd8ac608128a02922bab91c5f77c42b9590a2e36882238dd" protocol=ttrpc version=3 Mar 2 12:56:34.910315 systemd[1]: Started cri-containerd-9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b.scope - libcontainer container 9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b. 
Mar 2 12:56:34.973645 containerd[1885]: time="2026-03-02T12:56:34.973604629Z" level=info msg="StartContainer for \"9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b\" returns successfully" Mar 2 12:56:34.979397 systemd[1]: cri-containerd-9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b.scope: Deactivated successfully. Mar 2 12:56:34.983853 containerd[1885]: time="2026-03-02T12:56:34.983761729Z" level=info msg="received container exit event container_id:\"9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b\" id:\"9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b\" pid:4033 exited_at:{seconds:1772456194 nanos:983210889}" Mar 2 12:56:35.001248 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9d7c3a52d017ece78b7b94cda74170ab5f8c727024c84ca7e2a087fbd5afb92b-rootfs.mount: Deactivated successfully. Mar 2 12:56:35.036796 kubelet[3314]: E0302 12:56:35.036439 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:35.102326 kubelet[3314]: I0302 12:56:35.102207 3314 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 2 12:56:35.121173 kubelet[3314]: I0302 12:56:35.119199 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-76dc88c875-g2n92" podStartSLOduration=2.859590098 podStartE2EDuration="5.11918292s" podCreationTimestamp="2026-03-02 12:56:30 +0000 UTC" firstStartedPulling="2026-03-02 12:56:31.374822214 +0000 UTC m=+18.422499396" lastFinishedPulling="2026-03-02 12:56:33.634415036 +0000 UTC m=+20.682092218" observedRunningTime="2026-03-02 12:56:34.110310753 +0000 UTC m=+21.157987943" watchObservedRunningTime="2026-03-02 12:56:35.11918292 +0000 UTC 
m=+22.166860102" Mar 2 12:56:37.036525 kubelet[3314]: E0302 12:56:37.035791 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:37.110125 containerd[1885]: time="2026-03-02T12:56:37.110027741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.3\"" Mar 2 12:56:39.036815 kubelet[3314]: E0302 12:56:39.036396 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:41.037169 kubelet[3314]: E0302 12:56:41.036601 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:41.213259 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1435272778.mount: Deactivated successfully. 
Mar 2 12:56:41.422651 containerd[1885]: time="2026-03-02T12:56:41.422588095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:41.425693 containerd[1885]: time="2026-03-02T12:56:41.425551991Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.3: active requests=0, bytes read=153583198" Mar 2 12:56:41.428664 containerd[1885]: time="2026-03-02T12:56:41.428629835Z" level=info msg="ImageCreate event name:\"sha256:98788f64d6cabef718c2551eb8b42ec11d1bfaa912cfeb4f6bf240f79159575d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:41.433001 containerd[1885]: time="2026-03-02T12:56:41.432944028Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:c7aefc80042b94800407ab45640b59402d2897ae8755b9d8370516e7b0e404bc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:41.433417 containerd[1885]: time="2026-03-02T12:56:41.433227757Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.3\" with image id \"sha256:98788f64d6cabef718c2551eb8b42ec11d1bfaa912cfeb4f6bf240f79159575d\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:c7aefc80042b94800407ab45640b59402d2897ae8755b9d8370516e7b0e404bc\", size \"153583060\" in 4.32315991s" Mar 2 12:56:41.433417 containerd[1885]: time="2026-03-02T12:56:41.433256957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.3\" returns image reference \"sha256:98788f64d6cabef718c2551eb8b42ec11d1bfaa912cfeb4f6bf240f79159575d\"" Mar 2 12:56:41.441684 containerd[1885]: time="2026-03-02T12:56:41.441648112Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 2 12:56:41.468400 containerd[1885]: time="2026-03-02T12:56:41.468348109Z" level=info msg="Container 
4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:41.489216 containerd[1885]: time="2026-03-02T12:56:41.489140346Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52\"" Mar 2 12:56:41.490989 containerd[1885]: time="2026-03-02T12:56:41.490959392Z" level=info msg="StartContainer for \"4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52\"" Mar 2 12:56:41.492009 containerd[1885]: time="2026-03-02T12:56:41.491977455Z" level=info msg="connecting to shim 4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52" address="unix:///run/containerd/s/7cad2b59037163b0fd8ac608128a02922bab91c5f77c42b9590a2e36882238dd" protocol=ttrpc version=3 Mar 2 12:56:41.512311 systemd[1]: Started cri-containerd-4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52.scope - libcontainer container 4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52. Mar 2 12:56:41.578227 containerd[1885]: time="2026-03-02T12:56:41.578183933Z" level=info msg="StartContainer for \"4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52\" returns successfully" Mar 2 12:56:41.606752 systemd[1]: cri-containerd-4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52.scope: Deactivated successfully. 
Mar 2 12:56:41.609532 containerd[1885]: time="2026-03-02T12:56:41.609459635Z" level=info msg="received container exit event container_id:\"4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52\" id:\"4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52\" pid:4092 exited_at:{seconds:1772456201 nanos:609246181}" Mar 2 12:56:42.213309 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4ecdb2c356bd287e3992c871de42a4b29a480d855f2396c5955224ea02afbd52-rootfs.mount: Deactivated successfully. Mar 2 12:56:43.036610 kubelet[3314]: E0302 12:56:43.036383 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:44.125876 containerd[1885]: time="2026-03-02T12:56:44.125835342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.3\"" Mar 2 12:56:45.036173 kubelet[3314]: E0302 12:56:45.036057 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:46.679111 containerd[1885]: time="2026-03-02T12:56:46.679052059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:46.681816 containerd[1885]: time="2026-03-02T12:56:46.681781053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.3: active requests=0, bytes read=65998037" Mar 2 12:56:46.684650 containerd[1885]: time="2026-03-02T12:56:46.684620266Z" level=info msg="ImageCreate event 
name:\"sha256:2aba526dc0b0f95b83ab38a811f41d3daf3ec5ae8876bf273b65b9f142277231\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:46.689041 containerd[1885]: time="2026-03-02T12:56:46.689006189Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:c25deb6a4b79f5e595eb464adf9fb3735ea5623889e249d5b3efa0b42ffcbb47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:56:46.689583 containerd[1885]: time="2026-03-02T12:56:46.689443498Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.3\" with image id \"sha256:2aba526dc0b0f95b83ab38a811f41d3daf3ec5ae8876bf273b65b9f142277231\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:c25deb6a4b79f5e595eb464adf9fb3735ea5623889e249d5b3efa0b42ffcbb47\", size \"67395562\" in 2.563519626s" Mar 2 12:56:46.689583 containerd[1885]: time="2026-03-02T12:56:46.689472051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.3\" returns image reference \"sha256:2aba526dc0b0f95b83ab38a811f41d3daf3ec5ae8876bf273b65b9f142277231\"" Mar 2 12:56:46.698651 containerd[1885]: time="2026-03-02T12:56:46.698612092Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 2 12:56:46.722596 containerd[1885]: time="2026-03-02T12:56:46.722554863Z" level=info msg="Container af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:46.723917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3267487715.mount: Deactivated successfully. 
Mar 2 12:56:46.745327 containerd[1885]: time="2026-03-02T12:56:46.745277149Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23\"" Mar 2 12:56:46.745997 containerd[1885]: time="2026-03-02T12:56:46.745963610Z" level=info msg="StartContainer for \"af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23\"" Mar 2 12:56:46.747441 containerd[1885]: time="2026-03-02T12:56:46.747418365Z" level=info msg="connecting to shim af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23" address="unix:///run/containerd/s/7cad2b59037163b0fd8ac608128a02922bab91c5f77c42b9590a2e36882238dd" protocol=ttrpc version=3 Mar 2 12:56:46.767323 systemd[1]: Started cri-containerd-af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23.scope - libcontainer container af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23. 
Mar 2 12:56:46.824812 containerd[1885]: time="2026-03-02T12:56:46.824765051Z" level=info msg="StartContainer for \"af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23\" returns successfully" Mar 2 12:56:47.036458 kubelet[3314]: E0302 12:56:47.036086 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:48.861605 containerd[1885]: time="2026-03-02T12:56:48.861543366Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 2 12:56:48.864282 systemd[1]: cri-containerd-af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23.scope: Deactivated successfully. Mar 2 12:56:48.864790 systemd[1]: cri-containerd-af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23.scope: Consumed 358ms CPU time, 188.5M memory peak, 244K read from disk, 171.3M written to disk. Mar 2 12:56:48.867190 containerd[1885]: time="2026-03-02T12:56:48.866134948Z" level=info msg="received container exit event container_id:\"af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23\" id:\"af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23\" pid:4149 exited_at:{seconds:1772456208 nanos:865928374}" Mar 2 12:56:48.884868 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-af1262e32a222224f599e7ca4f1dc944bf244695393a6247fb88cd4250faeb23-rootfs.mount: Deactivated successfully. 
Mar 2 12:56:48.944168 kubelet[3314]: I0302 12:56:48.943051 3314 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 2 12:56:48.985121 systemd[1]: Created slice kubepods-besteffort-pod9d59030b_e361_4da2_a1e1_9c6d616c14df.slice - libcontainer container kubepods-besteffort-pod9d59030b_e361_4da2_a1e1_9c6d616c14df.slice. Mar 2 12:56:48.999199 systemd[1]: Created slice kubepods-besteffort-podde3e0983_3c6e_42d1_bdf9_b39c3e0a5faf.slice - libcontainer container kubepods-besteffort-podde3e0983_3c6e_42d1_bdf9_b39c3e0a5faf.slice. Mar 2 12:56:49.004782 systemd[1]: Created slice kubepods-besteffort-pod82b77641_f21a_43bf_bab0_caae6ec56911.slice - libcontainer container kubepods-besteffort-pod82b77641_f21a_43bf_bab0_caae6ec56911.slice. Mar 2 12:56:49.011331 kubelet[3314]: I0302 12:56:49.011298 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82b77641-f21a-43bf-bab0-caae6ec56911-tigera-ca-bundle\") pod \"calico-kube-controllers-6b74df9d96-k4hhf\" (UID: \"82b77641-f21a-43bf-bab0-caae6ec56911\") " pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" Mar 2 12:56:49.011331 kubelet[3314]: I0302 12:56:49.011331 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llvqh\" (UniqueName: \"kubernetes.io/projected/9d59030b-e361-4da2-a1e1-9c6d616c14df-kube-api-access-llvqh\") pod \"whisker-cb7c4b8dd-jxd4t\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") " pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.011611 kubelet[3314]: I0302 12:56:49.011348 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5a505ea2-0da7-4eaf-9451-354939c31e56-calico-apiserver-certs\") pod \"calico-apiserver-5cddb65cc8-c59pl\" (UID: \"5a505ea2-0da7-4eaf-9451-354939c31e56\") " 
pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" Mar 2 12:56:49.011611 kubelet[3314]: I0302 12:56:49.011360 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/018390e1-07bc-43c8-afb7-0f774f706259-goldmane-key-pair\") pod \"goldmane-9566f57b5-s9xt2\" (UID: \"018390e1-07bc-43c8-afb7-0f774f706259\") " pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.011611 kubelet[3314]: I0302 12:56:49.011373 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-ca-bundle\") pod \"whisker-cb7c4b8dd-jxd4t\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") " pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.011611 kubelet[3314]: I0302 12:56:49.011384 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf-calico-apiserver-certs\") pod \"calico-apiserver-5cddb65cc8-jz2bw\" (UID: \"de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf\") " pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" Mar 2 12:56:49.011611 kubelet[3314]: I0302 12:56:49.011395 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zpx\" (UniqueName: \"kubernetes.io/projected/de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf-kube-api-access-d9zpx\") pod \"calico-apiserver-5cddb65cc8-jz2bw\" (UID: \"de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf\") " pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" Mar 2 12:56:49.012301 kubelet[3314]: I0302 12:56:49.011408 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-backend-key-pair\") pod \"whisker-cb7c4b8dd-jxd4t\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") " pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.012301 kubelet[3314]: I0302 12:56:49.011419 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xt5s\" (UniqueName: \"kubernetes.io/projected/5a505ea2-0da7-4eaf-9451-354939c31e56-kube-api-access-5xt5s\") pod \"calico-apiserver-5cddb65cc8-c59pl\" (UID: \"5a505ea2-0da7-4eaf-9451-354939c31e56\") " pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" Mar 2 12:56:49.012301 kubelet[3314]: I0302 12:56:49.011429 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018390e1-07bc-43c8-afb7-0f774f706259-goldmane-ca-bundle\") pod \"goldmane-9566f57b5-s9xt2\" (UID: \"018390e1-07bc-43c8-afb7-0f774f706259\") " pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.012301 kubelet[3314]: I0302 12:56:49.011445 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-nginx-config\") pod \"whisker-cb7c4b8dd-jxd4t\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") " pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.012301 kubelet[3314]: I0302 12:56:49.011455 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxh8\" (UniqueName: \"kubernetes.io/projected/82b77641-f21a-43bf-bab0-caae6ec56911-kube-api-access-8nxh8\") pod \"calico-kube-controllers-6b74df9d96-k4hhf\" (UID: \"82b77641-f21a-43bf-bab0-caae6ec56911\") " pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" Mar 2 12:56:49.012386 kubelet[3314]: I0302 12:56:49.011465 3314 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018390e1-07bc-43c8-afb7-0f774f706259-config\") pod \"goldmane-9566f57b5-s9xt2\" (UID: \"018390e1-07bc-43c8-afb7-0f774f706259\") " pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.012386 kubelet[3314]: I0302 12:56:49.011478 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46c8\" (UniqueName: \"kubernetes.io/projected/018390e1-07bc-43c8-afb7-0f774f706259-kube-api-access-r46c8\") pod \"goldmane-9566f57b5-s9xt2\" (UID: \"018390e1-07bc-43c8-afb7-0f774f706259\") " pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.017237 systemd[1]: Created slice kubepods-besteffort-pod5a505ea2_0da7_4eaf_9451_354939c31e56.slice - libcontainer container kubepods-besteffort-pod5a505ea2_0da7_4eaf_9451_354939c31e56.slice. Mar 2 12:56:49.024920 systemd[1]: Created slice kubepods-besteffort-pod018390e1_07bc_43c8_afb7_0f774f706259.slice - libcontainer container kubepods-besteffort-pod018390e1_07bc_43c8_afb7_0f774f706259.slice. Mar 2 12:56:49.029470 systemd[1]: Created slice kubepods-burstable-poda9b47b71_026d_450a_8133_78f1e1d6309a.slice - libcontainer container kubepods-burstable-poda9b47b71_026d_450a_8133_78f1e1d6309a.slice. Mar 2 12:56:49.038058 systemd[1]: Created slice kubepods-burstable-podad0dd550_658f_4383_892c_c32b6335909f.slice - libcontainer container kubepods-burstable-podad0dd550_658f_4383_892c_c32b6335909f.slice. Mar 2 12:56:49.045015 systemd[1]: Created slice kubepods-besteffort-pod4ed1d8a5_5e5e_4b7e_b72f_84fd1fe8207f.slice - libcontainer container kubepods-besteffort-pod4ed1d8a5_5e5e_4b7e_b72f_84fd1fe8207f.slice. 
Mar 2 12:56:49.047628 containerd[1885]: time="2026-03-02T12:56:49.047595788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86mlk,Uid:4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.153695 containerd[1885]: time="2026-03-02T12:56:49.152591790Z" level=error msg="Failed to destroy network for sandbox \"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.158438 containerd[1885]: time="2026-03-02T12:56:49.157673434Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86mlk,Uid:4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.159801 kubelet[3314]: E0302 12:56:49.159661 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.159801 kubelet[3314]: E0302 12:56:49.159715 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:49.159801 kubelet[3314]: E0302 12:56:49.159730 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-86mlk" Mar 2 12:56:49.159939 kubelet[3314]: E0302 12:56:49.159764 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-86mlk_calico-system(4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-86mlk_calico-system(4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93217cae47ef5b33ef02833ea96d058d2314574a2bd0e64b6cc724131f308448\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-86mlk" podUID="4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f" Mar 2 12:56:49.169076 containerd[1885]: time="2026-03-02T12:56:49.168937417Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 2 12:56:49.220355 kubelet[3314]: I0302 12:56:49.219901 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn8q\" (UniqueName: \"kubernetes.io/projected/a9b47b71-026d-450a-8133-78f1e1d6309a-kube-api-access-4dn8q\") pod \"coredns-674b8bbfcf-lqjzn\" (UID: 
\"a9b47b71-026d-450a-8133-78f1e1d6309a\") " pod="kube-system/coredns-674b8bbfcf-lqjzn" Mar 2 12:56:49.220514 kubelet[3314]: I0302 12:56:49.220374 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9b47b71-026d-450a-8133-78f1e1d6309a-config-volume\") pod \"coredns-674b8bbfcf-lqjzn\" (UID: \"a9b47b71-026d-450a-8133-78f1e1d6309a\") " pod="kube-system/coredns-674b8bbfcf-lqjzn" Mar 2 12:56:49.220514 kubelet[3314]: I0302 12:56:49.220401 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6vd\" (UniqueName: \"kubernetes.io/projected/ad0dd550-658f-4383-892c-c32b6335909f-kube-api-access-4z6vd\") pod \"coredns-674b8bbfcf-vfq4x\" (UID: \"ad0dd550-658f-4383-892c-c32b6335909f\") " pod="kube-system/coredns-674b8bbfcf-vfq4x" Mar 2 12:56:49.220514 kubelet[3314]: I0302 12:56:49.220418 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad0dd550-658f-4383-892c-c32b6335909f-config-volume\") pod \"coredns-674b8bbfcf-vfq4x\" (UID: \"ad0dd550-658f-4383-892c-c32b6335909f\") " pod="kube-system/coredns-674b8bbfcf-vfq4x" Mar 2 12:56:49.268760 containerd[1885]: time="2026-03-02T12:56:49.268669122Z" level=info msg="Container 84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:56:49.288061 containerd[1885]: time="2026-03-02T12:56:49.288011060Z" level=info msg="CreateContainer within sandbox \"280aa07d3af4f95b9587c658db7c3379df72d59db7c2ab8efb9487eba55d9bec\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de\"" Mar 2 12:56:49.288785 containerd[1885]: time="2026-03-02T12:56:49.288730705Z" level=info msg="StartContainer for 
\"84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de\"" Mar 2 12:56:49.290054 containerd[1885]: time="2026-03-02T12:56:49.290024215Z" level=info msg="connecting to shim 84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de" address="unix:///run/containerd/s/7cad2b59037163b0fd8ac608128a02922bab91c5f77c42b9590a2e36882238dd" protocol=ttrpc version=3 Mar 2 12:56:49.291929 containerd[1885]: time="2026-03-02T12:56:49.291890581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cb7c4b8dd-jxd4t,Uid:9d59030b-e361-4da2-a1e1-9c6d616c14df,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.305342 systemd[1]: Started cri-containerd-84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de.scope - libcontainer container 84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de. Mar 2 12:56:49.308058 containerd[1885]: time="2026-03-02T12:56:49.307967592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-jz2bw,Uid:de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.313693 containerd[1885]: time="2026-03-02T12:56:49.313184128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b74df9d96-k4hhf,Uid:82b77641-f21a-43bf-bab0-caae6ec56911,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.324667 containerd[1885]: time="2026-03-02T12:56:49.324625644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-c59pl,Uid:5a505ea2-0da7-4eaf-9451-354939c31e56,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.329660 containerd[1885]: time="2026-03-02T12:56:49.329628781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-s9xt2,Uid:018390e1-07bc-43c8-afb7-0f774f706259,Namespace:calico-system,Attempt:0,}" Mar 2 12:56:49.334029 containerd[1885]: time="2026-03-02T12:56:49.334001324Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-lqjzn,Uid:a9b47b71-026d-450a-8133-78f1e1d6309a,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:49.345511 containerd[1885]: time="2026-03-02T12:56:49.345471322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfq4x,Uid:ad0dd550-658f-4383-892c-c32b6335909f,Namespace:kube-system,Attempt:0,}" Mar 2 12:56:49.402060 containerd[1885]: time="2026-03-02T12:56:49.402015236Z" level=error msg="Failed to destroy network for sandbox \"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.410016 containerd[1885]: time="2026-03-02T12:56:49.409558015Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cb7c4b8dd-jxd4t,Uid:9d59030b-e361-4da2-a1e1-9c6d616c14df,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.410599 kubelet[3314]: E0302 12:56:49.410488 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.410599 kubelet[3314]: E0302 12:56:49.410546 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.410599 kubelet[3314]: E0302 12:56:49.410562 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-cb7c4b8dd-jxd4t" Mar 2 12:56:49.411012 kubelet[3314]: E0302 12:56:49.410855 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-cb7c4b8dd-jxd4t_calico-system(9d59030b-e361-4da2-a1e1-9c6d616c14df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-cb7c4b8dd-jxd4t_calico-system(9d59030b-e361-4da2-a1e1-9c6d616c14df)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"145efec6de60c306922e72d4c28da163c53947418daecab3765b5db33293a22c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-cb7c4b8dd-jxd4t" podUID="9d59030b-e361-4da2-a1e1-9c6d616c14df" Mar 2 12:56:49.427626 containerd[1885]: time="2026-03-02T12:56:49.427575003Z" level=error msg="Failed to destroy network for sandbox \"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.431412 
containerd[1885]: time="2026-03-02T12:56:49.431235437Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-jz2bw,Uid:de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.431539 kubelet[3314]: E0302 12:56:49.431452 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.431539 kubelet[3314]: E0302 12:56:49.431504 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" Mar 2 12:56:49.431539 kubelet[3314]: E0302 12:56:49.431521 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" Mar 2 12:56:49.431616 kubelet[3314]: E0302 12:56:49.431558 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cddb65cc8-jz2bw_calico-system(de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cddb65cc8-jz2bw_calico-system(de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b64534d0f2f6e2394eac7ebe2fd006c6ae2618c489fef4a66d768c1cddb7f513\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" podUID="de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf" Mar 2 12:56:49.448162 containerd[1885]: time="2026-03-02T12:56:49.448088943Z" level=info msg="StartContainer for \"84861ad3784e52cb957588fab38b4b5a383a4e83c48319c22fb9c55097aa14de\" returns successfully" Mar 2 12:56:49.485422 containerd[1885]: time="2026-03-02T12:56:49.485308760Z" level=error msg="Failed to destroy network for sandbox \"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.486764 containerd[1885]: time="2026-03-02T12:56:49.486679072Z" level=error msg="Failed to destroy network for sandbox \"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.489522 containerd[1885]: time="2026-03-02T12:56:49.489060461Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6b74df9d96-k4hhf,Uid:82b77641-f21a-43bf-bab0-caae6ec56911,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.490066 kubelet[3314]: E0302 12:56:49.489995 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.490374 kubelet[3314]: E0302 12:56:49.490182 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" Mar 2 12:56:49.490374 kubelet[3314]: E0302 12:56:49.490216 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" Mar 2 12:56:49.490374 kubelet[3314]: E0302 12:56:49.490265 
3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b74df9d96-k4hhf_calico-system(82b77641-f21a-43bf-bab0-caae6ec56911)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b74df9d96-k4hhf_calico-system(82b77641-f21a-43bf-bab0-caae6ec56911)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8bbb2ecf82a237b7ea7315f32d213e42e5934e7d3582c091d75bbb1d3c5d5fe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" podUID="82b77641-f21a-43bf-bab0-caae6ec56911" Mar 2 12:56:49.493383 containerd[1885]: time="2026-03-02T12:56:49.493351770Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfq4x,Uid:ad0dd550-658f-4383-892c-c32b6335909f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.493767 kubelet[3314]: E0302 12:56:49.493678 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.493767 kubelet[3314]: E0302 12:56:49.493720 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfq4x" Mar 2 12:56:49.493767 kubelet[3314]: E0302 12:56:49.493740 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfq4x" Mar 2 12:56:49.494036 kubelet[3314]: E0302 12:56:49.493780 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vfq4x_kube-system(ad0dd550-658f-4383-892c-c32b6335909f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vfq4x_kube-system(ad0dd550-658f-4383-892c-c32b6335909f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3111b65ba6061e1b0b9d3f8b3612759d65951ecb464bd277d80fa229651a29c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vfq4x" podUID="ad0dd550-658f-4383-892c-c32b6335909f" Mar 2 12:56:49.526073 containerd[1885]: time="2026-03-02T12:56:49.525712230Z" level=error msg="Failed to destroy network for sandbox \"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.531244 
containerd[1885]: time="2026-03-02T12:56:49.531141140Z" level=error msg="Failed to destroy network for sandbox \"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.532768 containerd[1885]: time="2026-03-02T12:56:49.532232139Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-s9xt2,Uid:018390e1-07bc-43c8-afb7-0f774f706259,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.532947 containerd[1885]: time="2026-03-02T12:56:49.532916671Z" level=error msg="Failed to destroy network for sandbox \"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.533558 kubelet[3314]: E0302 12:56:49.533431 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.533558 kubelet[3314]: E0302 12:56:49.533508 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.533558 kubelet[3314]: E0302 12:56:49.533533 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9566f57b5-s9xt2" Mar 2 12:56:49.533790 kubelet[3314]: E0302 12:56:49.533756 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9566f57b5-s9xt2_calico-system(018390e1-07bc-43c8-afb7-0f774f706259)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9566f57b5-s9xt2_calico-system(018390e1-07bc-43c8-afb7-0f774f706259)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a1eeecf8b530525aac2c6420dd42e24417941d7c89790070d2a37349d56321f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9566f57b5-s9xt2" podUID="018390e1-07bc-43c8-afb7-0f774f706259" Mar 2 12:56:49.535766 containerd[1885]: time="2026-03-02T12:56:49.535666607Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lqjzn,Uid:a9b47b71-026d-450a-8133-78f1e1d6309a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.536002 kubelet[3314]: E0302 12:56:49.535960 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.536072 kubelet[3314]: E0302 12:56:49.536012 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-lqjzn" Mar 2 12:56:49.536072 kubelet[3314]: E0302 12:56:49.536026 3314 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-lqjzn" Mar 2 12:56:49.536072 kubelet[3314]: E0302 12:56:49.536062 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-lqjzn_kube-system(a9b47b71-026d-450a-8133-78f1e1d6309a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-lqjzn_kube-system(a9b47b71-026d-450a-8133-78f1e1d6309a)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"424d13d87d55e79eda72bde283e6c2d716baad3089151eb68a83fa94d7a61ba6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-lqjzn" podUID="a9b47b71-026d-450a-8133-78f1e1d6309a" Mar 2 12:56:49.538726 containerd[1885]: time="2026-03-02T12:56:49.538661326Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-c59pl,Uid:5a505ea2-0da7-4eaf-9451-354939c31e56,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.539233 kubelet[3314]: E0302 12:56:49.538882 3314 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:56:49.539233 kubelet[3314]: E0302 12:56:49.538914 3314 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" Mar 2 12:56:49.539233 kubelet[3314]: E0302 12:56:49.538929 3314 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" Mar 2 12:56:49.539334 kubelet[3314]: E0302 12:56:49.538969 3314 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cddb65cc8-c59pl_calico-system(5a505ea2-0da7-4eaf-9451-354939c31e56)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cddb65cc8-c59pl_calico-system(5a505ea2-0da7-4eaf-9451-354939c31e56)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"16444a31f34105f63d34acdfc982c4814c669a48625268ff6d7b717d61143e81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" podUID="5a505ea2-0da7-4eaf-9451-354939c31e56" Mar 2 12:56:49.891054 systemd[1]: run-netns-cni\x2dde3b53f5\x2d5c11\x2db319\x2de254\x2de519688f4ccc.mount: Deactivated successfully. 
Mar 2 12:56:50.174237 kubelet[3314]: I0302 12:56:50.173473 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-h894f" podStartSLOduration=3.933967612 podStartE2EDuration="19.173421895s" podCreationTimestamp="2026-03-02 12:56:31 +0000 UTC" firstStartedPulling="2026-03-02 12:56:31.450736197 +0000 UTC m=+18.498413379" lastFinishedPulling="2026-03-02 12:56:46.69019048 +0000 UTC m=+33.737867662" observedRunningTime="2026-03-02 12:56:50.173118223 +0000 UTC m=+37.220795405" watchObservedRunningTime="2026-03-02 12:56:50.173421895 +0000 UTC m=+37.221122214"
Mar 2 12:56:50.226742 kubelet[3314]: I0302 12:56:50.226697 3314 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-ca-bundle\") pod \"9d59030b-e361-4da2-a1e1-9c6d616c14df\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") "
Mar 2 12:56:50.226742 kubelet[3314]: I0302 12:56:50.226748 3314 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llvqh\" (UniqueName: \"kubernetes.io/projected/9d59030b-e361-4da2-a1e1-9c6d616c14df-kube-api-access-llvqh\") pod \"9d59030b-e361-4da2-a1e1-9c6d616c14df\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") "
Mar 2 12:56:50.226934 kubelet[3314]: I0302 12:56:50.226764 3314 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-nginx-config\") pod \"9d59030b-e361-4da2-a1e1-9c6d616c14df\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") "
Mar 2 12:56:50.226934 kubelet[3314]: I0302 12:56:50.226778 3314 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-backend-key-pair\") pod \"9d59030b-e361-4da2-a1e1-9c6d616c14df\" (UID: \"9d59030b-e361-4da2-a1e1-9c6d616c14df\") "
Mar 2 12:56:50.228418 kubelet[3314]: I0302 12:56:50.228378 3314 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "9d59030b-e361-4da2-a1e1-9c6d616c14df" (UID: "9d59030b-e361-4da2-a1e1-9c6d616c14df"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 2 12:56:50.232211 kubelet[3314]: I0302 12:56:50.230139 3314 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9d59030b-e361-4da2-a1e1-9c6d616c14df" (UID: "9d59030b-e361-4da2-a1e1-9c6d616c14df"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 2 12:56:50.231602 systemd[1]: var-lib-kubelet-pods-9d59030b\x2de361\x2d4da2\x2da1e1\x2d9c6d616c14df-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dllvqh.mount: Deactivated successfully.
Mar 2 12:56:50.233552 kubelet[3314]: I0302 12:56:50.233340 3314 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d59030b-e361-4da2-a1e1-9c6d616c14df-kube-api-access-llvqh" (OuterVolumeSpecName: "kube-api-access-llvqh") pod "9d59030b-e361-4da2-a1e1-9c6d616c14df" (UID: "9d59030b-e361-4da2-a1e1-9c6d616c14df"). InnerVolumeSpecName "kube-api-access-llvqh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 2 12:56:50.235499 systemd[1]: var-lib-kubelet-pods-9d59030b\x2de361\x2d4da2\x2da1e1\x2d9c6d616c14df-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully.
Mar 2 12:56:50.236582 kubelet[3314]: I0302 12:56:50.236130 3314 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9d59030b-e361-4da2-a1e1-9c6d616c14df" (UID: "9d59030b-e361-4da2-a1e1-9c6d616c14df"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 2 12:56:50.327315 kubelet[3314]: I0302 12:56:50.327238 3314 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-nginx-config\") on node \"ci-4459.2.101-5c781fe851\" DevicePath \"\""
Mar 2 12:56:50.327315 kubelet[3314]: I0302 12:56:50.327276 3314 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-backend-key-pair\") on node \"ci-4459.2.101-5c781fe851\" DevicePath \"\""
Mar 2 12:56:50.327315 kubelet[3314]: I0302 12:56:50.327286 3314 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d59030b-e361-4da2-a1e1-9c6d616c14df-whisker-ca-bundle\") on node \"ci-4459.2.101-5c781fe851\" DevicePath \"\""
Mar 2 12:56:50.327315 kubelet[3314]: I0302 12:56:50.327292 3314 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llvqh\" (UniqueName: \"kubernetes.io/projected/9d59030b-e361-4da2-a1e1-9c6d616c14df-kube-api-access-llvqh\") on node \"ci-4459.2.101-5c781fe851\" DevicePath \"\""
Mar 2 12:56:51.041850 systemd[1]: Removed slice kubepods-besteffort-pod9d59030b_e361_4da2_a1e1_9c6d616c14df.slice - libcontainer container kubepods-besteffort-pod9d59030b_e361_4da2_a1e1_9c6d616c14df.slice.
Mar 2 12:56:51.244533 systemd[1]: Created slice kubepods-besteffort-pod5aad746b_4cff_4b60_a89c_283714c6a760.slice - libcontainer container kubepods-besteffort-pod5aad746b_4cff_4b60_a89c_283714c6a760.slice.
Mar 2 12:56:51.332508 kubelet[3314]: I0302 12:56:51.332327 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbb8d\" (UniqueName: \"kubernetes.io/projected/5aad746b-4cff-4b60-a89c-283714c6a760-kube-api-access-qbb8d\") pod \"whisker-6974bdbc44-tn2mh\" (UID: \"5aad746b-4cff-4b60-a89c-283714c6a760\") " pod="calico-system/whisker-6974bdbc44-tn2mh"
Mar 2 12:56:51.332508 kubelet[3314]: I0302 12:56:51.332388 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5aad746b-4cff-4b60-a89c-283714c6a760-whisker-backend-key-pair\") pod \"whisker-6974bdbc44-tn2mh\" (UID: \"5aad746b-4cff-4b60-a89c-283714c6a760\") " pod="calico-system/whisker-6974bdbc44-tn2mh"
Mar 2 12:56:51.332508 kubelet[3314]: I0302 12:56:51.332431 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad746b-4cff-4b60-a89c-283714c6a760-whisker-ca-bundle\") pod \"whisker-6974bdbc44-tn2mh\" (UID: \"5aad746b-4cff-4b60-a89c-283714c6a760\") " pod="calico-system/whisker-6974bdbc44-tn2mh"
Mar 2 12:56:51.332508 kubelet[3314]: I0302 12:56:51.332444 3314 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5aad746b-4cff-4b60-a89c-283714c6a760-nginx-config\") pod \"whisker-6974bdbc44-tn2mh\" (UID: \"5aad746b-4cff-4b60-a89c-283714c6a760\") " pod="calico-system/whisker-6974bdbc44-tn2mh"
Mar 2 12:56:51.397588 systemd-networkd[1497]: vxlan.calico: Link UP
Mar 2 12:56:51.397599 systemd-networkd[1497]: vxlan.calico: Gained carrier
Mar 2 12:56:51.548955 containerd[1885]: time="2026-03-02T12:56:51.548572055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6974bdbc44-tn2mh,Uid:5aad746b-4cff-4b60-a89c-283714c6a760,Namespace:calico-system,Attempt:0,}"
Mar 2 12:56:51.675443 systemd-networkd[1497]: calie9505b6d80c: Link UP
Mar 2 12:56:51.676308 systemd-networkd[1497]: calie9505b6d80c: Gained carrier
Mar 2 12:56:51.692726 containerd[1885]: 2026-03-02 12:56:51.607 [INFO][4644] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0 whisker-6974bdbc44- calico-system 5aad746b-4cff-4b60-a89c-283714c6a760 929 0 2026-03-02 12:56:51 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6974bdbc44 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 whisker-6974bdbc44-tn2mh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie9505b6d80c [] [] }} ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-"
Mar 2 12:56:51.692726 containerd[1885]: 2026-03-02 12:56:51.607 [INFO][4644] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.692726 containerd[1885]: 2026-03-02 12:56:51.635 [INFO][4655] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" HandleID="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Workload="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.642 [INFO][4655] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" HandleID="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Workload="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273460), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"whisker-6974bdbc44-tn2mh", "timestamp":"2026-03-02 12:56:51.635611936 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001ecf20)}
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.642 [INFO][4655] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.642 [INFO][4655] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.642 [INFO][4655] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851'
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.644 [INFO][4655] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.647 [INFO][4655] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.650 [INFO][4655] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.652 [INFO][4655] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.692926 containerd[1885]: 2026-03-02 12:56:51.653 [INFO][4655] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.653 [INFO][4655] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.654 [INFO][4655] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.659 [INFO][4655] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.668 [INFO][4655] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.70.1/26] block=192.168.70.0/26 handle="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.668 [INFO][4655] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.1/26] handle="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" host="ci-4459.2.101-5c781fe851"
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.668 [INFO][4655] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 2 12:56:51.694291 containerd[1885]: 2026-03-02 12:56:51.668 [INFO][4655] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.1/26] IPv6=[] ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" HandleID="k8s-pod-network.89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Workload="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.694424 containerd[1885]: 2026-03-02 12:56:51.671 [INFO][4644] cni-plugin/k8s.go 418: Populated endpoint ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0", GenerateName:"whisker-6974bdbc44-", Namespace:"calico-system", SelfLink:"", UID:"5aad746b-4cff-4b60-a89c-283714c6a760", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6974bdbc44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"whisker-6974bdbc44-tn2mh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9505b6d80c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:56:51.694424 containerd[1885]: 2026-03-02 12:56:51.671 [INFO][4644] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.1/32] ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.694487 containerd[1885]: 2026-03-02 12:56:51.671 [INFO][4644] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9505b6d80c ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.694487 containerd[1885]: 2026-03-02 12:56:51.676 [INFO][4644] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.694516 containerd[1885]: 2026-03-02 12:56:51.676 [INFO][4644] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0", GenerateName:"whisker-6974bdbc44-", Namespace:"calico-system", SelfLink:"", UID:"5aad746b-4cff-4b60-a89c-283714c6a760", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6974bdbc44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9", Pod:"whisker-6974bdbc44-tn2mh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9505b6d80c", MAC:"86:8d:83:08:b8:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:56:51.694548 containerd[1885]: 2026-03-02 12:56:51.689 [INFO][4644] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" Namespace="calico-system" Pod="whisker-6974bdbc44-tn2mh" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-whisker--6974bdbc44--tn2mh-eth0"
Mar 2 12:56:51.745313 containerd[1885]: time="2026-03-02T12:56:51.745266289Z" level=info msg="connecting to shim 89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9" address="unix:///run/containerd/s/f893e6103c9886d0ee76e9f05ab1e5cf65dc1bc764b437e7e774a913344b7488" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:56:51.764297 systemd[1]: Started cri-containerd-89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9.scope - libcontainer container 89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9.
Mar 2 12:56:51.799082 containerd[1885]: time="2026-03-02T12:56:51.799038595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6974bdbc44-tn2mh,Uid:5aad746b-4cff-4b60-a89c-283714c6a760,Namespace:calico-system,Attempt:0,} returns sandbox id \"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9\""
Mar 2 12:56:51.802508 containerd[1885]: time="2026-03-02T12:56:51.802467759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.3\""
Mar 2 12:56:52.664419 systemd-networkd[1497]: vxlan.calico: Gained IPv6LL
Mar 2 12:56:53.038906 kubelet[3314]: I0302 12:56:53.038691 3314 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d59030b-e361-4da2-a1e1-9c6d616c14df" path="/var/lib/kubelet/pods/9d59030b-e361-4da2-a1e1-9c6d616c14df/volumes"
Mar 2 12:56:53.221768 containerd[1885]: time="2026-03-02T12:56:53.221711087Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:53.225359 containerd[1885]: time="2026-03-02T12:56:53.225313440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.3: active requests=0, bytes read=5881068"
Mar 2 12:56:53.228818 containerd[1885]: time="2026-03-02T12:56:53.228767516Z" level=info msg="ImageCreate event name:\"sha256:860a7f2cdb9123795f95a07e0cc91bc6b511927d1a4d1d588c303c9c59e0fa59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:53.232679 containerd[1885]: time="2026-03-02T12:56:53.232627964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:3a388b567fff5cc31c64399d4af0fd03d2f4d243ef26e6f6b77a49386dbadeca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:53.233064 containerd[1885]: time="2026-03-02T12:56:53.232922277Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.3\" with image id \"sha256:860a7f2cdb9123795f95a07e0cc91bc6b511927d1a4d1d588c303c9c59e0fa59\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:3a388b567fff5cc31c64399d4af0fd03d2f4d243ef26e6f6b77a49386dbadeca\", size \"7278585\" in 1.430405684s"
Mar 2 12:56:53.233064 containerd[1885]: time="2026-03-02T12:56:53.232951693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.3\" returns image reference \"sha256:860a7f2cdb9123795f95a07e0cc91bc6b511927d1a4d1d588c303c9c59e0fa59\""
Mar 2 12:56:53.241418 containerd[1885]: time="2026-03-02T12:56:53.241387451Z" level=info msg="CreateContainer within sandbox \"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 2 12:56:53.258915 containerd[1885]: time="2026-03-02T12:56:53.258317254Z" level=info msg="Container a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:56:53.277271 containerd[1885]: time="2026-03-02T12:56:53.277233684Z" level=info msg="CreateContainer within sandbox \"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de\""
Mar 2 12:56:53.278410 containerd[1885]: time="2026-03-02T12:56:53.278374613Z" level=info msg="StartContainer for \"a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de\""
Mar 2 12:56:53.279654 containerd[1885]: time="2026-03-02T12:56:53.279625802Z" level=info msg="connecting to shim a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de" address="unix:///run/containerd/s/f893e6103c9886d0ee76e9f05ab1e5cf65dc1bc764b437e7e774a913344b7488" protocol=ttrpc version=3
Mar 2 12:56:53.298294 systemd[1]: Started cri-containerd-a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de.scope - libcontainer container a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de.
Mar 2 12:56:53.304292 systemd-networkd[1497]: calie9505b6d80c: Gained IPv6LL
Mar 2 12:56:53.368363 containerd[1885]: time="2026-03-02T12:56:53.368320242Z" level=info msg="StartContainer for \"a59fbc8ae3fe3e658915d4e5cc6cddf25bc40c48975590ad1268602a347036de\" returns successfully"
Mar 2 12:56:53.370206 containerd[1885]: time="2026-03-02T12:56:53.369843935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\""
Mar 2 12:56:55.215094 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2005999341.mount: Deactivated successfully.
Mar 2 12:56:55.280512 containerd[1885]: time="2026-03-02T12:56:55.280456002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:55.283492 containerd[1885]: time="2026-03-02T12:56:55.283456521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.3: active requests=0, bytes read=16420592"
Mar 2 12:56:55.286987 containerd[1885]: time="2026-03-02T12:56:55.286940326Z" level=info msg="ImageCreate event name:\"sha256:d6c2d25ea514599ef2dbba86e46277491ee9c1e15519321c135bb514b2f46aeb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:55.291271 containerd[1885]: time="2026-03-02T12:56:55.291224226Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:359cb5c751e049ac0bb62c4f7e49b1ac81c59935c70715f5ff4c39a757bf9f38\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:56:55.291594 containerd[1885]: time="2026-03-02T12:56:55.291565980Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" with image id \"sha256:d6c2d25ea514599ef2dbba86e46277491ee9c1e15519321c135bb514b2f46aeb\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:359cb5c751e049ac0bb62c4f7e49b1ac81c59935c70715f5ff4c39a757bf9f38\", size \"16420422\" in 1.921685373s"
Mar 2 12:56:55.291655 containerd[1885]: time="2026-03-02T12:56:55.291597717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" returns image reference \"sha256:d6c2d25ea514599ef2dbba86e46277491ee9c1e15519321c135bb514b2f46aeb\""
Mar 2 12:56:55.302158 containerd[1885]: time="2026-03-02T12:56:55.302121527Z" level=info msg="CreateContainer within sandbox \"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 2 12:56:55.326160 containerd[1885]: time="2026-03-02T12:56:55.325299984Z" level=info msg="Container 7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:56:55.343715 containerd[1885]: time="2026-03-02T12:56:55.343660158Z" level=info msg="CreateContainer within sandbox \"89cb329ed36a1063f5ee21cf9c712718e4d4321278eed89a3b29a9b0a7e9d5b9\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349\""
Mar 2 12:56:55.344425 containerd[1885]: time="2026-03-02T12:56:55.344401403Z" level=info msg="StartContainer for \"7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349\""
Mar 2 12:56:55.345822 containerd[1885]: time="2026-03-02T12:56:55.345794140Z" level=info msg="connecting to shim 7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349" address="unix:///run/containerd/s/f893e6103c9886d0ee76e9f05ab1e5cf65dc1bc764b437e7e774a913344b7488" protocol=ttrpc version=3
Mar 2 12:56:55.365316 systemd[1]: Started cri-containerd-7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349.scope - libcontainer container 7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349.
Mar 2 12:56:55.402095 containerd[1885]: time="2026-03-02T12:56:55.401982596Z" level=info msg="StartContainer for \"7c41f977a9a7476a97fda21173723fae2be7db6b8070fcecddee32c3622da349\" returns successfully"
Mar 2 12:56:56.186193 kubelet[3314]: I0302 12:56:56.185284 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6974bdbc44-tn2mh" podStartSLOduration=1.691454791 podStartE2EDuration="5.185269822s" podCreationTimestamp="2026-03-02 12:56:51 +0000 UTC" firstStartedPulling="2026-03-02 12:56:51.800550503 +0000 UTC m=+38.848227685" lastFinishedPulling="2026-03-02 12:56:55.294365534 +0000 UTC m=+42.342042716" observedRunningTime="2026-03-02 12:56:56.18499767 +0000 UTC m=+43.232674884" watchObservedRunningTime="2026-03-02 12:56:56.185269822 +0000 UTC m=+43.232947004"
Mar 2 12:56:56.573315 kubelet[3314]: I0302 12:56:56.572805 3314 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 2 12:57:00.037054 containerd[1885]: time="2026-03-02T12:57:00.036994698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86mlk,Uid:4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f,Namespace:calico-system,Attempt:0,}"
Mar 2 12:57:00.132304 systemd-networkd[1497]: caliceced6f510f: Link UP
Mar 2 12:57:00.134779 systemd-networkd[1497]: caliceced6f510f: Gained carrier
Mar 2 12:57:00.154007 containerd[1885]: 2026-03-02 12:57:00.072 [INFO][4904] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0 csi-node-driver- calico-system 4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f 738 0 2026-03-02 12:56:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:7494d65b57 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 csi-node-driver-86mlk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliceced6f510f [] [] }} ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-"
Mar 2 12:57:00.154007 containerd[1885]: 2026-03-02 12:57:00.073 [INFO][4904] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154007 containerd[1885]: 2026-03-02 12:57:00.093 [INFO][4915] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" HandleID="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Workload="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.098 [INFO][4915] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" HandleID="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Workload="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"csi-node-driver-86mlk", "timestamp":"2026-03-02 12:57:00.093172632 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003a7080)}
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.098 [INFO][4915] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.098 [INFO][4915] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.098 [INFO][4915] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851'
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.100 [INFO][4915] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.103 [INFO][4915] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.107 [INFO][4915] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.109 [INFO][4915] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154568 containerd[1885]: 2026-03-02 12:57:00.111 [INFO][4915] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.111 [INFO][4915] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.112 [INFO][4915] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.120 [INFO][4915] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.125 [INFO][4915] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.70.2/26] block=192.168.70.0/26 handle="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.125 [INFO][4915] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.2/26] handle="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" host="ci-4459.2.101-5c781fe851"
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.125 [INFO][4915] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 2 12:57:00.154715 containerd[1885]: 2026-03-02 12:57:00.125 [INFO][4915] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.2/26] IPv6=[] ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" HandleID="k8s-pod-network.62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Workload="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154856 containerd[1885]: 2026-03-02 12:57:00.127 [INFO][4904] cni-plugin/k8s.go 418: Populated endpoint ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7494d65b57", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"csi-node-driver-86mlk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliceced6f510f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:57:00.154892 containerd[1885]: 2026-03-02 12:57:00.127 [INFO][4904] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.2/32] ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154892 containerd[1885]: 2026-03-02 12:57:00.127 [INFO][4904] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliceced6f510f ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154892 containerd[1885]: 2026-03-02 12:57:00.132 [INFO][4904] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.154936 containerd[1885]: 2026-03-02 12:57:00.134 [INFO][4904] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7494d65b57", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4", Pod:"csi-node-driver-86mlk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliceced6f510f", MAC:"22:66:4c:b4:cd:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:57:00.154969 containerd[1885]: 2026-03-02 12:57:00.150 [INFO][4904] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" Namespace="calico-system" Pod="csi-node-driver-86mlk" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-csi--node--driver--86mlk-eth0"
Mar 2 12:57:00.199367 containerd[1885]: time="2026-03-02T12:57:00.199322343Z" level=info msg="connecting to shim 62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4" address="unix:///run/containerd/s/31071ca2ebe432581c4635dd486fe3da2b03a551dd7ae6b1795ce6455e413153" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:57:00.226348 systemd[1]: Started cri-containerd-62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4.scope - libcontainer container 62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4.
Mar 2 12:57:00.265021 containerd[1885]: time="2026-03-02T12:57:00.264160546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86mlk,Uid:4ed1d8a5-5e5e-4b7e-b72f-84fd1fe8207f,Namespace:calico-system,Attempt:0,} returns sandbox id \"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4\"" Mar 2 12:57:00.267476 containerd[1885]: time="2026-03-02T12:57:00.267435762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.3\"" Mar 2 12:57:01.037283 containerd[1885]: time="2026-03-02T12:57:01.036913097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-jz2bw,Uid:de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf,Namespace:calico-system,Attempt:0,}" Mar 2 12:57:01.135321 systemd-networkd[1497]: cali3f31db5f27f: Link UP Mar 2 12:57:01.136630 systemd-networkd[1497]: cali3f31db5f27f: Gained carrier Mar 2 12:57:01.154784 containerd[1885]: 2026-03-02 12:57:01.074 [INFO][4992] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0 calico-apiserver-5cddb65cc8- calico-system de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf 871 0 2026-03-02 12:56:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cddb65cc8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 calico-apiserver-5cddb65cc8-jz2bw eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali3f31db5f27f [] [] }} ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-" Mar 2 12:57:01.154784 containerd[1885]: 2026-03-02 12:57:01.074 [INFO][4992] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.154784 containerd[1885]: 2026-03-02 12:57:01.092 [INFO][5003] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" HandleID="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.098 [INFO][5003] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" HandleID="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"calico-apiserver-5cddb65cc8-jz2bw", "timestamp":"2026-03-02 12:57:01.092599697 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030ef20)} Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.098 [INFO][5003] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.098 [INFO][5003] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.098 [INFO][5003] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.100 [INFO][5003] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.104 [INFO][5003] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.107 [INFO][5003] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.111 [INFO][5003] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155379 containerd[1885]: 2026-03-02 12:57:01.112 [INFO][5003] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.112 [INFO][5003] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.114 [INFO][5003] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162 Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.121 [INFO][5003] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.125 [INFO][5003] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.70.3/26] block=192.168.70.0/26 handle="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.126 [INFO][5003] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.3/26] handle="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.126 [INFO][5003] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:57:01.155708 containerd[1885]: 2026-03-02 12:57:01.126 [INFO][5003] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.3/26] IPv6=[] ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" HandleID="k8s-pod-network.91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.156023 containerd[1885]: 2026-03-02 12:57:01.130 [INFO][4992] cni-plugin/k8s.go 418: Populated endpoint ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0", GenerateName:"calico-apiserver-5cddb65cc8-", Namespace:"calico-system", SelfLink:"", UID:"de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5cddb65cc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"calico-apiserver-5cddb65cc8-jz2bw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3f31db5f27f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:01.156081 containerd[1885]: 2026-03-02 12:57:01.130 [INFO][4992] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.3/32] ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.156081 containerd[1885]: 2026-03-02 12:57:01.130 [INFO][4992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f31db5f27f ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.156081 containerd[1885]: 2026-03-02 12:57:01.137 [INFO][4992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" 
WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.156294 containerd[1885]: 2026-03-02 12:57:01.137 [INFO][4992] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0", GenerateName:"calico-apiserver-5cddb65cc8-", Namespace:"calico-system", SelfLink:"", UID:"de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cddb65cc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162", Pod:"calico-apiserver-5cddb65cc8-jz2bw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3f31db5f27f", MAC:"f6:9e:51:e8:fb:ce", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:01.156606 containerd[1885]: 2026-03-02 12:57:01.152 [INFO][4992] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-jz2bw" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--jz2bw-eth0" Mar 2 12:57:01.225792 containerd[1885]: time="2026-03-02T12:57:01.225453556Z" level=info msg="connecting to shim 91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162" address="unix:///run/containerd/s/18d370651d4ca577e33f6f49ee4d526b91357f5b874cbc568fec883b9e6d690a" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:57:01.251296 systemd[1]: Started cri-containerd-91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162.scope - libcontainer container 91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162. 
Mar 2 12:57:01.306222 containerd[1885]: time="2026-03-02T12:57:01.306097371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-jz2bw,Uid:de3e0983-3c6e-42d1-bdf9-b39c3e0a5faf,Namespace:calico-system,Attempt:0,} returns sandbox id \"91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162\"" Mar 2 12:57:01.546087 containerd[1885]: time="2026-03-02T12:57:01.545205312Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:01.548137 containerd[1885]: time="2026-03-02T12:57:01.548103741Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.3: active requests=0, bytes read=8255947" Mar 2 12:57:01.552723 containerd[1885]: time="2026-03-02T12:57:01.552684922Z" level=info msg="ImageCreate event name:\"sha256:a7b37b6d011a8219915c610022e2c5ef47396285db6e7e10d7694ff3dea87dc5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:01.557402 containerd[1885]: time="2026-03-02T12:57:01.557286176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:3d04cd6265f850f0420b413351275ebfd244991b1b9e69c64efe8b4eff45b53f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:01.558163 containerd[1885]: time="2026-03-02T12:57:01.558128049Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.3\" with image id \"sha256:a7b37b6d011a8219915c610022e2c5ef47396285db6e7e10d7694ff3dea87dc5\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:3d04cd6265f850f0420b413351275ebfd244991b1b9e69c64efe8b4eff45b53f\", size \"9653472\" in 1.290659207s" Mar 2 12:57:01.558272 containerd[1885]: time="2026-03-02T12:57:01.558258965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.3\" returns image reference \"sha256:a7b37b6d011a8219915c610022e2c5ef47396285db6e7e10d7694ff3dea87dc5\"" Mar 2 12:57:01.560144 containerd[1885]: 
time="2026-03-02T12:57:01.559136126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\"" Mar 2 12:57:01.567408 containerd[1885]: time="2026-03-02T12:57:01.567380183Z" level=info msg="CreateContainer within sandbox \"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 2 12:57:01.591831 containerd[1885]: time="2026-03-02T12:57:01.591737861Z" level=info msg="Container 3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:01.612515 containerd[1885]: time="2026-03-02T12:57:01.612386127Z" level=info msg="CreateContainer within sandbox \"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258\"" Mar 2 12:57:01.615241 containerd[1885]: time="2026-03-02T12:57:01.613500456Z" level=info msg="StartContainer for \"3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258\"" Mar 2 12:57:01.616053 containerd[1885]: time="2026-03-02T12:57:01.616016689Z" level=info msg="connecting to shim 3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258" address="unix:///run/containerd/s/31071ca2ebe432581c4635dd486fe3da2b03a551dd7ae6b1795ce6455e413153" protocol=ttrpc version=3 Mar 2 12:57:01.624328 systemd-networkd[1497]: caliceced6f510f: Gained IPv6LL Mar 2 12:57:01.644331 systemd[1]: Started cri-containerd-3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258.scope - libcontainer container 3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258. 
Mar 2 12:57:01.703836 containerd[1885]: time="2026-03-02T12:57:01.703791497Z" level=info msg="StartContainer for \"3eb99ea5975c68e63b762ae546e24fa66ee017b28c4e9e8e15a7aeff455ce258\" returns successfully" Mar 2 12:57:02.036450 containerd[1885]: time="2026-03-02T12:57:02.036403364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b74df9d96-k4hhf,Uid:82b77641-f21a-43bf-bab0-caae6ec56911,Namespace:calico-system,Attempt:0,}" Mar 2 12:57:02.168086 systemd-networkd[1497]: cali595344c2ca3: Link UP Mar 2 12:57:02.170410 systemd-networkd[1497]: cali595344c2ca3: Gained carrier Mar 2 12:57:02.186168 containerd[1885]: 2026-03-02 12:57:02.101 [INFO][5123] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0 calico-kube-controllers-6b74df9d96- calico-system 82b77641-f21a-43bf-bab0-caae6ec56911 873 0 2026-03-02 12:56:31 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b74df9d96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 calico-kube-controllers-6b74df9d96-k4hhf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali595344c2ca3 [] [] }} ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-" Mar 2 12:57:02.186168 containerd[1885]: 2026-03-02 12:57:02.101 [INFO][5123] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" 
WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186168 containerd[1885]: 2026-03-02 12:57:02.124 [INFO][5135] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" HandleID="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Workload="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.134 [INFO][5135] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" HandleID="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Workload="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"calico-kube-controllers-6b74df9d96-k4hhf", "timestamp":"2026-03-02 12:57:02.124278967 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000293ce0)} Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.134 [INFO][5135] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.134 [INFO][5135] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.134 [INFO][5135] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.136 [INFO][5135] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.140 [INFO][5135] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.144 [INFO][5135] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.145 [INFO][5135] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186657 containerd[1885]: 2026-03-02 12:57:02.147 [INFO][5135] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.147 [INFO][5135] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.148 [INFO][5135] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19 Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.153 [INFO][5135] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.161 [INFO][5135] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.70.4/26] block=192.168.70.0/26 handle="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.161 [INFO][5135] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.4/26] handle="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.161 [INFO][5135] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:57:02.186806 containerd[1885]: 2026-03-02 12:57:02.161 [INFO][5135] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.4/26] IPv6=[] ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" HandleID="k8s-pod-network.9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Workload="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186902 containerd[1885]: 2026-03-02 12:57:02.164 [INFO][5123] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0", GenerateName:"calico-kube-controllers-6b74df9d96-", Namespace:"calico-system", SelfLink:"", UID:"82b77641-f21a-43bf-bab0-caae6ec56911", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"6b74df9d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"calico-kube-controllers-6b74df9d96-k4hhf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali595344c2ca3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:02.186936 containerd[1885]: 2026-03-02 12:57:02.164 [INFO][5123] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.4/32] ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186936 containerd[1885]: 2026-03-02 12:57:02.164 [INFO][5123] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali595344c2ca3 ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186936 containerd[1885]: 2026-03-02 12:57:02.170 [INFO][5123] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" 
Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.186974 containerd[1885]: 2026-03-02 12:57:02.171 [INFO][5123] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0", GenerateName:"calico-kube-controllers-6b74df9d96-", Namespace:"calico-system", SelfLink:"", UID:"82b77641-f21a-43bf-bab0-caae6ec56911", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b74df9d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19", Pod:"calico-kube-controllers-6b74df9d96-k4hhf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali595344c2ca3", MAC:"be:13:c8:27:b2:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:02.187008 containerd[1885]: 2026-03-02 12:57:02.183 [INFO][5123] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" Namespace="calico-system" Pod="calico-kube-controllers-6b74df9d96-k4hhf" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--kube--controllers--6b74df9d96--k4hhf-eth0" Mar 2 12:57:02.249173 containerd[1885]: time="2026-03-02T12:57:02.248699867Z" level=info msg="connecting to shim 9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19" address="unix:///run/containerd/s/e724da1aef881b205ac5b347bc031994e1229d7f70f12540fe8421fa4a056994" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:57:02.274346 systemd[1]: Started cri-containerd-9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19.scope - libcontainer container 9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19. 
Mar 2 12:57:02.337389 containerd[1885]: time="2026-03-02T12:57:02.337346580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b74df9d96-k4hhf,Uid:82b77641-f21a-43bf-bab0-caae6ec56911,Namespace:calico-system,Attempt:0,} returns sandbox id \"9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19\"" Mar 2 12:57:02.968293 systemd-networkd[1497]: cali3f31db5f27f: Gained IPv6LL Mar 2 12:57:03.928381 systemd-networkd[1497]: cali595344c2ca3: Gained IPv6LL Mar 2 12:57:04.037695 containerd[1885]: time="2026-03-02T12:57:04.037318099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-s9xt2,Uid:018390e1-07bc-43c8-afb7-0f774f706259,Namespace:calico-system,Attempt:0,}" Mar 2 12:57:04.037695 containerd[1885]: time="2026-03-02T12:57:04.037668486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-c59pl,Uid:5a505ea2-0da7-4eaf-9451-354939c31e56,Namespace:calico-system,Attempt:0,}" Mar 2 12:57:04.038222 containerd[1885]: time="2026-03-02T12:57:04.037864923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lqjzn,Uid:a9b47b71-026d-450a-8133-78f1e1d6309a,Namespace:kube-system,Attempt:0,}" Mar 2 12:57:04.038731 containerd[1885]: time="2026-03-02T12:57:04.038621777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfq4x,Uid:ad0dd550-658f-4383-892c-c32b6335909f,Namespace:kube-system,Attempt:0,}" Mar 2 12:57:04.326335 systemd-networkd[1497]: califa84e099197: Link UP Mar 2 12:57:04.328200 systemd-networkd[1497]: califa84e099197: Gained carrier Mar 2 12:57:04.353810 containerd[1885]: 2026-03-02 12:57:04.146 [INFO][5220] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0 goldmane-9566f57b5- calico-system 018390e1-07bc-43c8-afb7-0f774f706259 874 0 2026-03-02 12:56:30 +0000 UTC map[app.kubernetes.io/name:goldmane 
k8s-app:goldmane pod-template-hash:9566f57b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 goldmane-9566f57b5-s9xt2 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califa84e099197 [] [] }} ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-" Mar 2 12:57:04.353810 containerd[1885]: 2026-03-02 12:57:04.147 [INFO][5220] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.353810 containerd[1885]: 2026-03-02 12:57:04.219 [INFO][5269] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" HandleID="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Workload="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.249 [INFO][5269] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" HandleID="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Workload="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbaf0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"goldmane-9566f57b5-s9xt2", "timestamp":"2026-03-02 12:57:04.219011994 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002bf600)} Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.249 [INFO][5269] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.249 [INFO][5269] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.249 [INFO][5269] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.255 [INFO][5269] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.265 [INFO][5269] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.279 [INFO][5269] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.293 [INFO][5269] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354009 containerd[1885]: 2026-03-02 12:57:04.296 [INFO][5269] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.297 [INFO][5269] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.299 [INFO][5269] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9 Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.304 [INFO][5269] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.315 [INFO][5269] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.70.5/26] block=192.168.70.0/26 handle="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.316 [INFO][5269] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.5/26] handle="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.316 [INFO][5269] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 2 12:57:04.354144 containerd[1885]: 2026-03-02 12:57:04.316 [INFO][5269] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.5/26] IPv6=[] ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" HandleID="k8s-pod-network.ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Workload="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.354861 containerd[1885]: 2026-03-02 12:57:04.320 [INFO][5220] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0", GenerateName:"goldmane-9566f57b5-", Namespace:"calico-system", SelfLink:"", UID:"018390e1-07bc-43c8-afb7-0f774f706259", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9566f57b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"goldmane-9566f57b5-s9xt2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"califa84e099197", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.354861 containerd[1885]: 2026-03-02 12:57:04.320 [INFO][5220] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.5/32] ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.354923 containerd[1885]: 2026-03-02 12:57:04.320 [INFO][5220] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califa84e099197 ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.354923 containerd[1885]: 2026-03-02 12:57:04.327 [INFO][5220] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.354953 containerd[1885]: 2026-03-02 12:57:04.329 [INFO][5220] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0", GenerateName:"goldmane-9566f57b5-", Namespace:"calico-system", SelfLink:"", UID:"018390e1-07bc-43c8-afb7-0f774f706259", 
ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9566f57b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9", Pod:"goldmane-9566f57b5-s9xt2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califa84e099197", MAC:"32:0f:50:5a:d4:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.354987 containerd[1885]: 2026-03-02 12:57:04.346 [INFO][5220] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" Namespace="calico-system" Pod="goldmane-9566f57b5-s9xt2" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-goldmane--9566f57b5--s9xt2-eth0" Mar 2 12:57:04.417655 systemd-networkd[1497]: cali6f1e3e7e4d4: Link UP Mar 2 12:57:04.418104 systemd-networkd[1497]: cali6f1e3e7e4d4: Gained carrier Mar 2 12:57:04.429210 containerd[1885]: time="2026-03-02T12:57:04.429136007Z" level=info msg="connecting to shim ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9" address="unix:///run/containerd/s/f3cccddda9a77b08e202261373e08486ee01e2fc26376460c1384555f0308b15" namespace=k8s.io protocol=ttrpc version=3 
Mar 2 12:57:04.450715 containerd[1885]: 2026-03-02 12:57:04.158 [INFO][5231] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0 calico-apiserver-5cddb65cc8- calico-system 5a505ea2-0da7-4eaf-9451-354939c31e56 875 0 2026-03-02 12:56:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cddb65cc8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 calico-apiserver-5cddb65cc8-c59pl eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali6f1e3e7e4d4 [] [] }} ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-" Mar 2 12:57:04.450715 containerd[1885]: 2026-03-02 12:57:04.158 [INFO][5231] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.450715 containerd[1885]: 2026-03-02 12:57:04.271 [INFO][5276] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" HandleID="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.294 [INFO][5276] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" HandleID="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400017ba60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.101-5c781fe851", "pod":"calico-apiserver-5cddb65cc8-c59pl", "timestamp":"2026-03-02 12:57:04.271820151 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400010c580)} Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.294 [INFO][5276] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.316 [INFO][5276] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.316 [INFO][5276] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.354 [INFO][5276] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.366 [INFO][5276] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.373 [INFO][5276] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.377 [INFO][5276] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451075 containerd[1885]: 2026-03-02 12:57:04.383 [INFO][5276] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.383 [INFO][5276] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.386 [INFO][5276] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771 Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.394 [INFO][5276] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5276] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.70.6/26] block=192.168.70.0/26 handle="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5276] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.6/26] handle="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5276] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:57:04.451291 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5276] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.6/26] IPv6=[] ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" HandleID="k8s-pod-network.eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Workload="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.451390 containerd[1885]: 2026-03-02 12:57:04.413 [INFO][5231] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0", GenerateName:"calico-apiserver-5cddb65cc8-", Namespace:"calico-system", SelfLink:"", UID:"5a505ea2-0da7-4eaf-9451-354939c31e56", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"5cddb65cc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"calico-apiserver-5cddb65cc8-c59pl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6f1e3e7e4d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.451429 containerd[1885]: 2026-03-02 12:57:04.413 [INFO][5231] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.6/32] ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.451429 containerd[1885]: 2026-03-02 12:57:04.413 [INFO][5231] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f1e3e7e4d4 ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.451429 containerd[1885]: 2026-03-02 12:57:04.419 [INFO][5231] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" 
WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.451473 containerd[1885]: 2026-03-02 12:57:04.421 [INFO][5231] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0", GenerateName:"calico-apiserver-5cddb65cc8-", Namespace:"calico-system", SelfLink:"", UID:"5a505ea2-0da7-4eaf-9451-354939c31e56", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cddb65cc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771", Pod:"calico-apiserver-5cddb65cc8-c59pl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6f1e3e7e4d4", MAC:"8e:84:33:63:26:f2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.451506 containerd[1885]: 2026-03-02 12:57:04.443 [INFO][5231] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" Namespace="calico-system" Pod="calico-apiserver-5cddb65cc8-c59pl" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-calico--apiserver--5cddb65cc8--c59pl-eth0" Mar 2 12:57:04.472564 systemd[1]: Started cri-containerd-ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9.scope - libcontainer container ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9. Mar 2 12:57:04.535413 systemd-networkd[1497]: cali0716bebb5a7: Link UP Mar 2 12:57:04.536249 systemd-networkd[1497]: cali0716bebb5a7: Gained carrier Mar 2 12:57:04.560020 containerd[1885]: time="2026-03-02T12:57:04.559895503Z" level=info msg="connecting to shim eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771" address="unix:///run/containerd/s/ba0b38e5ae04dff811aabf40825a4de538f15ebc8fea9224fa839a0726519470" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:57:04.577941 containerd[1885]: 2026-03-02 12:57:04.155 [INFO][5251] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0 coredns-674b8bbfcf- kube-system ad0dd550-658f-4383-892c-c32b6335909f 876 0 2026-03-02 12:56:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 coredns-674b8bbfcf-vfq4x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0716bebb5a7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" 
Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-" Mar 2 12:57:04.577941 containerd[1885]: 2026-03-02 12:57:04.155 [INFO][5251] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.577941 containerd[1885]: 2026-03-02 12:57:04.272 [INFO][5273] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" HandleID="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.294 [INFO][5273] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" HandleID="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb450), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.101-5c781fe851", "pod":"coredns-674b8bbfcf-vfq4x", "timestamp":"2026-03-02 12:57:04.272187522 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000acf20)} Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.294 [INFO][5273] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5273] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.405 [INFO][5273] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.454 [INFO][5273] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.477 [INFO][5273] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.483 [INFO][5273] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.492 [INFO][5273] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578088 containerd[1885]: 2026-03-02 12:57:04.495 [INFO][5273] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.495 [INFO][5273] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.498 [INFO][5273] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.507 [INFO][5273] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" 
host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.518 [INFO][5273] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.70.7/26] block=192.168.70.0/26 handle="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.518 [INFO][5273] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.7/26] handle="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.519 [INFO][5273] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:57:04.578276 containerd[1885]: 2026-03-02 12:57:04.519 [INFO][5273] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.7/26] IPv6=[] ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" HandleID="k8s-pod-network.c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.528 [INFO][5251] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ad0dd550-658f-4383-892c-c32b6335909f", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"coredns-674b8bbfcf-vfq4x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0716bebb5a7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.528 [INFO][5251] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.7/32] ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.529 [INFO][5251] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0716bebb5a7 ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" 
WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.537 [INFO][5251] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.540 [INFO][5251] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ad0dd550-658f-4383-892c-c32b6335909f", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b", Pod:"coredns-674b8bbfcf-vfq4x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0716bebb5a7", MAC:"9a:29:ad:a3:3d:6a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.578375 containerd[1885]: 2026-03-02 12:57:04.569 [INFO][5251] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfq4x" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--vfq4x-eth0" Mar 2 12:57:04.581039 containerd[1885]: time="2026-03-02T12:57:04.580931813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-s9xt2,Uid:018390e1-07bc-43c8-afb7-0f774f706259,Namespace:calico-system,Attempt:0,} returns sandbox id \"ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9\"" Mar 2 12:57:04.617406 systemd[1]: Started cri-containerd-eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771.scope - libcontainer container eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771. 
Mar 2 12:57:04.641241 containerd[1885]: time="2026-03-02T12:57:04.641195532Z" level=info msg="connecting to shim c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b" address="unix:///run/containerd/s/1e2ae454deaac3f9403074568f510cfea7263773cd902ac8aa7bbd1df8504e48" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:57:04.664343 systemd-networkd[1497]: cali55e53f69744: Link UP Mar 2 12:57:04.664519 systemd-networkd[1497]: cali55e53f69744: Gained carrier Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.147 [INFO][5240] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0 coredns-674b8bbfcf- kube-system a9b47b71-026d-450a-8133-78f1e1d6309a 877 0 2026-03-02 12:56:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.101-5c781fe851 coredns-674b8bbfcf-lqjzn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali55e53f69744 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.147 [INFO][5240] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.276 [INFO][5272] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" 
HandleID="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.295 [INFO][5272] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" HandleID="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f9d10), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.101-5c781fe851", "pod":"coredns-674b8bbfcf-lqjzn", "timestamp":"2026-03-02 12:57:04.276685661 +0000 UTC"}, Hostname:"ci-4459.2.101-5c781fe851", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004c7b80)} Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.295 [INFO][5272] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.519 [INFO][5272] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.519 [INFO][5272] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.101-5c781fe851' Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.561 [INFO][5272] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.584 [INFO][5272] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.598 [INFO][5272] ipam/ipam.go 526: Trying affinity for 192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.602 [INFO][5272] ipam/ipam.go 160: Attempting to load block cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.608 [INFO][5272] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.610 [INFO][5272] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.620 [INFO][5272] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.628 [INFO][5272] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.641 [INFO][5272] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.70.8/26] block=192.168.70.0/26 handle="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.641 [INFO][5272] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.70.8/26] handle="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" host="ci-4459.2.101-5c781fe851" Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.641 [INFO][5272] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:57:04.697043 containerd[1885]: 2026-03-02 12:57:04.641 [INFO][5272] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.70.8/26] IPv6=[] ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" HandleID="k8s-pod-network.b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Workload="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.648 [INFO][5240] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a9b47b71-026d-450a-8133-78f1e1d6309a", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"", Pod:"coredns-674b8bbfcf-lqjzn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali55e53f69744", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.648 [INFO][5240] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.8/32] ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.648 [INFO][5240] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali55e53f69744 ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.666 [INFO][5240] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.667 [INFO][5240] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a9b47b71-026d-450a-8133-78f1e1d6309a", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.101-5c781fe851", ContainerID:"b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d", Pod:"coredns-674b8bbfcf-lqjzn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali55e53f69744", MAC:"fa:9e:a2:4c:4d:76", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:57:04.699223 containerd[1885]: 2026-03-02 12:57:04.691 [INFO][5240] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-lqjzn" WorkloadEndpoint="ci--4459.2.101--5c781fe851-k8s-coredns--674b8bbfcf--lqjzn-eth0" Mar 2 12:57:04.698350 systemd[1]: Started cri-containerd-c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b.scope - libcontainer container c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b. 
Mar 2 12:57:04.769401 containerd[1885]: time="2026-03-02T12:57:04.769351745Z" level=info msg="connecting to shim b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d" address="unix:///run/containerd/s/40193ab3923c0d422c3c4c35c0567550bf39cf5b98477d540dc0eaa5a43526c8" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:57:04.779624 containerd[1885]: time="2026-03-02T12:57:04.779558363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfq4x,Uid:ad0dd550-658f-4383-892c-c32b6335909f,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b\"" Mar 2 12:57:04.798108 containerd[1885]: time="2026-03-02T12:57:04.797920467Z" level=info msg="CreateContainer within sandbox \"c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 2 12:57:04.830339 systemd[1]: Started cri-containerd-b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d.scope - libcontainer container b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d. 
Mar 2 12:57:04.838896 containerd[1885]: time="2026-03-02T12:57:04.838015189Z" level=info msg="Container 3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:04.841827 containerd[1885]: time="2026-03-02T12:57:04.841517163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cddb65cc8-c59pl,Uid:5a505ea2-0da7-4eaf-9451-354939c31e56,Namespace:calico-system,Attempt:0,} returns sandbox id \"eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771\"" Mar 2 12:57:04.863466 containerd[1885]: time="2026-03-02T12:57:04.863429419Z" level=info msg="CreateContainer within sandbox \"c7abd3fab1a74cfe5fb98f8bff4d5e004723dfd3b3898e4dee1c0997631c136b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24\"" Mar 2 12:57:04.864892 containerd[1885]: time="2026-03-02T12:57:04.864753474Z" level=info msg="StartContainer for \"3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24\"" Mar 2 12:57:04.866116 containerd[1885]: time="2026-03-02T12:57:04.866090785Z" level=info msg="connecting to shim 3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24" address="unix:///run/containerd/s/1e2ae454deaac3f9403074568f510cfea7263773cd902ac8aa7bbd1df8504e48" protocol=ttrpc version=3 Mar 2 12:57:04.896157 systemd[1]: Started cri-containerd-3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24.scope - libcontainer container 3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24. 
Mar 2 12:57:04.918370 containerd[1885]: time="2026-03-02T12:57:04.918087886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lqjzn,Uid:a9b47b71-026d-450a-8133-78f1e1d6309a,Namespace:kube-system,Attempt:0,} returns sandbox id \"b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d\"" Mar 2 12:57:04.931829 containerd[1885]: time="2026-03-02T12:57:04.931793086Z" level=info msg="CreateContainer within sandbox \"b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 2 12:57:04.947665 containerd[1885]: time="2026-03-02T12:57:04.947537866Z" level=info msg="StartContainer for \"3116fed653a2a6c2803d120e81dbc111c52c2563913f5f1af90c79e40e1e2d24\" returns successfully" Mar 2 12:57:04.948399 containerd[1885]: time="2026-03-02T12:57:04.948215182Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:04.951523 containerd[1885]: time="2026-03-02T12:57:04.951474701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.3: active requests=0, bytes read=45512258" Mar 2 12:57:04.959992 containerd[1885]: time="2026-03-02T12:57:04.959937564Z" level=info msg="ImageCreate event name:\"sha256:6c1d6f109ccbdc040de9bade4e1d6f18ad2b7e93a2479f2ff827985a6b5c9653\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:04.969688 containerd[1885]: time="2026-03-02T12:57:04.969653663Z" level=info msg="Container 316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:04.973584 containerd[1885]: time="2026-03-02T12:57:04.973539721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:04.974488 containerd[1885]: 
time="2026-03-02T12:57:04.974124882Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" with image id \"sha256:6c1d6f109ccbdc040de9bade4e1d6f18ad2b7e93a2479f2ff827985a6b5c9653\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\", size \"46909799\" in 3.414956299s" Mar 2 12:57:04.974488 containerd[1885]: time="2026-03-02T12:57:04.974168803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" returns image reference \"sha256:6c1d6f109ccbdc040de9bade4e1d6f18ad2b7e93a2479f2ff827985a6b5c9653\"" Mar 2 12:57:04.976641 containerd[1885]: time="2026-03-02T12:57:04.976457598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\"" Mar 2 12:57:04.984383 containerd[1885]: time="2026-03-02T12:57:04.984348540Z" level=info msg="CreateContainer within sandbox \"91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 2 12:57:05.000058 containerd[1885]: time="2026-03-02T12:57:05.000007349Z" level=info msg="CreateContainer within sandbox \"b306fe6da9b74b46193c70170108f0be892b4c3cb6532409675419a46b871b2d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232\"" Mar 2 12:57:05.000911 containerd[1885]: time="2026-03-02T12:57:05.000872031Z" level=info msg="StartContainer for \"316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232\"" Mar 2 12:57:05.012436 containerd[1885]: time="2026-03-02T12:57:05.012387135Z" level=info msg="connecting to shim 316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232" address="unix:///run/containerd/s/40193ab3923c0d422c3c4c35c0567550bf39cf5b98477d540dc0eaa5a43526c8" protocol=ttrpc version=3 Mar 2 12:57:05.024183 containerd[1885]: time="2026-03-02T12:57:05.024112645Z" level=info 
msg="Container c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:05.040220 systemd[1]: Started cri-containerd-316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232.scope - libcontainer container 316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232. Mar 2 12:57:05.050062 containerd[1885]: time="2026-03-02T12:57:05.050005865Z" level=info msg="CreateContainer within sandbox \"91d9d87380e2450a99cf41b24fb992cf00454038347a6b43066576d1ba0d2162\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693\"" Mar 2 12:57:05.053610 containerd[1885]: time="2026-03-02T12:57:05.053577769Z" level=info msg="StartContainer for \"c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693\"" Mar 2 12:57:05.060456 containerd[1885]: time="2026-03-02T12:57:05.060371087Z" level=info msg="connecting to shim c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693" address="unix:///run/containerd/s/18d370651d4ca577e33f6f49ee4d526b91357f5b874cbc568fec883b9e6d690a" protocol=ttrpc version=3 Mar 2 12:57:05.098375 systemd[1]: Started cri-containerd-c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693.scope - libcontainer container c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693. 
Mar 2 12:57:05.109144 containerd[1885]: time="2026-03-02T12:57:05.108911304Z" level=info msg="StartContainer for \"316ec5e4f4bf1e5e3959929d59474af67b567e336a1c779910175bdcce9c9232\" returns successfully" Mar 2 12:57:05.177562 containerd[1885]: time="2026-03-02T12:57:05.177525291Z" level=info msg="StartContainer for \"c1931555db72ba5496e01a8a88ccd054399cab1a563de11d00c014f652aba693\" returns successfully" Mar 2 12:57:05.251502 kubelet[3314]: I0302 12:57:05.250908 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-lqjzn" podStartSLOduration=47.250894328 podStartE2EDuration="47.250894328s" podCreationTimestamp="2026-03-02 12:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:57:05.237622149 +0000 UTC m=+52.285299419" watchObservedRunningTime="2026-03-02 12:57:05.250894328 +0000 UTC m=+52.298571510" Mar 2 12:57:05.286177 kubelet[3314]: I0302 12:57:05.286074 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5cddb65cc8-jz2bw" podStartSLOduration=32.618740017 podStartE2EDuration="36.286056331s" podCreationTimestamp="2026-03-02 12:56:29 +0000 UTC" firstStartedPulling="2026-03-02 12:57:01.308765801 +0000 UTC m=+48.356442983" lastFinishedPulling="2026-03-02 12:57:04.976082003 +0000 UTC m=+52.023759297" observedRunningTime="2026-03-02 12:57:05.269365307 +0000 UTC m=+52.317042489" watchObservedRunningTime="2026-03-02 12:57:05.286056331 +0000 UTC m=+52.333733513" Mar 2 12:57:05.720403 systemd-networkd[1497]: cali0716bebb5a7: Gained IPv6LL Mar 2 12:57:05.912320 systemd-networkd[1497]: califa84e099197: Gained IPv6LL Mar 2 12:57:06.168344 systemd-networkd[1497]: cali55e53f69744: Gained IPv6LL Mar 2 12:57:06.169103 systemd-networkd[1497]: cali6f1e3e7e4d4: Gained IPv6LL Mar 2 12:57:06.219116 kubelet[3314]: I0302 12:57:06.219066 3314 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness" Mar 2 12:57:06.556268 containerd[1885]: time="2026-03-02T12:57:06.555995013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:06.559537 containerd[1885]: time="2026-03-02T12:57:06.559499771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3: active requests=0, bytes read=13755078" Mar 2 12:57:06.562545 containerd[1885]: time="2026-03-02T12:57:06.562517595Z" level=info msg="ImageCreate event name:\"sha256:c55251c1db32bbbf386d6ef9309a13d39443eef28f12c0883c2fd06bc5561b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:06.566750 containerd[1885]: time="2026-03-02T12:57:06.566719798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:2bdced3111efc84af5b77534155b084a55a3f839010807e7e83e75faefc8cf33\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:06.567600 containerd[1885]: time="2026-03-02T12:57:06.567571847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" with image id \"sha256:c55251c1db32bbbf386d6ef9309a13d39443eef28f12c0883c2fd06bc5561b09\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:2bdced3111efc84af5b77534155b084a55a3f839010807e7e83e75faefc8cf33\", size \"15152555\" in 1.591088176s" Mar 2 12:57:06.567639 containerd[1885]: time="2026-03-02T12:57:06.567604624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" returns image reference \"sha256:c55251c1db32bbbf386d6ef9309a13d39443eef28f12c0883c2fd06bc5561b09\"" Mar 2 12:57:06.568567 containerd[1885]: time="2026-03-02T12:57:06.568540283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\"" Mar 2 12:57:06.575989 containerd[1885]: 
time="2026-03-02T12:57:06.575958276Z" level=info msg="CreateContainer within sandbox \"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 2 12:57:06.598703 containerd[1885]: time="2026-03-02T12:57:06.598294768Z" level=info msg="Container a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:06.620761 containerd[1885]: time="2026-03-02T12:57:06.620717358Z" level=info msg="CreateContainer within sandbox \"62f63692646b8a5487628c44db22eb15cc35801e455c6841a9d21e3b8ac25cf4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95\"" Mar 2 12:57:06.621370 containerd[1885]: time="2026-03-02T12:57:06.621351441Z" level=info msg="StartContainer for \"a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95\"" Mar 2 12:57:06.622896 containerd[1885]: time="2026-03-02T12:57:06.622859501Z" level=info msg="connecting to shim a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95" address="unix:///run/containerd/s/31071ca2ebe432581c4635dd486fe3da2b03a551dd7ae6b1795ce6455e413153" protocol=ttrpc version=3 Mar 2 12:57:06.642374 systemd[1]: Started cri-containerd-a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95.scope - libcontainer container a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95. 
Mar 2 12:57:06.693097 containerd[1885]: time="2026-03-02T12:57:06.693057838Z" level=info msg="StartContainer for \"a4f647e9d8f172121ad9935e742bacab965e1877dd4549dcef7935aa24eb6e95\" returns successfully" Mar 2 12:57:07.103027 kubelet[3314]: I0302 12:57:07.102817 3314 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 2 12:57:07.103027 kubelet[3314]: I0302 12:57:07.102853 3314 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 2 12:57:07.238017 kubelet[3314]: I0302 12:57:07.237950 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-86mlk" podStartSLOduration=29.935880976 podStartE2EDuration="36.237926181s" podCreationTimestamp="2026-03-02 12:56:31 +0000 UTC" firstStartedPulling="2026-03-02 12:57:00.266375099 +0000 UTC m=+47.314052281" lastFinishedPulling="2026-03-02 12:57:06.568420304 +0000 UTC m=+53.616097486" observedRunningTime="2026-03-02 12:57:07.237888476 +0000 UTC m=+54.285565658" watchObservedRunningTime="2026-03-02 12:57:07.237926181 +0000 UTC m=+54.285603371" Mar 2 12:57:07.240652 kubelet[3314]: I0302 12:57:07.238054 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vfq4x" podStartSLOduration=49.238048704 podStartE2EDuration="49.238048704s" podCreationTimestamp="2026-03-02 12:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:57:05.288789978 +0000 UTC m=+52.336467160" watchObservedRunningTime="2026-03-02 12:57:07.238048704 +0000 UTC m=+54.285725886" Mar 2 12:57:09.577682 containerd[1885]: time="2026-03-02T12:57:09.577550036Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:09.580516 containerd[1885]: time="2026-03-02T12:57:09.580480769Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.3: active requests=0, bytes read=49157508" Mar 2 12:57:09.584173 containerd[1885]: time="2026-03-02T12:57:09.583355373Z" level=info msg="ImageCreate event name:\"sha256:f91182157dd9b43afadc3f9d6dbd919b0ec222fc40e9fa608989310b81c1f18c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:09.588184 containerd[1885]: time="2026-03-02T12:57:09.588136865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:081fd6c3de7754ba9892532b2c7c6cae9ba7bd1cca4c42e4590ee8d0f5a5696b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:09.589111 containerd[1885]: time="2026-03-02T12:57:09.589086709Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" with image id \"sha256:f91182157dd9b43afadc3f9d6dbd919b0ec222fc40e9fa608989310b81c1f18c\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:081fd6c3de7754ba9892532b2c7c6cae9ba7bd1cca4c42e4590ee8d0f5a5696b\", size \"50555001\" in 3.020516392s" Mar 2 12:57:09.589218 containerd[1885]: time="2026-03-02T12:57:09.589204768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" returns image reference \"sha256:f91182157dd9b43afadc3f9d6dbd919b0ec222fc40e9fa608989310b81c1f18c\"" Mar 2 12:57:09.590047 containerd[1885]: time="2026-03-02T12:57:09.590005383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.3\"" Mar 2 12:57:09.615390 containerd[1885]: time="2026-03-02T12:57:09.615356291Z" level=info msg="CreateContainer within sandbox \"9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 2 12:57:09.642763 containerd[1885]: 
time="2026-03-02T12:57:09.642707386Z" level=info msg="Container 627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:09.660615 containerd[1885]: time="2026-03-02T12:57:09.660566291Z" level=info msg="CreateContainer within sandbox \"9956f95046aef0a8bb8d6e415734beaf9db1c6e7441f27b335c6b47ba1e63d19\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9\"" Mar 2 12:57:09.661499 containerd[1885]: time="2026-03-02T12:57:09.661427020Z" level=info msg="StartContainer for \"627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9\"" Mar 2 12:57:09.662587 containerd[1885]: time="2026-03-02T12:57:09.662566725Z" level=info msg="connecting to shim 627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9" address="unix:///run/containerd/s/e724da1aef881b205ac5b347bc031994e1229d7f70f12540fe8421fa4a056994" protocol=ttrpc version=3 Mar 2 12:57:09.686319 systemd[1]: Started cri-containerd-627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9.scope - libcontainer container 627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9. 
Mar 2 12:57:09.726811 containerd[1885]: time="2026-03-02T12:57:09.726777448Z" level=info msg="StartContainer for \"627de0d66819060d2a8a2674783e010f9339892a1f56f511bb9299e7c90b9ae9\" returns successfully" Mar 2 12:57:10.251207 kubelet[3314]: I0302 12:57:10.250992 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6b74df9d96-k4hhf" podStartSLOduration=31.999909999 podStartE2EDuration="39.250977244s" podCreationTimestamp="2026-03-02 12:56:31 +0000 UTC" firstStartedPulling="2026-03-02 12:57:02.338852592 +0000 UTC m=+49.386529774" lastFinishedPulling="2026-03-02 12:57:09.589919837 +0000 UTC m=+56.637597019" observedRunningTime="2026-03-02 12:57:10.250004103 +0000 UTC m=+57.297681301" watchObservedRunningTime="2026-03-02 12:57:10.250977244 +0000 UTC m=+57.298654442" Mar 2 12:57:14.619064 kubelet[3314]: I0302 12:57:14.619008 3314 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 2 12:57:15.478764 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount326487303.mount: Deactivated successfully. 
Mar 2 12:57:17.897215 containerd[1885]: time="2026-03-02T12:57:17.896242118Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:17.944214 containerd[1885]: time="2026-03-02T12:57:17.943901314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.3: active requests=0, bytes read=51600693" Mar 2 12:57:18.007779 containerd[1885]: time="2026-03-02T12:57:18.007694188Z" level=info msg="ImageCreate event name:\"sha256:d40b2a23702c4c62ef242fb10a0dae8b80d5b5a0fd36ecec29e43b227f22611d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:18.056038 containerd[1885]: time="2026-03-02T12:57:18.055950257Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:e85ffa1d9468908b0bd44664de0d023da6669faefb3e1013b3a15b63dfa1f9a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:18.056821 containerd[1885]: time="2026-03-02T12:57:18.056589139Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.3\" with image id \"sha256:d40b2a23702c4c62ef242fb10a0dae8b80d5b5a0fd36ecec29e43b227f22611d\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:e85ffa1d9468908b0bd44664de0d023da6669faefb3e1013b3a15b63dfa1f9a9\", size \"51600539\" in 8.466445912s" Mar 2 12:57:18.056821 containerd[1885]: time="2026-03-02T12:57:18.056616460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.3\" returns image reference \"sha256:d40b2a23702c4c62ef242fb10a0dae8b80d5b5a0fd36ecec29e43b227f22611d\"" Mar 2 12:57:18.063643 containerd[1885]: time="2026-03-02T12:57:18.063584303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\"" Mar 2 12:57:18.112486 containerd[1885]: time="2026-03-02T12:57:18.112445494Z" level=info msg="CreateContainer within sandbox \"ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 2 12:57:18.317836 containerd[1885]: time="2026-03-02T12:57:18.316385408Z" level=info msg="Container 2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:18.409140 containerd[1885]: time="2026-03-02T12:57:18.409094100Z" level=info msg="CreateContainer within sandbox \"ec9a1bfde3597e5d1f82db0acc5cbc7a64a7c53312f46e2ff6e59904009178e9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862\"" Mar 2 12:57:18.411660 containerd[1885]: time="2026-03-02T12:57:18.411642126Z" level=info msg="StartContainer for \"2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862\"" Mar 2 12:57:18.413561 containerd[1885]: time="2026-03-02T12:57:18.413490036Z" level=info msg="connecting to shim 2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862" address="unix:///run/containerd/s/f3cccddda9a77b08e202261373e08486ee01e2fc26376460c1384555f0308b15" protocol=ttrpc version=3 Mar 2 12:57:18.452307 systemd[1]: Started cri-containerd-2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862.scope - libcontainer container 2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862. 
Mar 2 12:57:18.491957 containerd[1885]: time="2026-03-02T12:57:18.491901975Z" level=info msg="StartContainer for \"2122939423b8a8799a3debb2adbfb7a9ececd326ae6e34beaed90dbcb5d83862\" returns successfully" Mar 2 12:57:18.947175 containerd[1885]: time="2026-03-02T12:57:18.946878006Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:57:19.004028 containerd[1885]: time="2026-03-02T12:57:19.003983373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.3: active requests=0, bytes read=77" Mar 2 12:57:19.005478 containerd[1885]: time="2026-03-02T12:57:19.005447712Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" with image id \"sha256:6c1d6f109ccbdc040de9bade4e1d6f18ad2b7e93a2479f2ff827985a6b5c9653\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\", size \"46909799\" in 941.832431ms" Mar 2 12:57:19.005592 containerd[1885]: time="2026-03-02T12:57:19.005577619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" returns image reference \"sha256:6c1d6f109ccbdc040de9bade4e1d6f18ad2b7e93a2479f2ff827985a6b5c9653\"" Mar 2 12:57:19.022212 containerd[1885]: time="2026-03-02T12:57:19.022172806Z" level=info msg="CreateContainer within sandbox \"eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 2 12:57:19.152633 containerd[1885]: time="2026-03-02T12:57:19.152508809Z" level=info msg="Container 8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:57:19.350025 containerd[1885]: time="2026-03-02T12:57:19.349895197Z" level=info msg="CreateContainer within sandbox \"eda05bbe36729dd6f90814b6b64fb5d8960155e868f0ce6b58947db7bea43771\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1\"" Mar 2 12:57:19.352193 containerd[1885]: time="2026-03-02T12:57:19.351406561Z" level=info msg="StartContainer for \"8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1\"" Mar 2 12:57:19.352521 containerd[1885]: time="2026-03-02T12:57:19.352502241Z" level=info msg="connecting to shim 8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1" address="unix:///run/containerd/s/ba0b38e5ae04dff811aabf40825a4de538f15ebc8fea9224fa839a0726519470" protocol=ttrpc version=3 Mar 2 12:57:19.373324 systemd[1]: Started cri-containerd-8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1.scope - libcontainer container 8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1. Mar 2 12:57:19.405889 containerd[1885]: time="2026-03-02T12:57:19.405853722Z" level=info msg="StartContainer for \"8c03df91844432e9c54249cb8d4ba3235600d0afc1aaf8d57614f75eed91c0f1\" returns successfully" Mar 2 12:57:20.273941 kubelet[3314]: I0302 12:57:20.273880 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5cddb65cc8-c59pl" podStartSLOduration=37.108922503 podStartE2EDuration="51.273858888s" podCreationTimestamp="2026-03-02 12:56:29 +0000 UTC" firstStartedPulling="2026-03-02 12:57:04.843242958 +0000 UTC m=+51.890920140" lastFinishedPulling="2026-03-02 12:57:19.008179343 +0000 UTC m=+66.055856525" observedRunningTime="2026-03-02 12:57:20.270691235 +0000 UTC m=+67.318368417" watchObservedRunningTime="2026-03-02 12:57:20.273858888 +0000 UTC m=+67.321536070" Mar 2 12:57:20.276418 kubelet[3314]: I0302 12:57:20.275452 3314 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-9566f57b5-s9xt2" podStartSLOduration=36.797084762 podStartE2EDuration="50.275440814s" podCreationTimestamp="2026-03-02 12:56:30 +0000 UTC" 
firstStartedPulling="2026-03-02 12:57:04.58512372 +0000 UTC m=+51.632800910" lastFinishedPulling="2026-03-02 12:57:18.06347978 +0000 UTC m=+65.111156962" observedRunningTime="2026-03-02 12:57:19.270561839 +0000 UTC m=+66.318239029" watchObservedRunningTime="2026-03-02 12:57:20.275440814 +0000 UTC m=+67.323117996" Mar 2 12:58:18.256474 update_engine[1863]: I20260302 12:58:18.256412 1863 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 2 12:58:18.256474 update_engine[1863]: I20260302 12:58:18.256466 1863 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 2 12:58:18.258311 update_engine[1863]: I20260302 12:58:18.256681 1863 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 2 12:58:18.258311 update_engine[1863]: I20260302 12:58:18.257448 1863 omaha_request_params.cc:62] Current group set to stable Mar 2 12:58:18.258381 update_engine[1863]: I20260302 12:58:18.258350 1863 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 2 12:58:18.258381 update_engine[1863]: I20260302 12:58:18.258374 1863 update_attempter.cc:643] Scheduling an action processor start. 
Mar 2 12:58:18.258419 update_engine[1863]: I20260302 12:58:18.258392 1863 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 2 12:58:18.262274 update_engine[1863]: I20260302 12:58:18.262243 1863 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 2 12:58:18.262645 update_engine[1863]: I20260302 12:58:18.262436 1863 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 2 12:58:18.262899 update_engine[1863]: I20260302 12:58:18.262698 1863 omaha_request_action.cc:272] Request: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: Mar 2 12:58:18.262899 update_engine[1863]: I20260302 12:58:18.262717 1863 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 2 12:58:18.264978 update_engine[1863]: I20260302 12:58:18.264889 1863 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 2 12:58:18.266382 update_engine[1863]: I20260302 12:58:18.266353 1863 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 2 12:58:18.269116 locksmithd[1941]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 2 12:58:18.302638 update_engine[1863]: E20260302 12:58:18.302492 1863 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 2 12:58:18.302638 update_engine[1863]: I20260302 12:58:18.302599 1863 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 2 12:58:28.210240 update_engine[1863]: I20260302 12:58:28.209775 1863 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 2 12:58:28.210240 update_engine[1863]: I20260302 12:58:28.209879 1863 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 2 12:58:28.210626 update_engine[1863]: I20260302 12:58:28.210423 1863 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 2 12:58:28.220959 update_engine[1863]: E20260302 12:58:28.220911 1863 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 2 12:58:28.221066 update_engine[1863]: I20260302 12:58:28.221009 1863 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 2 12:58:38.210864 update_engine[1863]: I20260302 12:58:38.210783 1863 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 2 12:58:38.211394 update_engine[1863]: I20260302 12:58:38.210887 1863 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 2 12:58:38.211394 update_engine[1863]: I20260302 12:58:38.211354 1863 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 2 12:58:38.258840 update_engine[1863]: E20260302 12:58:38.258776 1863 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 2 12:58:38.258995 update_engine[1863]: I20260302 12:58:38.258867 1863 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 2 12:58:48.211158 update_engine[1863]: I20260302 12:58:48.211067 1863 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 2 12:58:48.211507 update_engine[1863]: I20260302 12:58:48.211184 1863 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 2 12:58:48.211684 update_engine[1863]: I20260302 12:58:48.211643 1863 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 2 12:58:48.487429 update_engine[1863]: E20260302 12:58:48.487285 1863 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 2 12:58:48.487429 update_engine[1863]: I20260302 12:58:48.487368 1863 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 2 12:58:48.487429 update_engine[1863]: I20260302 12:58:48.487375 1863 omaha_request_action.cc:617] Omaha request response: Mar 2 12:58:48.487626 update_engine[1863]: E20260302 12:58:48.487465 1863 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487483 1863 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487485 1863 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487489 1863 update_attempter.cc:306] Processing Done. Mar 2 12:58:48.487626 update_engine[1863]: E20260302 12:58:48.487501 1863 update_attempter.cc:619] Update failed. 
Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487505 1863 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487509 1863 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487512 1863 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487578 1863 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487596 1863 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487599 1863 omaha_request_action.cc:272] Request: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: Mar 2 12:58:48.487626 update_engine[1863]: I20260302 12:58:48.487604 1863 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 2 12:58:48.487860 update_engine[1863]: I20260302 12:58:48.487619 1863 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 2 12:58:48.488292 update_engine[1863]: I20260302 12:58:48.488013 1863 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 2 12:58:48.488483 locksmithd[1941]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 2 12:58:48.525871 update_engine[1863]: E20260302 12:58:48.525808 1863 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525894 1863 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525902 1863 omaha_request_action.cc:617] Omaha request response: Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525906 1863 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525909 1863 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525913 1863 update_attempter.cc:306] Processing Done. Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525918 1863 update_attempter.cc:310] Error event sent. Mar 2 12:58:48.525998 update_engine[1863]: I20260302 12:58:48.525925 1863 update_check_scheduler.cc:74] Next update check in 40m1s Mar 2 12:58:48.526303 locksmithd[1941]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 2 12:58:55.113404 systemd[1]: Started sshd@7-10.200.20.16:22-10.200.16.10:43846.service - OpenSSH per-connection server daemon (10.200.16.10:43846). Mar 2 12:58:55.542977 sshd[6303]: Accepted publickey for core from 10.200.16.10 port 43846 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:58:55.545073 sshd-session[6303]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:58:55.548959 systemd-logind[1862]: New session 10 of user core. 
Mar 2 12:58:55.553285 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 2 12:58:55.868180 sshd[6306]: Connection closed by 10.200.16.10 port 43846 Mar 2 12:58:55.867478 sshd-session[6303]: pam_unix(sshd:session): session closed for user core Mar 2 12:58:55.870856 systemd[1]: sshd@7-10.200.20.16:22-10.200.16.10:43846.service: Deactivated successfully. Mar 2 12:58:55.872918 systemd[1]: session-10.scope: Deactivated successfully. Mar 2 12:58:55.873622 systemd-logind[1862]: Session 10 logged out. Waiting for processes to exit. Mar 2 12:58:55.874765 systemd-logind[1862]: Removed session 10. Mar 2 12:59:00.957288 systemd[1]: Started sshd@8-10.200.20.16:22-10.200.16.10:56802.service - OpenSSH per-connection server daemon (10.200.16.10:56802). Mar 2 12:59:01.379141 sshd[6364]: Accepted publickey for core from 10.200.16.10 port 56802 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:01.380285 sshd-session[6364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:01.383876 systemd-logind[1862]: New session 11 of user core. Mar 2 12:59:01.391324 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 2 12:59:01.658978 sshd[6368]: Connection closed by 10.200.16.10 port 56802 Mar 2 12:59:01.659545 sshd-session[6364]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:01.662811 systemd-logind[1862]: Session 11 logged out. Waiting for processes to exit. Mar 2 12:59:01.662942 systemd[1]: sshd@8-10.200.20.16:22-10.200.16.10:56802.service: Deactivated successfully. Mar 2 12:59:01.664438 systemd[1]: session-11.scope: Deactivated successfully. Mar 2 12:59:01.666135 systemd-logind[1862]: Removed session 11. Mar 2 12:59:06.750798 systemd[1]: Started sshd@9-10.200.20.16:22-10.200.16.10:56816.service - OpenSSH per-connection server daemon (10.200.16.10:56816). 
Mar 2 12:59:07.171022 sshd[6381]: Accepted publickey for core from 10.200.16.10 port 56816 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:07.172366 sshd-session[6381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:07.176059 systemd-logind[1862]: New session 12 of user core. Mar 2 12:59:07.180393 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 2 12:59:07.449005 sshd[6384]: Connection closed by 10.200.16.10 port 56816 Mar 2 12:59:07.448830 sshd-session[6381]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:07.453409 systemd[1]: sshd@9-10.200.20.16:22-10.200.16.10:56816.service: Deactivated successfully. Mar 2 12:59:07.455697 systemd[1]: session-12.scope: Deactivated successfully. Mar 2 12:59:07.458198 systemd-logind[1862]: Session 12 logged out. Waiting for processes to exit. Mar 2 12:59:07.459620 systemd-logind[1862]: Removed session 12. Mar 2 12:59:12.537836 systemd[1]: Started sshd@10-10.200.20.16:22-10.200.16.10:47956.service - OpenSSH per-connection server daemon (10.200.16.10:47956). Mar 2 12:59:12.973633 sshd[6419]: Accepted publickey for core from 10.200.16.10 port 47956 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:12.974898 sshd-session[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:12.978749 systemd-logind[1862]: New session 13 of user core. Mar 2 12:59:12.983296 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 2 12:59:13.255764 sshd[6422]: Connection closed by 10.200.16.10 port 47956 Mar 2 12:59:13.256298 sshd-session[6419]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:13.260058 systemd[1]: sshd@10-10.200.20.16:22-10.200.16.10:47956.service: Deactivated successfully. Mar 2 12:59:13.261968 systemd[1]: session-13.scope: Deactivated successfully. Mar 2 12:59:13.262752 systemd-logind[1862]: Session 13 logged out. 
Waiting for processes to exit. Mar 2 12:59:13.264882 systemd-logind[1862]: Removed session 13. Mar 2 12:59:13.342521 systemd[1]: Started sshd@11-10.200.20.16:22-10.200.16.10:47970.service - OpenSSH per-connection server daemon (10.200.16.10:47970). Mar 2 12:59:13.762826 sshd[6437]: Accepted publickey for core from 10.200.16.10 port 47970 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:13.763968 sshd-session[6437]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:13.767504 systemd-logind[1862]: New session 14 of user core. Mar 2 12:59:13.775294 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 2 12:59:14.076438 sshd[6447]: Connection closed by 10.200.16.10 port 47970 Mar 2 12:59:14.076987 sshd-session[6437]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:14.080570 systemd[1]: sshd@11-10.200.20.16:22-10.200.16.10:47970.service: Deactivated successfully. Mar 2 12:59:14.082309 systemd[1]: session-14.scope: Deactivated successfully. Mar 2 12:59:14.083016 systemd-logind[1862]: Session 14 logged out. Waiting for processes to exit. Mar 2 12:59:14.084300 systemd-logind[1862]: Removed session 14. Mar 2 12:59:14.164274 systemd[1]: Started sshd@12-10.200.20.16:22-10.200.16.10:47974.service - OpenSSH per-connection server daemon (10.200.16.10:47974). Mar 2 12:59:14.597027 sshd[6469]: Accepted publickey for core from 10.200.16.10 port 47974 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:14.598173 sshd-session[6469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:14.601847 systemd-logind[1862]: New session 15 of user core. Mar 2 12:59:14.606300 systemd[1]: Started session-15.scope - Session 15 of User core. 
Mar 2 12:59:14.873772 sshd[6472]: Connection closed by 10.200.16.10 port 47974 Mar 2 12:59:14.874438 sshd-session[6469]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:14.877657 systemd[1]: sshd@12-10.200.20.16:22-10.200.16.10:47974.service: Deactivated successfully. Mar 2 12:59:14.879889 systemd[1]: session-15.scope: Deactivated successfully. Mar 2 12:59:14.881690 systemd-logind[1862]: Session 15 logged out. Waiting for processes to exit. Mar 2 12:59:14.882772 systemd-logind[1862]: Removed session 15. Mar 2 12:59:19.969322 systemd[1]: Started sshd@13-10.200.20.16:22-10.200.16.10:39298.service - OpenSSH per-connection server daemon (10.200.16.10:39298). Mar 2 12:59:20.396456 sshd[6487]: Accepted publickey for core from 10.200.16.10 port 39298 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:20.397600 sshd-session[6487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:20.401297 systemd-logind[1862]: New session 16 of user core. Mar 2 12:59:20.404288 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 2 12:59:20.674358 sshd[6490]: Connection closed by 10.200.16.10 port 39298 Mar 2 12:59:20.675223 sshd-session[6487]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:20.678210 systemd[1]: sshd@13-10.200.20.16:22-10.200.16.10:39298.service: Deactivated successfully. Mar 2 12:59:20.680245 systemd[1]: session-16.scope: Deactivated successfully. Mar 2 12:59:20.682231 systemd-logind[1862]: Session 16 logged out. Waiting for processes to exit. Mar 2 12:59:20.683526 systemd-logind[1862]: Removed session 16. Mar 2 12:59:20.763362 systemd[1]: Started sshd@14-10.200.20.16:22-10.200.16.10:39314.service - OpenSSH per-connection server daemon (10.200.16.10:39314). 
Mar 2 12:59:21.180187 sshd[6502]: Accepted publickey for core from 10.200.16.10 port 39314 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:21.181189 sshd-session[6502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:21.184691 systemd-logind[1862]: New session 17 of user core. Mar 2 12:59:21.192384 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 2 12:59:21.601137 sshd[6505]: Connection closed by 10.200.16.10 port 39314 Mar 2 12:59:21.624022 sshd-session[6502]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:21.627630 systemd[1]: sshd@14-10.200.20.16:22-10.200.16.10:39314.service: Deactivated successfully. Mar 2 12:59:21.629501 systemd[1]: session-17.scope: Deactivated successfully. Mar 2 12:59:21.630467 systemd-logind[1862]: Session 17 logged out. Waiting for processes to exit. Mar 2 12:59:21.632625 systemd-logind[1862]: Removed session 17. Mar 2 12:59:21.689932 systemd[1]: Started sshd@15-10.200.20.16:22-10.200.16.10:39316.service - OpenSSH per-connection server daemon (10.200.16.10:39316). Mar 2 12:59:22.113019 sshd[6538]: Accepted publickey for core from 10.200.16.10 port 39316 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM Mar 2 12:59:22.114283 sshd-session[6538]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:59:22.117873 systemd-logind[1862]: New session 18 of user core. Mar 2 12:59:22.127484 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 2 12:59:22.889603 sshd[6541]: Connection closed by 10.200.16.10 port 39316 Mar 2 12:59:22.890376 sshd-session[6538]: pam_unix(sshd:session): session closed for user core Mar 2 12:59:22.893563 systemd[1]: sshd@15-10.200.20.16:22-10.200.16.10:39316.service: Deactivated successfully. Mar 2 12:59:22.895646 systemd[1]: session-18.scope: Deactivated successfully. Mar 2 12:59:22.896952 systemd-logind[1862]: Session 18 logged out. 
Waiting for processes to exit.
Mar 2 12:59:22.899822 systemd-logind[1862]: Removed session 18.
Mar 2 12:59:22.977417 systemd[1]: Started sshd@16-10.200.20.16:22-10.200.16.10:39318.service - OpenSSH per-connection server daemon (10.200.16.10:39318).
Mar 2 12:59:23.401889 sshd[6574]: Accepted publickey for core from 10.200.16.10 port 39318 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:23.403079 sshd-session[6574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:23.406851 systemd-logind[1862]: New session 19 of user core.
Mar 2 12:59:23.425564 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 2 12:59:23.767121 sshd[6577]: Connection closed by 10.200.16.10 port 39318
Mar 2 12:59:23.766691 sshd-session[6574]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:23.771132 systemd[1]: sshd@16-10.200.20.16:22-10.200.16.10:39318.service: Deactivated successfully.
Mar 2 12:59:23.774521 systemd[1]: session-19.scope: Deactivated successfully.
Mar 2 12:59:23.775314 systemd-logind[1862]: Session 19 logged out. Waiting for processes to exit.
Mar 2 12:59:23.776687 systemd-logind[1862]: Removed session 19.
Mar 2 12:59:23.855648 systemd[1]: Started sshd@17-10.200.20.16:22-10.200.16.10:39324.service - OpenSSH per-connection server daemon (10.200.16.10:39324).
Mar 2 12:59:24.285182 sshd[6586]: Accepted publickey for core from 10.200.16.10 port 39324 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:24.286081 sshd-session[6586]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:24.290387 systemd-logind[1862]: New session 20 of user core.
Mar 2 12:59:24.304315 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 2 12:59:24.562486 sshd[6589]: Connection closed by 10.200.16.10 port 39324
Mar 2 12:59:24.562773 sshd-session[6586]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:24.567180 systemd-logind[1862]: Session 20 logged out. Waiting for processes to exit.
Mar 2 12:59:24.567485 systemd[1]: sshd@17-10.200.20.16:22-10.200.16.10:39324.service: Deactivated successfully.
Mar 2 12:59:24.570066 systemd[1]: session-20.scope: Deactivated successfully.
Mar 2 12:59:24.572068 systemd-logind[1862]: Removed session 20.
Mar 2 12:59:29.654376 systemd[1]: Started sshd@18-10.200.20.16:22-10.200.16.10:39340.service - OpenSSH per-connection server daemon (10.200.16.10:39340).
Mar 2 12:59:30.074255 sshd[6625]: Accepted publickey for core from 10.200.16.10 port 39340 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:30.075883 sshd-session[6625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:30.080509 systemd-logind[1862]: New session 21 of user core.
Mar 2 12:59:30.086306 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 2 12:59:30.352688 sshd[6628]: Connection closed by 10.200.16.10 port 39340
Mar 2 12:59:30.353111 sshd-session[6625]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:30.356684 systemd[1]: sshd@18-10.200.20.16:22-10.200.16.10:39340.service: Deactivated successfully.
Mar 2 12:59:30.358414 systemd[1]: session-21.scope: Deactivated successfully.
Mar 2 12:59:30.359138 systemd-logind[1862]: Session 21 logged out. Waiting for processes to exit.
Mar 2 12:59:30.360813 systemd-logind[1862]: Removed session 21.
Mar 2 12:59:35.443328 systemd[1]: Started sshd@19-10.200.20.16:22-10.200.16.10:48870.service - OpenSSH per-connection server daemon (10.200.16.10:48870).
Mar 2 12:59:35.870183 sshd[6651]: Accepted publickey for core from 10.200.16.10 port 48870 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:35.870997 sshd-session[6651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:35.874811 systemd-logind[1862]: New session 22 of user core.
Mar 2 12:59:35.883285 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 2 12:59:36.151681 sshd[6654]: Connection closed by 10.200.16.10 port 48870
Mar 2 12:59:36.152215 sshd-session[6651]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:36.155622 systemd[1]: sshd@19-10.200.20.16:22-10.200.16.10:48870.service: Deactivated successfully.
Mar 2 12:59:36.157554 systemd[1]: session-22.scope: Deactivated successfully.
Mar 2 12:59:36.159040 systemd-logind[1862]: Session 22 logged out. Waiting for processes to exit.
Mar 2 12:59:36.160132 systemd-logind[1862]: Removed session 22.
Mar 2 12:59:41.240550 systemd[1]: Started sshd@20-10.200.20.16:22-10.200.16.10:53324.service - OpenSSH per-connection server daemon (10.200.16.10:53324).
Mar 2 12:59:41.660554 sshd[6712]: Accepted publickey for core from 10.200.16.10 port 53324 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:41.661769 sshd-session[6712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:41.665395 systemd-logind[1862]: New session 23 of user core.
Mar 2 12:59:41.674280 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 2 12:59:41.940427 sshd[6715]: Connection closed by 10.200.16.10 port 53324
Mar 2 12:59:41.940942 sshd-session[6712]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:41.944771 systemd[1]: sshd@20-10.200.20.16:22-10.200.16.10:53324.service: Deactivated successfully.
Mar 2 12:59:41.947309 systemd[1]: session-23.scope: Deactivated successfully.
Mar 2 12:59:41.948975 systemd-logind[1862]: Session 23 logged out. Waiting for processes to exit.
Mar 2 12:59:41.950941 systemd-logind[1862]: Removed session 23.
Mar 2 12:59:47.031993 systemd[1]: Started sshd@21-10.200.20.16:22-10.200.16.10:53336.service - OpenSSH per-connection server daemon (10.200.16.10:53336).
Mar 2 12:59:47.447921 sshd[6732]: Accepted publickey for core from 10.200.16.10 port 53336 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:47.448713 sshd-session[6732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:47.453141 systemd-logind[1862]: New session 24 of user core.
Mar 2 12:59:47.457302 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 2 12:59:47.741908 sshd[6735]: Connection closed by 10.200.16.10 port 53336
Mar 2 12:59:47.742555 sshd-session[6732]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:47.746237 systemd[1]: sshd@21-10.200.20.16:22-10.200.16.10:53336.service: Deactivated successfully.
Mar 2 12:59:47.749740 systemd[1]: session-24.scope: Deactivated successfully.
Mar 2 12:59:47.751236 systemd-logind[1862]: Session 24 logged out. Waiting for processes to exit.
Mar 2 12:59:47.752546 systemd-logind[1862]: Removed session 24.
Mar 2 12:59:52.838260 systemd[1]: Started sshd@22-10.200.20.16:22-10.200.16.10:47576.service - OpenSSH per-connection server daemon (10.200.16.10:47576).
Mar 2 12:59:53.258942 sshd[6771]: Accepted publickey for core from 10.200.16.10 port 47576 ssh2: RSA SHA256:7ukVy6tXsczvRkKnjXS5ykZo8M2KdxhCNukcDYzlKCM
Mar 2 12:59:53.260011 sshd-session[6771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:59:53.263908 systemd-logind[1862]: New session 25 of user core.
Mar 2 12:59:53.268294 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 2 12:59:53.535789 sshd[6774]: Connection closed by 10.200.16.10 port 47576
Mar 2 12:59:53.536348 sshd-session[6771]: pam_unix(sshd:session): session closed for user core
Mar 2 12:59:53.540440 systemd[1]: sshd@22-10.200.20.16:22-10.200.16.10:47576.service: Deactivated successfully.
Mar 2 12:59:53.543789 systemd[1]: session-25.scope: Deactivated successfully.
Mar 2 12:59:53.546058 systemd-logind[1862]: Session 25 logged out. Waiting for processes to exit.
Mar 2 12:59:53.548256 systemd-logind[1862]: Removed session 25.