Mar 6 02:58:01.035867 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Mar 6 02:58:01.035884 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 5 23:10:47 -00 2026
Mar 6 02:58:01.035890 kernel: KASLR enabled
Mar 6 02:58:01.035894 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Mar 6 02:58:01.035898 kernel: printk: legacy bootconsole [pl11] enabled
Mar 6 02:58:01.035903 kernel: efi: EFI v2.7 by EDK II
Mar 6 02:58:01.035908 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89d018 RNG=0x3f979998 MEMRESERVE=0x3db83598
Mar 6 02:58:01.035912 kernel: random: crng init done
Mar 6 02:58:01.035916 kernel: secureboot: Secure boot disabled
Mar 6 02:58:01.035919 kernel: ACPI: Early table checksum verification disabled
Mar 6 02:58:01.035923 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Mar 6 02:58:01.035927 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035931 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035935 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Mar 6 02:58:01.035941 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035945 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035949 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035954 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035958 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035963 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035967 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Mar 6 02:58:01.035971 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 6 02:58:01.035975 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Mar 6 02:58:01.035979 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 6 02:58:01.035983 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 6 02:58:01.035987 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Mar 6 02:58:01.035991 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Mar 6 02:58:01.035995 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 6 02:58:01.036000 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 6 02:58:01.036004 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 6 02:58:01.036009 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 6 02:58:01.036013 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 6 02:58:01.036017 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 6 02:58:01.036021 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 6 02:58:01.036025 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 6 02:58:01.036029 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 6 02:58:01.036033 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Mar 6 02:58:01.036038 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Mar 6 02:58:01.036042 kernel: Zone ranges:
Mar 6 02:58:01.036046 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Mar 6 02:58:01.036082 kernel: DMA32 empty
Mar 6 02:58:01.036088 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Mar 6 02:58:01.036093 kernel: Device empty
Mar 6 02:58:01.036098 kernel: Movable zone start for each node
Mar 6 02:58:01.036102 kernel: Early memory node ranges
Mar 6 02:58:01.036106 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Mar 6 02:58:01.036111 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Mar 6 02:58:01.036116 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Mar 6 02:58:01.036120 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Mar 6 02:58:01.036124 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Mar 6 02:58:01.036129 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Mar 6 02:58:01.036133 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Mar 6 02:58:01.036138 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Mar 6 02:58:01.036142 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Mar 6 02:58:01.036146 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Mar 6 02:58:01.036151 kernel: psci: probing for conduit method from ACPI.
Mar 6 02:58:01.036155 kernel: psci: PSCIv1.3 detected in firmware.
Mar 6 02:58:01.036159 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 6 02:58:01.036164 kernel: psci: MIGRATE_INFO_TYPE not supported.
Mar 6 02:58:01.036169 kernel: psci: SMC Calling Convention v1.4
Mar 6 02:58:01.036173 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 6 02:58:01.036177 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 6 02:58:01.036182 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 6 02:58:01.036186 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 6 02:58:01.036191 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 6 02:58:01.036195 kernel: Detected PIPT I-cache on CPU0
Mar 6 02:58:01.036200 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Mar 6 02:58:01.036204 kernel: CPU features: detected: GIC system register CPU interface
Mar 6 02:58:01.036208 kernel: CPU features: detected: Spectre-v4
Mar 6 02:58:01.036213 kernel: CPU features: detected: Spectre-BHB
Mar 6 02:58:01.036218 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 6 02:58:01.036222 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 6 02:58:01.036227 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Mar 6 02:58:01.036231 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 6 02:58:01.036235 kernel: alternatives: applying boot alternatives
Mar 6 02:58:01.036241 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=68c9ef230e3eed1360dd8114dada95b6a934f07952c3a5d42725f3006977f027
Mar 6 02:58:01.036246 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 6 02:58:01.036250 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 6 02:58:01.036254 kernel: Fallback order for Node 0: 0
Mar 6 02:58:01.036259 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Mar 6 02:58:01.036264 kernel: Policy zone: Normal
Mar 6 02:58:01.036268 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 6 02:58:01.036273 kernel: software IO TLB: area num 2.
Mar 6 02:58:01.036277 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Mar 6 02:58:01.036281 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 6 02:58:01.036286 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 6 02:58:01.036291 kernel: rcu: RCU event tracing is enabled.
Mar 6 02:58:01.036295 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 6 02:58:01.036300 kernel: Trampoline variant of Tasks RCU enabled.
Mar 6 02:58:01.036304 kernel: Tracing variant of Tasks RCU enabled.
Mar 6 02:58:01.036309 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 6 02:58:01.036313 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 6 02:58:01.036318 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:58:01.036323 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:58:01.036327 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 6 02:58:01.036332 kernel: GICv3: 960 SPIs implemented
Mar 6 02:58:01.036336 kernel: GICv3: 0 Extended SPIs implemented
Mar 6 02:58:01.036340 kernel: Root IRQ handler: gic_handle_irq
Mar 6 02:58:01.036345 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Mar 6 02:58:01.036349 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Mar 6 02:58:01.036353 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Mar 6 02:58:01.036358 kernel: ITS: No ITS available, not enabling LPIs
Mar 6 02:58:01.036362 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 6 02:58:01.036367 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Mar 6 02:58:01.036372 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 6 02:58:01.036376 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Mar 6 02:58:01.036381 kernel: Console: colour dummy device 80x25
Mar 6 02:58:01.036386 kernel: printk: legacy console [tty1] enabled
Mar 6 02:58:01.036390 kernel: ACPI: Core revision 20240827
Mar 6 02:58:01.036395 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Mar 6 02:58:01.036400 kernel: pid_max: default: 32768 minimum: 301
Mar 6 02:58:01.036404 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 6 02:58:01.036409 kernel: landlock: Up and running.
Mar 6 02:58:01.036414 kernel: SELinux: Initializing.
Mar 6 02:58:01.036418 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 02:58:01.036423 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 02:58:01.036428 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Mar 6 02:58:01.036432 kernel: Hyper-V: Host Build 10.0.26102.1212-1-0
Mar 6 02:58:01.036440 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 6 02:58:01.036445 kernel: rcu: Hierarchical SRCU implementation.
Mar 6 02:58:01.036450 kernel: rcu: Max phase no-delay instances is 400.
Mar 6 02:58:01.036455 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 6 02:58:01.036460 kernel: Remapping and enabling EFI services.
Mar 6 02:58:01.036464 kernel: smp: Bringing up secondary CPUs ...
Mar 6 02:58:01.036469 kernel: Detected PIPT I-cache on CPU1
Mar 6 02:58:01.036475 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Mar 6 02:58:01.036479 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Mar 6 02:58:01.036484 kernel: smp: Brought up 1 node, 2 CPUs
Mar 6 02:58:01.036489 kernel: SMP: Total of 2 processors activated.
Mar 6 02:58:01.036493 kernel: CPU: All CPU(s) started at EL1
Mar 6 02:58:01.036499 kernel: CPU features: detected: 32-bit EL0 Support
Mar 6 02:58:01.036504 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Mar 6 02:58:01.036509 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 6 02:58:01.036513 kernel: CPU features: detected: Common not Private translations
Mar 6 02:58:01.036518 kernel: CPU features: detected: CRC32 instructions
Mar 6 02:58:01.036523 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Mar 6 02:58:01.036528 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 6 02:58:01.036532 kernel: CPU features: detected: LSE atomic instructions
Mar 6 02:58:01.036537 kernel: CPU features: detected: Privileged Access Never
Mar 6 02:58:01.036543 kernel: CPU features: detected: Speculation barrier (SB)
Mar 6 02:58:01.036547 kernel: CPU features: detected: TLB range maintenance instructions
Mar 6 02:58:01.036552 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 6 02:58:01.036557 kernel: CPU features: detected: Scalable Vector Extension
Mar 6 02:58:01.036564 kernel: alternatives: applying system-wide alternatives
Mar 6 02:58:01.036569 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 6 02:58:01.036573 kernel: SVE: maximum available vector length 16 bytes per vector
Mar 6 02:58:01.036578 kernel: SVE: default vector length 16 bytes per vector
Mar 6 02:58:01.036583 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Mar 6 02:58:01.036589 kernel: devtmpfs: initialized
Mar 6 02:58:01.036594 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 6 02:58:01.036598 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 6 02:58:01.036603 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 6 02:58:01.036608 kernel: 0 pages in range for non-PLT usage
Mar 6 02:58:01.036613 kernel: 508400 pages in range for PLT usage
Mar 6 02:58:01.036617 kernel: pinctrl core: initialized pinctrl subsystem
Mar 6 02:58:01.036622 kernel: SMBIOS 3.1.0 present.
Mar 6 02:58:01.036628 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Mar 6 02:58:01.036632 kernel: DMI: Memory slots populated: 2/2
Mar 6 02:58:01.036637 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 6 02:58:01.036642 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 6 02:58:01.036647 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 6 02:58:01.036652 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 6 02:58:01.036656 kernel: audit: initializing netlink subsys (disabled)
Mar 6 02:58:01.036661 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Mar 6 02:58:01.036666 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 6 02:58:01.036671 kernel: cpuidle: using governor menu
Mar 6 02:58:01.036676 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 6 02:58:01.036680 kernel: ASID allocator initialised with 32768 entries
Mar 6 02:58:01.036685 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 6 02:58:01.036690 kernel: Serial: AMBA PL011 UART driver
Mar 6 02:58:01.036695 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 6 02:58:01.036699 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 6 02:58:01.036704 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 6 02:58:01.036709 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 6 02:58:01.036714 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 6 02:58:01.036719 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 6 02:58:01.036724 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 6 02:58:01.036728 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 6 02:58:01.036733 kernel: ACPI: Added _OSI(Module Device)
Mar 6 02:58:01.036738 kernel: ACPI: Added _OSI(Processor Device)
Mar 6 02:58:01.036743 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 6 02:58:01.036747 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 6 02:58:01.036752 kernel: ACPI: Interpreter enabled
Mar 6 02:58:01.036757 kernel: ACPI: Using GIC for interrupt routing
Mar 6 02:58:01.036762 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Mar 6 02:58:01.036767 kernel: printk: legacy console [ttyAMA0] enabled
Mar 6 02:58:01.036772 kernel: printk: legacy bootconsole [pl11] disabled
Mar 6 02:58:01.036776 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Mar 6 02:58:01.036781 kernel: ACPI: CPU0 has been hot-added
Mar 6 02:58:01.036786 kernel: ACPI: CPU1 has been hot-added
Mar 6 02:58:01.036791 kernel: iommu: Default domain type: Translated
Mar 6 02:58:01.036795 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 6 02:58:01.036801 kernel: efivars: Registered efivars operations
Mar 6 02:58:01.036805 kernel: vgaarb: loaded
Mar 6 02:58:01.036810 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 6 02:58:01.036815 kernel: VFS: Disk quotas dquot_6.6.0
Mar 6 02:58:01.036819 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 6 02:58:01.036824 kernel: pnp: PnP ACPI init
Mar 6 02:58:01.036829 kernel: pnp: PnP ACPI: found 0 devices
Mar 6 02:58:01.036834 kernel: NET: Registered PF_INET protocol family
Mar 6 02:58:01.036838 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 6 02:58:01.036843 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 6 02:58:01.036849 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 6 02:58:01.036854 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 6 02:58:01.036858 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 6 02:58:01.036863 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 6 02:58:01.036868 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 02:58:01.036873 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 02:58:01.036878 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 6 02:58:01.036882 kernel: PCI: CLS 0 bytes, default 64
Mar 6 02:58:01.036887 kernel: kvm [1]: HYP mode not available
Mar 6 02:58:01.036893 kernel: Initialise system trusted keyrings
Mar 6 02:58:01.036897 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 6 02:58:01.036902 kernel: Key type asymmetric registered
Mar 6 02:58:01.036907 kernel: Asymmetric key parser 'x509' registered
Mar 6 02:58:01.036911 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Mar 6 02:58:01.036916 kernel: io scheduler mq-deadline registered
Mar 6 02:58:01.036921 kernel: io scheduler kyber registered
Mar 6 02:58:01.036926 kernel: io scheduler bfq registered
Mar 6 02:58:01.036930 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 6 02:58:01.036936 kernel: thunder_xcv, ver 1.0
Mar 6 02:58:01.036941 kernel: thunder_bgx, ver 1.0
Mar 6 02:58:01.036945 kernel: nicpf, ver 1.0
Mar 6 02:58:01.036950 kernel: nicvf, ver 1.0
Mar 6 02:58:01.037060 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 6 02:58:01.037113 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-06T02:58:00 UTC (1772765880)
Mar 6 02:58:01.037120 kernel: efifb: probing for efifb
Mar 6 02:58:01.037126 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 6 02:58:01.037131 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 6 02:58:01.037136 kernel: efifb: scrolling: redraw
Mar 6 02:58:01.037141 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 6 02:58:01.037145 kernel: Console: switching to colour frame buffer device 128x48
Mar 6 02:58:01.037150 kernel: fb0: EFI VGA frame buffer device
Mar 6 02:58:01.037155 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
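As an editorial aside: the rtc-efi line above reports both a calendar time and its Unix epoch, 2026-03-06T02:58:00 UTC (1772765880). The two values can be cross-checked with a timezone-aware datetime; this is a verification sketch, not part of the boot tooling:

```python
from datetime import datetime, timezone

# The wall-clock time rtc-efi logs, interpreted as UTC.
rtc_time = datetime(2026, 3, 6, 2, 58, 0, tzinfo=timezone.utc)
epoch = int(rtc_time.timestamp())  # 1772765880, matching the value in parentheses
```

Using a `tzinfo`-aware datetime matters here: a naive `datetime` would be interpreted in the local timezone and generally produce a different epoch.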
Mar 6 02:58:01.037160 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 6 02:58:01.037165 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 6 02:58:01.037170 kernel: watchdog: NMI not fully supported
Mar 6 02:58:01.037175 kernel: watchdog: Hard watchdog permanently disabled
Mar 6 02:58:01.037180 kernel: NET: Registered PF_INET6 protocol family
Mar 6 02:58:01.037184 kernel: Segment Routing with IPv6
Mar 6 02:58:01.037189 kernel: In-situ OAM (IOAM) with IPv6
Mar 6 02:58:01.037194 kernel: NET: Registered PF_PACKET protocol family
Mar 6 02:58:01.037198 kernel: Key type dns_resolver registered
Mar 6 02:58:01.037203 kernel: registered taskstats version 1
Mar 6 02:58:01.037208 kernel: Loading compiled-in X.509 certificates
Mar 6 02:58:01.037213 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 3a2ba669b0bb3660035f2ce1faaa856d46d520ff'
Mar 6 02:58:01.037218 kernel: Demotion targets for Node 0: null
Mar 6 02:58:01.037223 kernel: Key type .fscrypt registered
Mar 6 02:58:01.037228 kernel: Key type fscrypt-provisioning registered
Mar 6 02:58:01.037232 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 6 02:58:01.037237 kernel: ima: Allocated hash algorithm: sha1
Mar 6 02:58:01.037242 kernel: ima: No architecture policies found
Mar 6 02:58:01.037247 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 6 02:58:01.037251 kernel: clk: Disabling unused clocks
Mar 6 02:58:01.037256 kernel: PM: genpd: Disabling unused power domains
Mar 6 02:58:01.037262 kernel: Warning: unable to open an initial console.
Mar 6 02:58:01.037267 kernel: Freeing unused kernel memory: 39552K
Mar 6 02:58:01.037271 kernel: Run /init as init process
Mar 6 02:58:01.037276 kernel: with arguments:
Mar 6 02:58:01.037281 kernel: /init
Mar 6 02:58:01.037285 kernel: with environment:
Mar 6 02:58:01.037290 kernel: HOME=/
Mar 6 02:58:01.037295 kernel: TERM=linux
Mar 6 02:58:01.037300 systemd[1]: Successfully made /usr/ read-only.
Mar 6 02:58:01.037308 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:58:01.037313 systemd[1]: Detected virtualization microsoft.
Mar 6 02:58:01.037318 systemd[1]: Detected architecture arm64.
Mar 6 02:58:01.037323 systemd[1]: Running in initrd.
Mar 6 02:58:01.037328 systemd[1]: No hostname configured, using default hostname.
Mar 6 02:58:01.037334 systemd[1]: Hostname set to .
Mar 6 02:58:01.037339 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:58:01.037345 systemd[1]: Queued start job for default target initrd.target.
Mar 6 02:58:01.037350 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:58:01.037355 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:58:01.037361 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 6 02:58:01.037366 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:58:01.037371 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 6 02:58:01.037377 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 6 02:58:01.037384 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 6 02:58:01.037389 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 6 02:58:01.037394 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:58:01.037399 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:58:01.037405 systemd[1]: Reached target paths.target - Path Units.
Mar 6 02:58:01.037410 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:58:01.037415 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:58:01.037420 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 02:58:01.037426 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 02:58:01.037431 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 02:58:01.037436 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 6 02:58:01.037442 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 6 02:58:01.037447 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:58:01.037452 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:58:01.037457 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:58:01.037462 systemd[1]: Reached target sockets.target - Socket Units.
Mar 6 02:58:01.037468 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 6 02:58:01.037474 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 02:58:01.037479 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 6 02:58:01.037484 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 6 02:58:01.037490 systemd[1]: Starting systemd-fsck-usr.service...
Mar 6 02:58:01.037495 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 02:58:01.037500 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 02:58:01.037515 systemd-journald[225]: Collecting audit messages is disabled.
Mar 6 02:58:01.037529 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:58:01.037535 systemd-journald[225]: Journal started
Mar 6 02:58:01.037550 systemd-journald[225]: Runtime Journal (/run/log/journal/e31b0e48aebc4739a86a8b31b56c7289) is 8M, max 78.3M, 70.3M free.
Mar 6 02:58:01.048018 systemd-modules-load[227]: Inserted module 'overlay'
Mar 6 02:58:01.055742 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 02:58:01.063022 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 6 02:58:01.083773 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 6 02:58:01.083788 kernel: Bridge firewalling registered
Mar 6 02:58:01.077331 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:58:01.081229 systemd-modules-load[227]: Inserted module 'br_netfilter'
Mar 6 02:58:01.091071 systemd[1]: Finished systemd-fsck-usr.service.
Mar 6 02:58:01.096948 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:58:01.104630 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:58:01.113941 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 6 02:58:01.134780 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 02:58:01.141605 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 6 02:58:01.161622 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 02:58:01.169651 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 02:58:01.182352 systemd-tmpfiles[256]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 6 02:58:01.190152 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:58:01.201388 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:58:01.210099 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:58:01.222158 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 6 02:58:01.248802 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 6 02:58:01.255161 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 02:58:01.279126 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=68c9ef230e3eed1360dd8114dada95b6a934f07952c3a5d42725f3006977f027
Mar 6 02:58:01.311236 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:58:01.316679 systemd-resolved[263]: Positive Trust Anchors:
Mar 6 02:58:01.316687 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 6 02:58:01.316707 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 6 02:58:01.318268 systemd-resolved[263]: Defaulting to hostname 'linux'.
Mar 6 02:58:01.319336 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 6 02:58:01.326087 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 6 02:58:01.387066 kernel: SCSI subsystem initialized
Mar 6 02:58:01.393062 kernel: Loading iSCSI transport class v2.0-870.
Mar 6 02:58:01.400061 kernel: iscsi: registered transport (tcp)
Mar 6 02:58:01.413011 kernel: iscsi: registered transport (qla4xxx)
Mar 6 02:58:01.413047 kernel: QLogic iSCSI HBA Driver
Mar 6 02:58:01.425464 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 02:58:01.442301 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:58:01.453639 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 02:58:01.493017 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 6 02:58:01.498638 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
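As an editorial aside: the kernel command line echoed by dracut-cmdline is a space-separated list of `key=value` parameters, in which a key may repeat (both `console=` entries stay active, with the last one becoming `/dev/console`) and bare flags such as `flatcar.autologin` carry no value. A minimal sketch of splitting such a line, using a hypothetical `parse_cmdline` helper that is not part of any boot tooling:

```python
def parse_cmdline(cmdline: str) -> dict:
    """Split a kernel command line into {key: [values]}; bare flags map to [None]."""
    params = {}
    for token in cmdline.split():
        # partition() rather than split("=") keeps values that contain '=' intact,
        # e.g. root=LABEL=ROOT -> key "root", value "LABEL=ROOT".
        key, sep, value = token.partition("=")
        params.setdefault(key, []).append(value if sep else None)
    return params

# A subset of the parameters logged above.
params = parse_cmdline(
    "root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 "
    "flatcar.first_boot=detected acpi=force flatcar.autologin"
)
# params["console"] == ["tty1", "ttyAMA0,115200n8"]
```

Collecting values into lists preserves repeated keys instead of silently keeping only the last occurrence.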
Mar 6 02:58:01.558069 kernel: raid6: neonx8 gen() 18524 MB/s
Mar 6 02:58:01.577063 kernel: raid6: neonx4 gen() 18569 MB/s
Mar 6 02:58:01.596059 kernel: raid6: neonx2 gen() 17107 MB/s
Mar 6 02:58:01.616060 kernel: raid6: neonx1 gen() 15008 MB/s
Mar 6 02:58:01.635078 kernel: raid6: int64x8 gen() 10530 MB/s
Mar 6 02:58:01.654060 kernel: raid6: int64x4 gen() 10615 MB/s
Mar 6 02:58:01.674142 kernel: raid6: int64x2 gen() 8973 MB/s
Mar 6 02:58:01.695535 kernel: raid6: int64x1 gen() 7018 MB/s
Mar 6 02:58:01.695544 kernel: raid6: using algorithm neonx4 gen() 18569 MB/s
Mar 6 02:58:01.717416 kernel: raid6: .... xor() 15144 MB/s, rmw enabled
Mar 6 02:58:01.717423 kernel: raid6: using neon recovery algorithm
Mar 6 02:58:01.723060 kernel: xor: measuring software checksum speed
Mar 6 02:58:01.729007 kernel: 8regs : 27084 MB/sec
Mar 6 02:58:01.729015 kernel: 32regs : 28793 MB/sec
Mar 6 02:58:01.731722 kernel: arm64_neon : 37566 MB/sec
Mar 6 02:58:01.735039 kernel: xor: using function: arm64_neon (37566 MB/sec)
Mar 6 02:58:01.774076 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 6 02:58:01.779144 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 02:58:01.788177 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:58:01.812707 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Mar 6 02:58:01.816511 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:58:01.829515 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 6 02:58:01.856936 dracut-pre-trigger[489]: rd.md=0: removing MD RAID activation
Mar 6 02:58:01.878441 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 6 02:58:01.883822 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 02:58:01.932640 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:58:01.938973 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 6 02:58:02.000074 kernel: hv_vmbus: Vmbus version:5.3 Mar 6 02:58:02.012859 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 6 02:58:02.034933 kernel: hv_vmbus: registering driver hid_hyperv Mar 6 02:58:02.034949 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 6 02:58:02.034956 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 6 02:58:02.034963 kernel: hv_vmbus: registering driver hv_storvsc Mar 6 02:58:02.034975 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Mar 6 02:58:02.034981 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 6 02:58:02.034989 kernel: hv_vmbus: registering driver hv_netvsc Mar 6 02:58:02.012971 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:58:02.055545 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 6 02:58:02.055661 kernel: PTP clock support registered Mar 6 02:58:02.055086 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:58:02.091155 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 Mar 6 02:58:02.091171 kernel: scsi host0: storvsc_host_t Mar 6 02:58:02.091301 kernel: scsi host1: storvsc_host_t Mar 6 02:58:02.091367 kernel: scsi 1:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 6 02:58:02.091383 kernel: scsi 1:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Mar 6 02:58:02.082247 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:58:02.092032 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 6 02:58:02.115213 kernel: hv_utils: Registering HyperV Utility Driver Mar 6 02:58:02.115247 kernel: hv_vmbus: registering driver hv_utils Mar 6 02:58:02.115255 kernel: sd 1:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 6 02:58:02.125965 kernel: hv_utils: Heartbeat IC version 3.0 Mar 6 02:58:02.125997 kernel: hv_netvsc 00224877-4608-0022-4877-460800224877 eth0: VF slot 1 added Mar 6 02:58:02.126117 kernel: hv_utils: Shutdown IC version 3.2 Mar 6 02:58:02.406469 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Mar 6 02:58:02.406604 kernel: hv_utils: TimeSync IC version 4.0 Mar 6 02:58:02.402128 systemd-resolved[263]: Clock change detected. Flushing caches. Mar 6 02:58:02.412432 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:58:02.428420 kernel: sd 1:0:0:0: [sda] Write Protect is off Mar 6 02:58:02.428547 kernel: sd 1:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 6 02:58:02.428653 kernel: sd 1:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 6 02:58:02.443419 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:58:02.443432 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Mar 6 02:58:02.450181 kernel: hv_vmbus: registering driver hv_pci Mar 6 02:58:02.450212 kernel: sr 1:0:0:2: [sr0] scsi-1 drive Mar 6 02:58:02.456788 kernel: hv_pci 771b3534-3df4-4e32-990a-40763fca19e3: PCI VMBus probing: Using version 0x10004 Mar 6 02:58:02.456927 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 6 02:58:02.471190 kernel: hv_pci 771b3534-3df4-4e32-990a-40763fca19e3: PCI host bridge to bus 3df4:00 Mar 6 02:58:02.471328 kernel: pci_bus 3df4:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Mar 6 02:58:02.471406 kernel: pci_bus 3df4:00: No busn resource found for root bus, will use [bus 00-ff] Mar 6 02:58:02.475362 kernel: sr 1:0:0:2: Attached scsi CD-ROM sr0 Mar 6 02:58:02.481603 kernel: pci 3df4:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Mar 6 02:58:02.486197 
kernel: pci 3df4:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Mar 6 02:58:02.491189 kernel: pci 3df4:00:02.0: enabling Extended Tags Mar 6 02:58:02.506253 kernel: pci 3df4:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 3df4:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Mar 6 02:58:02.514731 kernel: pci_bus 3df4:00: busn_res: [bus 00-ff] end is updated to 00 Mar 6 02:58:02.514866 kernel: pci 3df4:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Mar 6 02:58:02.537195 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#220 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 6 02:58:02.559179 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#241 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Mar 6 02:58:02.583020 kernel: mlx5_core 3df4:00:02.0: enabling device (0000 -> 0002) Mar 6 02:58:02.591159 kernel: mlx5_core 3df4:00:02.0: PTM is not supported by PCIe Mar 6 02:58:02.591304 kernel: mlx5_core 3df4:00:02.0: firmware version: 16.30.5026 Mar 6 02:58:02.761014 kernel: hv_netvsc 00224877-4608-0022-4877-460800224877 eth0: VF registering: eth1 Mar 6 02:58:02.761199 kernel: mlx5_core 3df4:00:02.0 eth1: joined to eth0 Mar 6 02:58:02.765881 kernel: mlx5_core 3df4:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Mar 6 02:58:02.774188 kernel: mlx5_core 3df4:00:02.0 enP15860s1: renamed from eth1 Mar 6 02:58:03.345423 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Mar 6 02:58:03.476825 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 6 02:58:03.495132 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Mar 6 02:58:03.644738 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Mar 6 02:58:03.649548 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. 
Mar 6 02:58:03.663084 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 6 02:58:03.670509 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 02:58:03.678538 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 6 02:58:03.687737 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 6 02:58:03.696725 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 6 02:58:03.720750 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 6 02:58:03.737232 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 6 02:58:03.753249 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:58:04.771212 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:58:04.772153 disk-uuid[659]: The operation has completed successfully. Mar 6 02:58:04.835591 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 6 02:58:04.835676 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 6 02:58:04.863363 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 6 02:58:04.886246 sh[824]: Success Mar 6 02:58:04.935158 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 6 02:58:04.935196 kernel: device-mapper: uevent: version 1.0.3 Mar 6 02:58:04.940305 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 6 02:58:04.948189 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 6 02:58:05.434267 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 6 02:58:05.449476 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 6 02:58:05.454538 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 6 02:58:05.475186 kernel: BTRFS: device fsid fcb4e7bf-1206-4803-90fb-6606b15e3aea devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (842) Mar 6 02:58:05.484547 kernel: BTRFS info (device dm-0): first mount of filesystem fcb4e7bf-1206-4803-90fb-6606b15e3aea Mar 6 02:58:05.484556 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:58:06.058360 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 6 02:58:06.058442 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 6 02:58:06.141734 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 6 02:58:06.145640 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 6 02:58:06.153613 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 6 02:58:06.154202 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 6 02:58:06.176811 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 6 02:58:06.206198 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (870) Mar 6 02:58:06.216962 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:58:06.216993 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:58:06.260593 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:58:06.260627 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:58:06.269203 kernel: BTRFS info (device sda6): last unmount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:58:06.270222 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 6 02:58:06.274592 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Mar 6 02:58:06.288663 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 6 02:58:06.300055 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 6 02:58:06.324907 systemd-networkd[1012]: lo: Link UP Mar 6 02:58:06.324918 systemd-networkd[1012]: lo: Gained carrier Mar 6 02:58:06.325609 systemd-networkd[1012]: Enumeration completed Mar 6 02:58:06.327619 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 6 02:58:06.327775 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:58:06.327777 systemd-networkd[1012]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 02:58:06.335221 systemd[1]: Reached target network.target - Network. Mar 6 02:58:06.404185 kernel: mlx5_core 3df4:00:02.0 enP15860s1: Link up Mar 6 02:58:06.435832 systemd-networkd[1012]: enP15860s1: Link UP Mar 6 02:58:06.438986 kernel: hv_netvsc 00224877-4608-0022-4877-460800224877 eth0: Data path switched to VF: enP15860s1 Mar 6 02:58:06.435888 systemd-networkd[1012]: eth0: Link UP Mar 6 02:58:06.436003 systemd-networkd[1012]: eth0: Gained carrier Mar 6 02:58:06.436011 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:58:06.453531 systemd-networkd[1012]: enP15860s1: Gained carrier Mar 6 02:58:06.466200 systemd-networkd[1012]: eth0: DHCPv4 address 10.200.20.33/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 6 02:58:07.550366 systemd-networkd[1012]: eth0: Gained IPv6LL Mar 6 02:58:08.333058 ignition[1011]: Ignition 2.22.0 Mar 6 02:58:08.333072 ignition[1011]: Stage: fetch-offline Mar 6 02:58:08.335781 ignition[1011]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:08.337215 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Mar 6 02:58:08.335788 ignition[1011]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:08.345074 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 6 02:58:08.335864 ignition[1011]: parsed url from cmdline: "" Mar 6 02:58:08.335867 ignition[1011]: no config URL provided Mar 6 02:58:08.335870 ignition[1011]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:58:08.335875 ignition[1011]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:58:08.335878 ignition[1011]: failed to fetch config: resource requires networking Mar 6 02:58:08.336092 ignition[1011]: Ignition finished successfully Mar 6 02:58:08.375504 ignition[1021]: Ignition 2.22.0 Mar 6 02:58:08.375517 ignition[1021]: Stage: fetch Mar 6 02:58:08.375676 ignition[1021]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:08.375682 ignition[1021]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:08.375752 ignition[1021]: parsed url from cmdline: "" Mar 6 02:58:08.375755 ignition[1021]: no config URL provided Mar 6 02:58:08.375759 ignition[1021]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:58:08.375769 ignition[1021]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:58:08.375784 ignition[1021]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 6 02:58:08.491191 ignition[1021]: GET result: OK Mar 6 02:58:08.491254 ignition[1021]: config has been read from IMDS userdata Mar 6 02:58:08.491274 ignition[1021]: parsing config with SHA512: c78b7bd7315380da4477d0cf93e332f4bfbe9f737ad8cbb6b51c7183fdd038edf9ebb5756eec61c6cd7701e08c05846e2bf004c9f263aaebd132281b836d35d4 Mar 6 02:58:08.494003 unknown[1021]: fetched base config from "system" Mar 6 02:58:08.494227 ignition[1021]: fetch: fetch complete Mar 6 02:58:08.494008 unknown[1021]: fetched base config from "system" Mar 6 02:58:08.494230 ignition[1021]: fetch: fetch passed Mar 6 02:58:08.494011 unknown[1021]: 
fetched user config from "azure" Mar 6 02:58:08.494265 ignition[1021]: Ignition finished successfully Mar 6 02:58:08.499396 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 6 02:58:08.506399 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 6 02:58:08.545653 ignition[1028]: Ignition 2.22.0 Mar 6 02:58:08.548016 ignition[1028]: Stage: kargs Mar 6 02:58:08.548203 ignition[1028]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:08.553193 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 6 02:58:08.548210 ignition[1028]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:08.561293 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 6 02:58:08.548718 ignition[1028]: kargs: kargs passed Mar 6 02:58:08.548753 ignition[1028]: Ignition finished successfully Mar 6 02:58:08.592617 ignition[1034]: Ignition 2.22.0 Mar 6 02:58:08.592626 ignition[1034]: Stage: disks Mar 6 02:58:08.592771 ignition[1034]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:08.592778 ignition[1034]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:08.601509 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 6 02:58:08.597920 ignition[1034]: disks: disks passed Mar 6 02:58:08.605717 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 6 02:58:08.597962 ignition[1034]: Ignition finished successfully Mar 6 02:58:08.614145 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 6 02:58:08.622689 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 6 02:58:08.629036 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 02:58:08.637150 systemd[1]: Reached target basic.target - Basic System. Mar 6 02:58:08.644264 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 6 02:58:08.768068 systemd-fsck[1042]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Mar 6 02:58:08.775966 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 6 02:58:08.781902 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 6 02:58:09.257187 kernel: EXT4-fs (sda9): mounted filesystem f0884ab3-756d-49e8-9d95-af187b4f35fb r/w with ordered data mode. Quota mode: none. Mar 6 02:58:09.257396 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 6 02:58:09.260985 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 6 02:58:09.299313 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 02:58:09.306073 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 6 02:58:09.316855 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 6 02:58:09.325244 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 6 02:58:09.325279 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 02:58:09.333712 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 6 02:58:09.347309 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 6 02:58:09.373510 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1056) Mar 6 02:58:09.373529 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:58:09.378500 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:58:09.387496 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:58:09.387524 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:58:09.389090 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 6 02:58:10.399303 coreos-metadata[1058]: Mar 06 02:58:10.399 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 6 02:58:10.405636 coreos-metadata[1058]: Mar 06 02:58:10.404 INFO Fetch successful Mar 6 02:58:10.405636 coreos-metadata[1058]: Mar 06 02:58:10.404 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 6 02:58:10.417626 coreos-metadata[1058]: Mar 06 02:58:10.417 INFO Fetch successful Mar 6 02:58:10.417626 coreos-metadata[1058]: Mar 06 02:58:10.417 INFO wrote hostname ci-4459.2.3-n-38e0d2a52a to /sysroot/etc/hostname Mar 6 02:58:10.422300 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 6 02:58:10.724479 initrd-setup-root[1087]: cut: /sysroot/etc/passwd: No such file or directory Mar 6 02:58:10.821154 initrd-setup-root[1094]: cut: /sysroot/etc/group: No such file or directory Mar 6 02:58:10.826989 initrd-setup-root[1101]: cut: /sysroot/etc/shadow: No such file or directory Mar 6 02:58:10.832471 initrd-setup-root[1108]: cut: /sysroot/etc/gshadow: No such file or directory Mar 6 02:58:12.642200 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 6 02:58:12.647662 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 6 02:58:12.669556 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 6 02:58:12.681557 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 6 02:58:12.692206 kernel: BTRFS info (device sda6): last unmount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:58:12.709368 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 6 02:58:12.720822 ignition[1181]: INFO : Ignition 2.22.0 Mar 6 02:58:12.720822 ignition[1181]: INFO : Stage: mount Mar 6 02:58:12.728227 ignition[1181]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:12.728227 ignition[1181]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:12.728227 ignition[1181]: INFO : mount: mount passed Mar 6 02:58:12.728227 ignition[1181]: INFO : Ignition finished successfully Mar 6 02:58:12.723899 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 6 02:58:12.733263 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 6 02:58:12.758755 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 02:58:12.782192 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1189) Mar 6 02:58:12.791431 kernel: BTRFS info (device sda6): first mount of filesystem 890f9900-ea91-473b-9515-ad9b05b1880b Mar 6 02:58:12.791465 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 6 02:58:12.800580 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:58:12.800595 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:58:12.802007 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 6 02:58:12.830695 ignition[1206]: INFO : Ignition 2.22.0 Mar 6 02:58:12.834413 ignition[1206]: INFO : Stage: files Mar 6 02:58:12.834413 ignition[1206]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:12.834413 ignition[1206]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:12.834413 ignition[1206]: DEBUG : files: compiled without relabeling support, skipping Mar 6 02:58:12.864429 ignition[1206]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 6 02:58:12.864429 ignition[1206]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 6 02:58:12.974746 ignition[1206]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 6 02:58:12.980254 ignition[1206]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 6 02:58:12.980254 ignition[1206]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 6 02:58:12.977553 unknown[1206]: wrote ssh authorized keys file for user: core Mar 6 02:58:13.016842 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 6 02:58:13.024676 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 6 02:58:13.052257 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 6 02:58:13.146218 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 6 
02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:58:13.153968 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw" Mar 6 02:58:13.207689 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1 Mar 6 02:58:13.555682 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 6 02:58:14.111364 ignition[1206]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw" Mar 6 02:58:14.111364 ignition[1206]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 6 02:58:14.821513 ignition[1206]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:58:14.835222 ignition[1206]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:58:14.835222 ignition[1206]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 6 02:58:14.835222 ignition[1206]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 6 02:58:14.862284 ignition[1206]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 6 02:58:14.862284 ignition[1206]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:58:14.862284 ignition[1206]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:58:14.862284 ignition[1206]: INFO : files: files passed Mar 6 02:58:14.862284 ignition[1206]: INFO : Ignition finished successfully Mar 6 02:58:14.846489 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 6 02:58:14.853658 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 6 02:58:14.875554 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 6 02:58:14.889097 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 6 02:58:14.889213 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 6 02:58:14.938639 initrd-setup-root-after-ignition[1236]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:58:14.938639 initrd-setup-root-after-ignition[1236]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:58:14.951603 initrd-setup-root-after-ignition[1240]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:58:14.951362 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 02:58:14.956700 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 6 02:58:14.968273 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 6 02:58:15.008579 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 6 02:58:15.008686 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 6 02:58:15.017414 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 6 02:58:15.026073 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 6 02:58:15.034016 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 6 02:58:15.034570 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 6 02:58:15.079078 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 02:58:15.085114 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 6 02:58:15.110135 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 6 02:58:15.114854 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 6 02:58:15.123918 systemd[1]: Stopped target timers.target - Timer Units. Mar 6 02:58:15.131785 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 6 02:58:15.131866 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 02:58:15.143462 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 6 02:58:15.147609 systemd[1]: Stopped target basic.target - Basic System. Mar 6 02:58:15.155619 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 6 02:58:15.163496 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 02:58:15.171461 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 6 02:58:15.179832 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 6 02:58:15.188690 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 6 02:58:15.197015 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 02:58:15.206280 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 6 02:58:15.214244 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 6 02:58:15.222978 systemd[1]: Stopped target swap.target - Swaps. Mar 6 02:58:15.230005 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 6 02:58:15.230104 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 6 02:58:15.240677 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 6 02:58:15.245089 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 6 02:58:15.253523 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 6 02:58:15.257339 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 6 02:58:15.262456 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Mar 6 02:58:15.262527 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 6 02:58:15.275132 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 6 02:58:15.275223 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 02:58:15.280295 systemd[1]: ignition-files.service: Deactivated successfully. Mar 6 02:58:15.280360 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 6 02:58:15.287789 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 6 02:58:15.287849 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 6 02:58:15.298764 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 6 02:58:15.326336 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 6 02:58:15.337389 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 6 02:58:15.337500 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 6 02:58:15.347328 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 6 02:58:15.347409 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 6 02:58:15.389516 ignition[1260]: INFO : Ignition 2.22.0 Mar 6 02:58:15.389516 ignition[1260]: INFO : Stage: umount Mar 6 02:58:15.389516 ignition[1260]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:58:15.389516 ignition[1260]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 6 02:58:15.389516 ignition[1260]: INFO : umount: umount passed Mar 6 02:58:15.389516 ignition[1260]: INFO : Ignition finished successfully Mar 6 02:58:15.370198 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 6 02:58:15.370486 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 6 02:58:15.388391 systemd[1]: ignition-mount.service: Deactivated successfully. 
Mar 6 02:58:15.388463 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 6 02:58:15.393944 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 6 02:58:15.394645 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 6 02:58:15.394713 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 6 02:58:15.400303 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 6 02:58:15.400340 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 6 02:58:15.406868 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 6 02:58:15.406895 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 6 02:58:15.414448 systemd[1]: Stopped target network.target - Network.
Mar 6 02:58:15.422586 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 6 02:58:15.422615 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 6 02:58:15.430561 systemd[1]: Stopped target paths.target - Path Units.
Mar 6 02:58:15.438616 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 6 02:58:15.447189 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:58:15.455355 systemd[1]: Stopped target slices.target - Slice Units.
Mar 6 02:58:15.462386 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 6 02:58:15.469704 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 6 02:58:15.469733 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 02:58:15.476827 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 6 02:58:15.476846 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 02:58:15.484484 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 6 02:58:15.484516 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 6 02:58:15.491855 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 6 02:58:15.491880 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 6 02:58:15.499744 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 6 02:58:15.507116 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 6 02:58:15.514953 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 6 02:58:15.515027 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 6 02:58:15.522328 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 6 02:58:15.522409 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 6 02:58:15.531035 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 6 02:58:15.531147 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 6 02:58:15.543746 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 6 02:58:15.695884 kernel: hv_netvsc 00224877-4608-0022-4877-460800224877 eth0: Data path switched from VF: enP15860s1
Mar 6 02:58:15.543887 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 6 02:58:15.543976 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 6 02:58:15.557458 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 6 02:58:15.558099 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 6 02:58:15.565013 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 6 02:58:15.565045 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:58:15.573591 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 6 02:58:15.585318 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 6 02:58:15.585368 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 6 02:58:15.594013 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 6 02:58:15.594055 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:58:15.602072 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 6 02:58:15.602105 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:58:15.606570 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 6 02:58:15.606603 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:58:15.618727 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:58:15.624275 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 6 02:58:15.624337 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:58:15.645087 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 6 02:58:15.645224 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:58:15.653246 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 6 02:58:15.653284 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:58:15.660929 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 6 02:58:15.660960 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:58:15.669057 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 6 02:58:15.669089 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 02:58:15.681004 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 6 02:58:15.681044 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 6 02:58:15.695761 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 6 02:58:15.695795 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 02:58:15.705126 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 6 02:58:15.714246 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 6 02:58:15.714293 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:58:15.727680 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 6 02:58:15.727713 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:58:15.736541 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 6 02:58:15.736597 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:58:15.749998 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 6 02:58:15.750034 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:58:15.755123 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:58:15.755150 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:58:15.769404 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 6 02:58:15.769443 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 6 02:58:15.769463 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 6 02:58:15.769483 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:58:15.769707 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 6 02:58:15.769774 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 6 02:58:15.778732 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 6 02:58:15.778854 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 6 02:58:15.785882 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 6 02:58:15.794215 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 6 02:58:15.816563 systemd[1]: Switching root.
Mar 6 02:58:16.004637 systemd-journald[225]: Journal stopped
Mar 6 02:58:23.463686 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Mar 6 02:58:23.463703 kernel: SELinux: policy capability network_peer_controls=1
Mar 6 02:58:23.463710 kernel: SELinux: policy capability open_perms=1
Mar 6 02:58:23.463716 kernel: SELinux: policy capability extended_socket_class=1
Mar 6 02:58:23.463722 kernel: SELinux: policy capability always_check_network=0
Mar 6 02:58:23.463727 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 6 02:58:23.463733 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 6 02:58:23.463739 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 6 02:58:23.463744 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 6 02:58:23.463749 kernel: SELinux: policy capability userspace_initial_context=0
Mar 6 02:58:23.463756 kernel: audit: type=1403 audit(1772765897.567:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 6 02:58:23.463762 systemd[1]: Successfully loaded SELinux policy in 351.344ms.
Mar 6 02:58:23.463769 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.348ms.
Mar 6 02:58:23.463775 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:58:23.463782 systemd[1]: Detected virtualization microsoft.
Mar 6 02:58:23.463789 systemd[1]: Detected architecture arm64.
Mar 6 02:58:23.463794 systemd[1]: Detected first boot.
Mar 6 02:58:23.463800 systemd[1]: Hostname set to .
Mar 6 02:58:23.463806 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:58:23.463812 zram_generator::config[1302]: No configuration found.
Mar 6 02:58:23.463818 kernel: NET: Registered PF_VSOCK protocol family
Mar 6 02:58:23.463824 systemd[1]: Populated /etc with preset unit settings.
Mar 6 02:58:23.463830 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 6 02:58:23.463837 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 6 02:58:23.463843 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 6 02:58:23.463849 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 6 02:58:23.463855 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 6 02:58:23.463861 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 6 02:58:23.463867 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 6 02:58:23.463873 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 6 02:58:23.463880 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 6 02:58:23.463887 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 6 02:58:23.463893 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 6 02:58:23.463899 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 6 02:58:23.463905 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:58:23.463911 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:58:23.463917 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 6 02:58:23.463923 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 6 02:58:23.463930 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 6 02:58:23.463936 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:58:23.463944 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 6 02:58:23.463950 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:58:23.463956 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:58:23.463962 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 6 02:58:23.463968 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 6 02:58:23.463974 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 6 02:58:23.463981 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 6 02:58:23.463987 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 6 02:58:23.463993 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 6 02:58:23.463999 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:58:23.464005 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:58:23.464011 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 6 02:58:23.464018 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 6 02:58:23.464025 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 6 02:58:23.464032 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:58:23.464038 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:58:23.464044 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:58:23.464050 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 6 02:58:23.464057 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 6 02:58:23.464063 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 6 02:58:23.464069 systemd[1]: Mounting media.mount - External Media Directory...
Mar 6 02:58:23.464076 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 6 02:58:23.464082 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 6 02:58:23.464088 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 6 02:58:23.464095 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 6 02:58:23.464101 systemd[1]: Reached target machines.target - Containers.
Mar 6 02:58:23.464107 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 6 02:58:23.464114 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:58:23.464120 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 02:58:23.464126 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 6 02:58:23.464132 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:58:23.464138 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 02:58:23.464145 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:58:23.464152 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 6 02:58:23.464158 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:58:23.464165 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 6 02:58:23.464193 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 6 02:58:23.464201 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 6 02:58:23.464207 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 6 02:58:23.464214 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 6 02:58:23.464220 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 6 02:58:23.464227 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 02:58:23.464246 systemd-journald[1392]: Collecting audit messages is disabled.
Mar 6 02:58:23.464261 systemd-journald[1392]: Journal started
Mar 6 02:58:23.464275 systemd-journald[1392]: Runtime Journal (/run/log/journal/dd07a9bc08d94f8bb68afe149dee72db) is 8M, max 78.3M, 70.3M free.
Mar 6 02:58:22.722946 systemd[1]: Queued start job for default target multi-user.target.
Mar 6 02:58:22.727627 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 6 02:58:22.728004 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 6 02:58:22.728276 systemd[1]: systemd-journald.service: Consumed 2.327s CPU time.
Mar 6 02:58:23.478186 kernel: fuse: init (API version 7.41)
Mar 6 02:58:23.478221 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 02:58:23.496666 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 02:58:23.496709 kernel: loop: module loaded
Mar 6 02:58:23.514358 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 6 02:58:23.527228 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 6 02:58:23.543971 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 02:58:23.552666 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 6 02:58:23.552703 kernel: ACPI: bus type drm_connector registered
Mar 6 02:58:23.552712 systemd[1]: Stopped verity-setup.service.
Mar 6 02:58:23.569345 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 02:58:23.569918 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 6 02:58:23.574133 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 6 02:58:23.578646 systemd[1]: Mounted media.mount - External Media Directory.
Mar 6 02:58:23.582590 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 6 02:58:23.586980 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 6 02:58:23.591541 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 6 02:58:23.595549 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 6 02:58:23.600644 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:58:23.605851 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 6 02:58:23.605976 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 6 02:58:23.612422 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 02:58:23.612537 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 02:58:23.617391 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 02:58:23.617503 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 02:58:23.622030 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 02:58:23.622137 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 02:58:23.627299 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 6 02:58:23.627413 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 6 02:58:23.632113 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 02:58:23.632304 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 02:58:23.636816 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:58:23.642214 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 6 02:58:23.647597 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 6 02:58:23.652949 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:58:23.664793 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 02:58:23.672269 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 6 02:58:23.679341 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 6 02:58:23.684050 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 6 02:58:23.684076 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 6 02:58:23.688835 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 6 02:58:23.701282 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 6 02:58:23.705485 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 02:58:23.706616 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 6 02:58:23.720606 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 6 02:58:23.725357 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 02:58:23.725992 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 6 02:58:23.730424 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 02:58:23.731046 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 02:58:23.736040 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 6 02:58:23.743457 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 6 02:58:23.749999 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:58:23.755663 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 6 02:58:23.760878 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 6 02:58:23.765847 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 6 02:58:23.772659 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 6 02:58:23.781441 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 6 02:58:23.817375 systemd-tmpfiles[1443]: ACLs are not supported, ignoring.
Mar 6 02:58:23.817574 systemd-tmpfiles[1443]: ACLs are not supported, ignoring.
Mar 6 02:58:23.819842 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 02:58:23.833211 kernel: loop0: detected capacity change from 0 to 119840
Mar 6 02:58:23.830830 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 6 02:58:23.847366 systemd-journald[1392]: Time spent on flushing to /var/log/journal/dd07a9bc08d94f8bb68afe149dee72db is 8.847ms for 938 entries.
Mar 6 02:58:23.847366 systemd-journald[1392]: System Journal (/var/log/journal/dd07a9bc08d94f8bb68afe149dee72db) is 8M, max 2.6G, 2.6G free.
Mar 6 02:58:23.906973 systemd-journald[1392]: Received client request to flush runtime journal.
Mar 6 02:58:23.854730 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 6 02:58:23.855968 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 6 02:58:23.907971 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 6 02:58:23.938242 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:58:23.980046 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 6 02:58:23.987863 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 02:58:24.004480 systemd-tmpfiles[1460]: ACLs are not supported, ignoring.
Mar 6 02:58:24.004491 systemd-tmpfiles[1460]: ACLs are not supported, ignoring.
Mar 6 02:58:24.006674 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:58:24.516946 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 6 02:58:24.523085 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:58:24.544268 systemd-udevd[1464]: Using default interface naming scheme 'v255'.
Mar 6 02:58:24.574193 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 6 02:58:24.645194 kernel: loop1: detected capacity change from 0 to 27936
Mar 6 02:58:24.758191 kernel: loop2: detected capacity change from 0 to 197488
Mar 6 02:58:24.793196 kernel: loop3: detected capacity change from 0 to 100632
Mar 6 02:58:24.819258 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:58:24.833204 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 6 02:58:24.868453 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 6 02:58:24.918460 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 6 02:58:24.977192 kernel: mousedev: PS/2 mouse device common for all mice
Mar 6 02:58:24.977256 kernel: hv_vmbus: registering driver hv_balloon
Mar 6 02:58:24.997185 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#162 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Mar 6 02:58:24.997400 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 6 02:58:25.004217 kernel: hv_balloon: Memory hot add disabled on ARM64
Mar 6 02:58:25.009570 kernel: loop4: detected capacity change from 0 to 119840
Mar 6 02:58:25.033188 kernel: loop5: detected capacity change from 0 to 27936
Mar 6 02:58:25.049120 kernel: hv_vmbus: registering driver hyperv_fb
Mar 6 02:58:25.049182 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 6 02:58:25.055025 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 6 02:58:25.055081 kernel: loop6: detected capacity change from 0 to 197488
Mar 6 02:58:25.065029 kernel: Console: switching to colour dummy device 80x25
Mar 6 02:58:25.074829 kernel: Console: switching to colour frame buffer device 128x48
Mar 6 02:58:25.076359 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:58:25.088714 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:58:25.088874 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:58:25.102636 kernel: loop7: detected capacity change from 0 to 100632
Mar 6 02:58:25.108331 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:58:25.115412 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:58:25.116499 (sd-merge)[1531]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Mar 6 02:58:25.116831 (sd-merge)[1531]: Merged extensions into '/usr'.
Mar 6 02:58:25.119232 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:58:25.124811 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 6 02:58:25.140338 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 02:58:25.146447 systemd[1]: Reload requested from client PID 1441 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 6 02:58:25.146461 systemd[1]: Reloading...
Mar 6 02:58:25.214233 zram_generator::config[1588]: No configuration found.
Mar 6 02:58:25.225200 kernel: MACsec IEEE 802.1AE
Mar 6 02:58:25.301054 systemd-networkd[1498]: lo: Link UP
Mar 6 02:58:25.301065 systemd-networkd[1498]: lo: Gained carrier
Mar 6 02:58:25.301935 systemd-networkd[1498]: Enumeration completed
Mar 6 02:58:25.302159 systemd-networkd[1498]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 02:58:25.302161 systemd-networkd[1498]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 6 02:58:25.351194 kernel: mlx5_core 3df4:00:02.0 enP15860s1: Link up
Mar 6 02:58:25.357716 systemd[1]: Reloading finished in 211 ms.
Mar 6 02:58:25.366961 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 6 02:58:25.383719 kernel: hv_netvsc 00224877-4608-0022-4877-460800224877 eth0: Data path switched to VF: enP15860s1
Mar 6 02:58:25.378483 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 6 02:58:25.385902 systemd-networkd[1498]: enP15860s1: Link UP
Mar 6 02:58:25.386018 systemd-networkd[1498]: eth0: Link UP
Mar 6 02:58:25.386024 systemd-networkd[1498]: eth0: Gained carrier
Mar 6 02:58:25.386035 systemd-networkd[1498]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 02:58:25.389731 systemd-networkd[1498]: enP15860s1: Gained carrier
Mar 6 02:58:25.395455 systemd-networkd[1498]: eth0: DHCPv4 address 10.200.20.33/24, gateway 10.200.20.1 acquired from 168.63.129.16
Mar 6 02:58:25.405817 systemd[1]: Starting ensure-sysext.service...
Mar 6 02:58:25.410885 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 6 02:58:25.417276 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 6 02:58:25.428273 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 02:58:25.440783 systemd-tmpfiles[1693]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 6 02:58:25.440806 systemd-tmpfiles[1693]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 6 02:58:25.440943 systemd-tmpfiles[1693]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 6 02:58:25.441069 systemd-tmpfiles[1693]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 6 02:58:25.441502 systemd-tmpfiles[1693]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 6 02:58:25.441638 systemd-tmpfiles[1693]: ACLs are not supported, ignoring.
Mar 6 02:58:25.441665 systemd-tmpfiles[1693]: ACLs are not supported, ignoring.
Mar 6 02:58:25.442266 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 6 02:58:25.443862 systemd-tmpfiles[1693]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:58:25.443867 systemd-tmpfiles[1693]: Skipping /boot
Mar 6 02:58:25.448460 systemd-tmpfiles[1693]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:58:25.448471 systemd-tmpfiles[1693]: Skipping /boot
Mar 6 02:58:25.449840 systemd[1]: Reload requested from client PID 1689 ('systemctl') (unit ensure-sysext.service)...
Mar 6 02:58:25.449925 systemd[1]: Reloading...
Mar 6 02:58:25.503204 zram_generator::config[1726]: No configuration found.
Mar 6 02:58:25.650140 systemd[1]: Reloading finished in 199 ms.
Mar 6 02:58:25.664713 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 6 02:58:25.670148 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:58:25.681970 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 6 02:58:25.689808 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 6 02:58:25.695340 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 6 02:58:25.703360 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 6 02:58:25.715266 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 6 02:58:25.722592 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 6 02:58:25.728375 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 6 02:58:25.735127 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:58:25.738610 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:58:25.745144 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:58:25.751375 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:58:25.757463 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 02:58:25.757550 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 6 02:58:25.759647 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 02:58:25.762208 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 02:58:25.766968 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 02:58:25.769812 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 02:58:25.776608 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 02:58:25.776743 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 02:58:25.787005 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 6 02:58:25.796783 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 6 02:58:25.804594 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:58:25.805847 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:58:25.813462 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 02:58:25.813826 systemd-resolved[1791]: Positive Trust Anchors:
Mar 6 02:58:25.814043 systemd-resolved[1791]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 6 02:58:25.814116 systemd-resolved[1791]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 6 02:58:25.818308 augenrules[1820]: No rules
Mar 6 02:58:25.821391 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:58:25.828348 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:58:25.834354 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 02:58:25.834388 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 6 02:58:25.834420 systemd[1]: Reached target time-set.target - System Time Set.
Mar 6 02:58:25.839192 systemd[1]: Finished ensure-sysext.service.
Mar 6 02:58:25.844621 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 6 02:58:25.844850 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 6 02:58:25.849682 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 02:58:25.849886 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 02:58:25.854815 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 02:58:25.854941 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 02:58:25.859918 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 02:58:25.860034 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 02:58:25.865009 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 02:58:25.865123 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 02:58:25.873519 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 02:58:25.873582 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 02:58:25.889421 systemd-resolved[1791]: Using system hostname 'ci-4459.2.3-n-38e0d2a52a'.
Mar 6 02:58:25.890783 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 6 02:58:25.895841 systemd[1]: Reached target network.target - Network.
Mar 6 02:58:25.899401 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 6 02:58:26.028623 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:58:26.815716 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 6 02:58:26.821161 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 6 02:58:27.262302 systemd-networkd[1498]: eth0: Gained IPv6LL
Mar 6 02:58:27.264403 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 6 02:58:27.269568 systemd[1]: Reached target network-online.target - Network is Online.
Mar 6 02:58:33.834697 ldconfig[1436]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 6 02:58:33.846243 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 6 02:58:33.852387 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 6 02:58:33.894991 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 6 02:58:33.899606 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 6 02:58:33.903789 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 6 02:58:33.908568 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 6 02:58:33.913493 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 6 02:58:33.917674 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 6 02:58:33.922655 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 6 02:58:33.927425 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 6 02:58:33.927451 systemd[1]: Reached target paths.target - Path Units.
Mar 6 02:58:33.930866 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 02:58:33.935578 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 6 02:58:33.940858 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 6 02:58:33.945899 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 6 02:58:33.950935 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 6 02:58:33.955903 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 6 02:58:33.961474 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 6 02:58:33.965700 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 6 02:58:33.970903 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 6 02:58:33.975194 systemd[1]: Reached target sockets.target - Socket Units.
Mar 6 02:58:33.978977 systemd[1]: Reached target basic.target - Basic System.
Mar 6 02:58:33.982694 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 6 02:58:33.982714 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 6 02:58:33.984414 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 6 02:58:33.995975 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 6 02:58:34.003297 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 6 02:58:34.016476 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 6 02:58:34.021052 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 6 02:58:34.029326 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 6 02:58:34.035431 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 6 02:58:34.039553 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 6 02:58:34.041305 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Mar 6 02:58:34.046095 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Mar 6 02:58:34.046895 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:58:34.053288 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 6 02:58:34.063003 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 6 02:58:34.068257 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 6 02:58:34.072398 jq[1853]: false
Mar 6 02:58:34.075841 chronyd[1845]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Mar 6 02:58:34.075933 KVP[1855]: KVP starting; pid is:1855
Mar 6 02:58:34.076588 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 6 02:58:34.090441 kernel: hv_utils: KVP IC version 4.0
Mar 6 02:58:34.086462 KVP[1855]: KVP LIC Version: 3.1
Mar 6 02:58:34.086563 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 6 02:58:34.099168 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 6 02:58:34.105061 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 6 02:58:34.106219 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 6 02:58:34.106642 systemd[1]: Starting update-engine.service - Update Engine...
Mar 6 02:58:34.113924 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 6 02:58:34.120463 jq[1873]: true
Mar 6 02:58:34.122510 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 6 02:58:34.129312 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 6 02:58:34.131479 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 6 02:58:34.132997 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 6 02:58:34.133285 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 6 02:58:34.138780 chronyd[1845]: Timezone right/UTC failed leap second check, ignoring
Mar 6 02:58:34.139863 systemd[1]: Started chronyd.service - NTP client/server.
Mar 6 02:58:34.138913 chronyd[1845]: Loaded seccomp filter (level 2)
Mar 6 02:58:34.146937 systemd[1]: motdgen.service: Deactivated successfully.
Mar 6 02:58:34.148468 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 6 02:58:34.162554 jq[1881]: true
Mar 6 02:58:34.169739 (ntainerd)[1884]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 6 02:58:34.229758 systemd-logind[1867]: New seat seat0.
Mar 6 02:58:34.231538 extend-filesystems[1854]: Found /dev/sda6
Mar 6 02:58:34.242567 update_engine[1871]: I20260306 02:58:34.231947 1871 main.cc:92] Flatcar Update Engine starting
Mar 6 02:58:34.242718 tar[1880]: linux-arm64/LICENSE
Mar 6 02:58:34.242718 tar[1880]: linux-arm64/helm
Mar 6 02:58:34.233521 systemd-logind[1867]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Mar 6 02:58:34.235022 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 6 02:58:34.244770 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 6 02:58:34.267039 extend-filesystems[1854]: Found /dev/sda9
Mar 6 02:58:34.278118 extend-filesystems[1854]: Checking size of /dev/sda9
Mar 6 02:58:34.290971 bash[1909]: Updated "/home/core/.ssh/authorized_keys"
Mar 6 02:58:34.292137 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 6 02:58:34.299680 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 6 02:58:34.321379 sshd_keygen[1870]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 6 02:58:34.322054 extend-filesystems[1854]: Old size kept for /dev/sda9
Mar 6 02:58:34.332741 dbus-daemon[1850]: [system] SELinux support is enabled
Mar 6 02:58:34.329181 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 6 02:58:34.329364 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 6 02:58:34.340705 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 6 02:58:34.348884 update_engine[1871]: I20260306 02:58:34.347841 1871 update_check_scheduler.cc:74] Next update check in 10m55s
Mar 6 02:58:34.350748 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 6 02:58:34.350784 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 6 02:58:34.358678 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 6 02:58:34.359055 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 6 02:58:34.359070 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 6 02:58:34.367729 systemd[1]: Started update-engine.service - Update Engine.
Mar 6 02:58:34.376354 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 6 02:58:34.400626 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 6 02:58:34.411593 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 6 02:58:34.420954 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Mar 6 02:58:34.441663 coreos-metadata[1847]: Mar 06 02:58:34.441 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 6 02:58:34.442460 coreos-metadata[1847]: Mar 06 02:58:34.442 INFO Fetch successful
Mar 6 02:58:34.442460 coreos-metadata[1847]: Mar 06 02:58:34.442 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Mar 6 02:58:34.446679 coreos-metadata[1847]: Mar 06 02:58:34.446 INFO Fetch successful
Mar 6 02:58:34.447288 coreos-metadata[1847]: Mar 06 02:58:34.447 INFO Fetching http://168.63.129.16/machine/acd128b5-c7e6-4202-bd44-9fa3ddceee1f/5aa3017d%2D1741%2D4396%2Dbb4d%2Db9a2d5fc5b8a.%5Fci%2D4459.2.3%2Dn%2D38e0d2a52a?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Mar 6 02:58:34.449841 coreos-metadata[1847]: Mar 06 02:58:34.449 INFO Fetch successful
Mar 6 02:58:34.449841 coreos-metadata[1847]: Mar 06 02:58:34.449 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Mar 6 02:58:34.454847 systemd[1]: issuegen.service: Deactivated successfully.
Mar 6 02:58:34.455945 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 6 02:58:34.460721 coreos-metadata[1847]: Mar 06 02:58:34.460 INFO Fetch successful
Mar 6 02:58:34.480571 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Mar 6 02:58:34.489612 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 6 02:58:34.524330 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 6 02:58:34.532885 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 6 02:58:34.549876 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 6 02:58:34.556045 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 6 02:58:34.558409 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 6 02:58:34.564701 systemd[1]: Reached target getty.target - Login Prompts.
Mar 6 02:58:34.714409 tar[1880]: linux-arm64/README.md
Mar 6 02:58:34.727282 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 6 02:58:34.732070 locksmithd[1941]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 6 02:58:34.899118 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:58:35.106018 containerd[1884]: time="2026-03-06T02:58:35Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 6 02:58:35.110183 containerd[1884]: time="2026-03-06T02:58:35.109533380Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 6 02:58:35.114668 containerd[1884]: time="2026-03-06T02:58:35.114638276Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.224µs"
Mar 6 02:58:35.114668 containerd[1884]: time="2026-03-06T02:58:35.114662636Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 6 02:58:35.114742 containerd[1884]: time="2026-03-06T02:58:35.114675972Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 6 02:58:35.114823 containerd[1884]: time="2026-03-06T02:58:35.114804604Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 6 02:58:35.114823 containerd[1884]: time="2026-03-06T02:58:35.114820932Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 6 02:58:35.114848 containerd[1884]: time="2026-03-06T02:58:35.114839108Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 6 02:58:35.114888 containerd[1884]: time="2026-03-06T02:58:35.114875892Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 6 02:58:35.114888 containerd[1884]: time="2026-03-06T02:58:35.114887548Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115049 containerd[1884]: time="2026-03-06T02:58:35.115031548Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115049 containerd[1884]: time="2026-03-06T02:58:35.115047084Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115074 containerd[1884]: time="2026-03-06T02:58:35.115054124Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115074 containerd[1884]: time="2026-03-06T02:58:35.115060260Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115127 containerd[1884]: time="2026-03-06T02:58:35.115115836Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115320 containerd[1884]: time="2026-03-06T02:58:35.115300188Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115337 containerd[1884]: time="2026-03-06T02:58:35.115329540Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 6 02:58:35.115351 containerd[1884]: time="2026-03-06T02:58:35.115337804Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 6 02:58:35.115375 containerd[1884]: time="2026-03-06T02:58:35.115359300Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 6 02:58:35.115724 containerd[1884]: time="2026-03-06T02:58:35.115521620Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 6 02:58:35.115724 containerd[1884]: time="2026-03-06T02:58:35.115605604Z" level=info msg="metadata content store policy set" policy=shared
Mar 6 02:58:35.130206 containerd[1884]: time="2026-03-06T02:58:35.130184900Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 6 02:58:35.130302 containerd[1884]: time="2026-03-06T02:58:35.130287948Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 6 02:58:35.130347 containerd[1884]: time="2026-03-06T02:58:35.130337300Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 6 02:58:35.130382 containerd[1884]: time="2026-03-06T02:58:35.130372612Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 6 02:58:35.130436 containerd[1884]: time="2026-03-06T02:58:35.130424500Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 6 02:58:35.130481 containerd[1884]: time="2026-03-06T02:58:35.130469404Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 6 02:58:35.130525 containerd[1884]: time="2026-03-06T02:58:35.130514916Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 6 02:58:35.130562 containerd[1884]: time="2026-03-06T02:58:35.130552740Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 6 02:58:35.130608 containerd[1884]: time="2026-03-06T02:58:35.130598060Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 6 02:58:35.130643 containerd[1884]: time="2026-03-06T02:58:35.130634764Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 6 02:58:35.130683 containerd[1884]: time="2026-03-06T02:58:35.130674156Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 6 02:58:35.130721 containerd[1884]: time="2026-03-06T02:58:35.130712012Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 6 02:58:35.130859 containerd[1884]: time="2026-03-06T02:58:35.130845652Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 6 02:58:35.130912 containerd[1884]: time="2026-03-06T02:58:35.130902492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 6 02:58:35.130962 containerd[1884]: time="2026-03-06T02:58:35.130952180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 6 02:58:35.130998 containerd[1884]: time="2026-03-06T02:58:35.130989132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 6 02:58:35.131030 containerd[1884]: time="2026-03-06T02:58:35.131021916Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 6 02:58:35.131071 containerd[1884]: time="2026-03-06T02:58:35.131063532Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 6 02:58:35.131113 containerd[1884]: time="2026-03-06T02:58:35.131104348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 6 02:58:35.131153 containerd[1884]: time="2026-03-06T02:58:35.131144596Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 6 02:58:35.131227 containerd[1884]: time="2026-03-06T02:58:35.131215156Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 6 02:58:35.131270 containerd[1884]: time="2026-03-06T02:58:35.131261020Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 6 02:58:35.131303 containerd[1884]: time="2026-03-06T02:58:35.131295220Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 6 02:58:35.131378 containerd[1884]: time="2026-03-06T02:58:35.131367916Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 6 02:58:35.131432 containerd[1884]: time="2026-03-06T02:58:35.131421044Z" level=info msg="Start snapshots syncer"
Mar 6 02:58:35.131494 containerd[1884]: time="2026-03-06T02:58:35.131482948Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 6 02:58:35.131714 containerd[1884]: time="2026-03-06T02:58:35.131685180Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 6 02:58:35.131841 containerd[1884]: time="2026-03-06T02:58:35.131826876Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 6 02:58:35.131936 containerd[1884]: time="2026-03-06T02:58:35.131922924Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 6 02:58:35.132071 containerd[1884]: time="2026-03-06T02:58:35.132055548Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 6 02:58:35.132142 containerd[1884]: time="2026-03-06T02:58:35.132127668Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 6 02:58:35.132200 containerd[1884]: time="2026-03-06T02:58:35.132190524Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 6 02:58:35.132244 containerd[1884]: time="2026-03-06T02:58:35.132233212Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 6 02:58:35.132281 containerd[1884]: time="2026-03-06T02:58:35.132271660Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 6 02:58:35.132323 containerd[1884]: time="2026-03-06T02:58:35.132314260Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 6 02:58:35.132356 containerd[1884]: time="2026-03-06T02:58:35.132348572Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 6 02:58:35.132412 containerd[1884]: time="2026-03-06T02:58:35.132400580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 6 02:58:35.132452 containerd[1884]: time="2026-03-06T02:58:35.132443100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 6 02:58:35.132493 containerd[1884]: time="2026-03-06T02:58:35.132482836Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 6 02:58:35.132558 containerd[1884]: time="2026-03-06T02:58:35.132546844Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 6 02:58:35.132615 containerd[1884]: time="2026-03-06T02:58:35.132602524Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 6 02:58:35.132652 containerd[1884]: time="2026-03-06T02:58:35.132640596Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 6 02:58:35.132688 containerd[1884]: time="2026-03-06T02:58:35.132677652Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 6 02:58:35.132721 containerd[1884]: time="2026-03-06T02:58:35.132709988Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 6 02:58:35.132755 containerd[1884]: time="2026-03-06T02:58:35.132745012Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 6 02:58:35.132791 containerd[1884]: time="2026-03-06T02:58:35.132780508Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 6 02:58:35.132830 containerd[1884]: time="2026-03-06T02:58:35.132820516Z" level=info msg="runtime interface created"
Mar 6 02:58:35.132866 containerd[1884]: time="2026-03-06T02:58:35.132856036Z" level=info msg="created NRI interface"
Mar 6 02:58:35.132902 containerd[1884]: time="2026-03-06T02:58:35.132891332Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 6 02:58:35.132942 containerd[1884]: time="2026-03-06T02:58:35.132931916Z" level=info msg="Connect containerd service"
Mar 6 02:58:35.132990 containerd[1884]: time="2026-03-06T02:58:35.132979028Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 6 02:58:35.133614 containerd[1884]: time="2026-03-06T02:58:35.133593108Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 6 02:58:35.245704 (kubelet)[2033]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:58:35.514509 kubelet[2033]: E0306 02:58:35.514393 2033 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:58:35.516484 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:58:35.516699 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:58:35.518250 systemd[1]: kubelet.service: Consumed 486ms CPU time, 246.8M memory peak.
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689315180Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689370276Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689394284Z" level=info msg="Start subscribing containerd event"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689434852Z" level=info msg="Start recovering state"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689501412Z" level=info msg="Start event monitor"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689511516Z" level=info msg="Start cni network conf syncer for default"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689516628Z" level=info msg="Start streaming server"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689522676Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689527428Z" level=info msg="runtime interface starting up..."
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689531132Z" level=info msg="starting plugins..."
Mar 6 02:58:35.689627 containerd[1884]: time="2026-03-06T02:58:35.689542444Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 6 02:58:35.689945 systemd[1]: Started containerd.service - containerd container runtime.
Mar 6 02:58:35.694343 containerd[1884]: time="2026-03-06T02:58:35.691206212Z" level=info msg="containerd successfully booted in 0.585572s"
Mar 6 02:58:35.694810 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 6 02:58:35.701246 systemd[1]: Startup finished in 1.625s (kernel) + 16.328s (initrd) + 18.483s (userspace) = 36.438s.
Mar 6 02:58:36.307813 login[2014]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:36.308554 login[2015]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:58:36.313654 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 6 02:58:36.314509 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 6 02:58:36.319535 systemd-logind[1867]: New session 2 of user core. Mar 6 02:58:36.323072 systemd-logind[1867]: New session 1 of user core. Mar 6 02:58:36.355661 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 6 02:58:36.357890 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 6 02:58:36.364812 (systemd)[2060]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 6 02:58:36.366717 systemd-logind[1867]: New session c1 of user core. Mar 6 02:58:36.508285 systemd[2060]: Queued start job for default target default.target. Mar 6 02:58:36.523860 systemd[2060]: Created slice app.slice - User Application Slice. Mar 6 02:58:36.523985 systemd[2060]: Reached target paths.target - Paths. Mar 6 02:58:36.524071 systemd[2060]: Reached target timers.target - Timers. Mar 6 02:58:36.525005 systemd[2060]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 6 02:58:36.532104 systemd[2060]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 6 02:58:36.532150 systemd[2060]: Reached target sockets.target - Sockets. Mar 6 02:58:36.532188 systemd[2060]: Reached target basic.target - Basic System. Mar 6 02:58:36.532213 systemd[2060]: Reached target default.target - Main User Target. Mar 6 02:58:36.532231 systemd[2060]: Startup finished in 161ms. Mar 6 02:58:36.532313 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 6 02:58:36.534536 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 6 02:58:36.536519 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 6 02:58:37.506611 waagent[1991]: 2026-03-06T02:58:37.506539Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Mar 6 02:58:37.510952 waagent[1991]: 2026-03-06T02:58:37.510914Z INFO Daemon Daemon OS: flatcar 4459.2.3 Mar 6 02:58:37.514355 waagent[1991]: 2026-03-06T02:58:37.514326Z INFO Daemon Daemon Python: 3.11.13 Mar 6 02:58:37.520190 waagent[1991]: 2026-03-06T02:58:37.519240Z INFO Daemon Daemon Run daemon Mar 6 02:58:37.522269 waagent[1991]: 2026-03-06T02:58:37.522234Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.3' Mar 6 02:58:37.528672 waagent[1991]: 2026-03-06T02:58:37.528635Z INFO Daemon Daemon Using waagent for provisioning Mar 6 02:58:37.532482 waagent[1991]: 2026-03-06T02:58:37.532447Z INFO Daemon Daemon Activate resource disk Mar 6 02:58:37.535749 waagent[1991]: 2026-03-06T02:58:37.535720Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 6 02:58:37.543577 waagent[1991]: 2026-03-06T02:58:37.543541Z INFO Daemon Daemon Found device: None Mar 6 02:58:37.546782 waagent[1991]: 2026-03-06T02:58:37.546752Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 6 02:58:37.552694 waagent[1991]: 2026-03-06T02:58:37.552667Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 6 02:58:37.560924 waagent[1991]: 2026-03-06T02:58:37.560888Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 6 02:58:37.564998 waagent[1991]: 2026-03-06T02:58:37.564969Z INFO Daemon Daemon Running default provisioning handler Mar 6 02:58:37.573790 waagent[1991]: 2026-03-06T02:58:37.573750Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Mar 6 02:58:37.583970 waagent[1991]: 2026-03-06T02:58:37.583934Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 6 02:58:37.591034 waagent[1991]: 2026-03-06T02:58:37.591005Z INFO Daemon Daemon cloud-init is enabled: False Mar 6 02:58:37.594782 waagent[1991]: 2026-03-06T02:58:37.594759Z INFO Daemon Daemon Copying ovf-env.xml Mar 6 02:58:37.678263 waagent[1991]: 2026-03-06T02:58:37.678208Z INFO Daemon Daemon Successfully mounted dvd Mar 6 02:58:37.721773 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 6 02:58:37.722444 waagent[1991]: 2026-03-06T02:58:37.721980Z INFO Daemon Daemon Detect protocol endpoint Mar 6 02:58:37.725557 waagent[1991]: 2026-03-06T02:58:37.725520Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 6 02:58:37.729641 waagent[1991]: 2026-03-06T02:58:37.729613Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Mar 6 02:58:37.734670 waagent[1991]: 2026-03-06T02:58:37.734645Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 6 02:58:37.738507 waagent[1991]: 2026-03-06T02:58:37.738480Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 6 02:58:37.742123 waagent[1991]: 2026-03-06T02:58:37.742099Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 6 02:58:37.818438 waagent[1991]: 2026-03-06T02:58:37.818350Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 6 02:58:37.823158 waagent[1991]: 2026-03-06T02:58:37.823140Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 6 02:58:37.827055 waagent[1991]: 2026-03-06T02:58:37.827031Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 6 02:58:37.993259 waagent[1991]: 2026-03-06T02:58:37.993192Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 6 02:58:37.997976 waagent[1991]: 2026-03-06T02:58:37.997945Z INFO Daemon Daemon Forcing an update of the goal state. 
Mar 6 02:58:38.005276 waagent[1991]: 2026-03-06T02:58:38.005242Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 6 02:58:38.036632 waagent[1991]: 2026-03-06T02:58:38.036598Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.179 Mar 6 02:58:38.040819 waagent[1991]: 2026-03-06T02:58:38.040786Z INFO Daemon Mar 6 02:58:38.042834 waagent[1991]: 2026-03-06T02:58:38.042806Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: d0a1313e-3be8-4a3d-a5d0-0d2acb4039ca eTag: 2358388105988788430 source: Fabric] Mar 6 02:58:38.050825 waagent[1991]: 2026-03-06T02:58:38.050794Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Mar 6 02:58:38.055447 waagent[1991]: 2026-03-06T02:58:38.055418Z INFO Daemon Mar 6 02:58:38.057470 waagent[1991]: 2026-03-06T02:58:38.057445Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 6 02:58:38.065593 waagent[1991]: 2026-03-06T02:58:38.065564Z INFO Daemon Daemon Downloading artifacts profile blob Mar 6 02:58:38.172500 waagent[1991]: 2026-03-06T02:58:38.172413Z INFO Daemon Downloaded certificate {'thumbprint': 'DDF5D85D43AC99E09E87E9A00D0573E29D4DC7BE', 'hasPrivateKey': True} Mar 6 02:58:38.179752 waagent[1991]: 2026-03-06T02:58:38.179718Z INFO Daemon Fetch goal state completed Mar 6 02:58:38.212518 waagent[1991]: 2026-03-06T02:58:38.212487Z INFO Daemon Daemon Starting provisioning Mar 6 02:58:38.216298 waagent[1991]: 2026-03-06T02:58:38.216265Z INFO Daemon Daemon Handle ovf-env.xml. 
Mar 6 02:58:38.219733 waagent[1991]: 2026-03-06T02:58:38.219704Z INFO Daemon Daemon Set hostname [ci-4459.2.3-n-38e0d2a52a] Mar 6 02:58:38.225415 waagent[1991]: 2026-03-06T02:58:38.225376Z INFO Daemon Daemon Publish hostname [ci-4459.2.3-n-38e0d2a52a] Mar 6 02:58:38.229864 waagent[1991]: 2026-03-06T02:58:38.229831Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 6 02:58:38.234334 waagent[1991]: 2026-03-06T02:58:38.234303Z INFO Daemon Daemon Primary interface is [eth0] Mar 6 02:58:38.243443 systemd-networkd[1498]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:58:38.243450 systemd-networkd[1498]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 02:58:38.243475 systemd-networkd[1498]: eth0: DHCP lease lost Mar 6 02:58:38.244526 waagent[1991]: 2026-03-06T02:58:38.244481Z INFO Daemon Daemon Create user account if not exists Mar 6 02:58:38.248727 waagent[1991]: 2026-03-06T02:58:38.248696Z INFO Daemon Daemon User core already exists, skip useradd Mar 6 02:58:38.252939 waagent[1991]: 2026-03-06T02:58:38.252914Z INFO Daemon Daemon Configure sudoer Mar 6 02:58:38.260120 waagent[1991]: 2026-03-06T02:58:38.260080Z INFO Daemon Daemon Configure sshd Mar 6 02:58:38.266357 waagent[1991]: 2026-03-06T02:58:38.266318Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 6 02:58:38.275358 waagent[1991]: 2026-03-06T02:58:38.275280Z INFO Daemon Daemon Deploy ssh public key. 
Mar 6 02:58:38.285233 systemd-networkd[1498]: eth0: DHCPv4 address 10.200.20.33/24, gateway 10.200.20.1 acquired from 168.63.129.16 Mar 6 02:58:39.385705 waagent[1991]: 2026-03-06T02:58:39.385660Z INFO Daemon Daemon Provisioning complete Mar 6 02:58:39.399284 waagent[1991]: 2026-03-06T02:58:39.399247Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 6 02:58:39.403882 waagent[1991]: 2026-03-06T02:58:39.403851Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Mar 6 02:58:39.411039 waagent[1991]: 2026-03-06T02:58:39.411008Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Mar 6 02:58:39.510216 waagent[2111]: 2026-03-06T02:58:39.509294Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Mar 6 02:58:39.510216 waagent[2111]: 2026-03-06T02:58:39.509428Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.3 Mar 6 02:58:39.510216 waagent[2111]: 2026-03-06T02:58:39.509464Z INFO ExtHandler ExtHandler Python: 3.11.13 Mar 6 02:58:39.510216 waagent[2111]: 2026-03-06T02:58:39.509499Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Mar 6 02:58:39.581316 waagent[2111]: 2026-03-06T02:58:39.581251Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.3; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Mar 6 02:58:39.581614 waagent[2111]: 2026-03-06T02:58:39.581584Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:58:39.581747 waagent[2111]: 2026-03-06T02:58:39.581721Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:58:39.587872 waagent[2111]: 2026-03-06T02:58:39.587826Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 6 02:58:39.592689 waagent[2111]: 2026-03-06T02:58:39.592657Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.179 Mar 6 02:58:39.593119 
waagent[2111]: 2026-03-06T02:58:39.593087Z INFO ExtHandler Mar 6 02:58:39.593307 waagent[2111]: 2026-03-06T02:58:39.593280Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 3c95e6fc-ccb1-4289-8415-2a350c6896b1 eTag: 2358388105988788430 source: Fabric] Mar 6 02:58:39.593610 waagent[2111]: 2026-03-06T02:58:39.593579Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 6 02:58:39.594097 waagent[2111]: 2026-03-06T02:58:39.594066Z INFO ExtHandler Mar 6 02:58:39.594214 waagent[2111]: 2026-03-06T02:58:39.594194Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 6 02:58:39.597468 waagent[2111]: 2026-03-06T02:58:39.597442Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 6 02:58:39.648647 waagent[2111]: 2026-03-06T02:58:39.648557Z INFO ExtHandler Downloaded certificate {'thumbprint': 'DDF5D85D43AC99E09E87E9A00D0573E29D4DC7BE', 'hasPrivateKey': True} Mar 6 02:58:39.649140 waagent[2111]: 2026-03-06T02:58:39.649108Z INFO ExtHandler Fetch goal state completed Mar 6 02:58:39.660840 waagent[2111]: 2026-03-06T02:58:39.660808Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.4 27 Jan 2026 (Library: OpenSSL 3.4.4 27 Jan 2026) Mar 6 02:58:39.664162 waagent[2111]: 2026-03-06T02:58:39.664120Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2111 Mar 6 02:58:39.664372 waagent[2111]: 2026-03-06T02:58:39.664341Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 6 02:58:39.664688 waagent[2111]: 2026-03-06T02:58:39.664658Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Mar 6 02:58:39.666204 waagent[2111]: 2026-03-06T02:58:39.665834Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] Mar 6 02:58:39.666204 waagent[2111]: 2026-03-06T02:58:39.666139Z INFO 
ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.3', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 6 02:58:39.666320 waagent[2111]: 2026-03-06T02:58:39.666287Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 6 02:58:39.666727 waagent[2111]: 2026-03-06T02:58:39.666694Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 6 02:58:39.968257 waagent[2111]: 2026-03-06T02:58:39.968218Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 6 02:58:39.968429 waagent[2111]: 2026-03-06T02:58:39.968402Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 6 02:58:39.972850 waagent[2111]: 2026-03-06T02:58:39.972811Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 6 02:58:39.977127 systemd[1]: Reload requested from client PID 2127 ('systemctl') (unit waagent.service)... Mar 6 02:58:39.977142 systemd[1]: Reloading... Mar 6 02:58:40.044206 zram_generator::config[2166]: No configuration found. Mar 6 02:58:40.188143 systemd[1]: Reloading finished in 210 ms. Mar 6 02:58:40.201201 waagent[2111]: 2026-03-06T02:58:40.200547Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 6 02:58:40.201201 waagent[2111]: 2026-03-06T02:58:40.200684Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 6 02:58:40.476100 waagent[2111]: 2026-03-06T02:58:40.475372Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 6 02:58:40.476100 waagent[2111]: 2026-03-06T02:58:40.475664Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. 
cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 6 02:58:40.476341 waagent[2111]: 2026-03-06T02:58:40.476297Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 6 02:58:40.476416 waagent[2111]: 2026-03-06T02:58:40.476378Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:58:40.476476 waagent[2111]: 2026-03-06T02:58:40.476455Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:58:40.476668 waagent[2111]: 2026-03-06T02:58:40.476638Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 6 02:58:40.477002 waagent[2111]: 2026-03-06T02:58:40.476965Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 6 02:58:40.477115 waagent[2111]: 2026-03-06T02:58:40.477079Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 6 02:58:40.477115 waagent[2111]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 6 02:58:40.477115 waagent[2111]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Mar 6 02:58:40.477115 waagent[2111]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 6 02:58:40.477115 waagent[2111]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:58:40.477115 waagent[2111]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:58:40.477115 waagent[2111]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 6 02:58:40.477391 waagent[2111]: 2026-03-06T02:58:40.477341Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 6 02:58:40.477409 waagent[2111]: 2026-03-06T02:58:40.477393Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 6 02:58:40.477522 waagent[2111]: 2026-03-06T02:58:40.477496Z INFO EnvHandler ExtHandler Configure routes Mar 6 02:58:40.477799 waagent[2111]: 2026-03-06T02:58:40.477753Z INFO SendTelemetryHandler ExtHandler Successfully started the 
SendTelemetryHandler thread Mar 6 02:58:40.477875 waagent[2111]: 2026-03-06T02:58:40.477838Z INFO EnvHandler ExtHandler Gateway:None Mar 6 02:58:40.477994 waagent[2111]: 2026-03-06T02:58:40.477892Z INFO EnvHandler ExtHandler Routes:None Mar 6 02:58:40.478144 waagent[2111]: 2026-03-06T02:58:40.478107Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Mar 6 02:58:40.478653 waagent[2111]: 2026-03-06T02:58:40.478622Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 6 02:58:40.478751 waagent[2111]: 2026-03-06T02:58:40.478713Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 6 02:58:40.478823 waagent[2111]: 2026-03-06T02:58:40.478800Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 6 02:58:40.484089 waagent[2111]: 2026-03-06T02:58:40.484058Z INFO ExtHandler ExtHandler Mar 6 02:58:40.484234 waagent[2111]: 2026-03-06T02:58:40.484206Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: e5b628a8-0eea-4cba-b114-1e5f998b9a76 correlation 93d23cbc-5fff-400d-b669-397056c04ada created: 2026-03-06T02:57:14.029655Z] Mar 6 02:58:40.484566 waagent[2111]: 2026-03-06T02:58:40.484536Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Mar 6 02:58:40.485046 waagent[2111]: 2026-03-06T02:58:40.485018Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Mar 6 02:58:40.515299 waagent[2111]: 2026-03-06T02:58:40.515258Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Mar 6 02:58:40.515299 waagent[2111]: Try `iptables -h' or 'iptables --help' for more information.) 
Mar 6 02:58:40.515579 waagent[2111]: 2026-03-06T02:58:40.515545Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: D70B1FB3-CDC2-4C1E-8AC2-AA19076DE041;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Mar 6 02:58:40.553259 waagent[2111]: 2026-03-06T02:58:40.553210Z INFO MonitorHandler ExtHandler Network interfaces: Mar 6 02:58:40.553259 waagent[2111]: Executing ['ip', '-a', '-o', 'link']: Mar 6 02:58:40.553259 waagent[2111]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 6 02:58:40.553259 waagent[2111]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:77:46:08 brd ff:ff:ff:ff:ff:ff Mar 6 02:58:40.553259 waagent[2111]: 3: enP15860s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:77:46:08 brd ff:ff:ff:ff:ff:ff\ altname enP15860p0s2 Mar 6 02:58:40.553259 waagent[2111]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 6 02:58:40.553259 waagent[2111]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 6 02:58:40.553259 waagent[2111]: 2: eth0 inet 10.200.20.33/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 6 02:58:40.553259 waagent[2111]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 6 02:58:40.553259 waagent[2111]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 6 02:58:40.553259 waagent[2111]: 2: eth0 inet6 fe80::222:48ff:fe77:4608/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 6 02:58:40.675846 waagent[2111]: 2026-03-06T02:58:40.675794Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 6 02:58:40.675846 waagent[2111]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 
02:58:40.675846 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.675846 waagent[2111]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:58:40.675846 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.675846 waagent[2111]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:58:40.675846 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.675846 waagent[2111]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 6 02:58:40.675846 waagent[2111]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 6 02:58:40.675846 waagent[2111]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 6 02:58:40.678092 waagent[2111]: 2026-03-06T02:58:40.678049Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 6 02:58:40.678092 waagent[2111]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:58:40.678092 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.678092 waagent[2111]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:58:40.678092 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.678092 waagent[2111]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 6 02:58:40.678092 waagent[2111]: pkts bytes target prot opt in out source destination Mar 6 02:58:40.678092 waagent[2111]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 6 02:58:40.678092 waagent[2111]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 6 02:58:40.678092 waagent[2111]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 6 02:58:40.678276 waagent[2111]: 2026-03-06T02:58:40.678252Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 6 02:58:45.673128 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Mar 6 02:58:45.674905 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:58:45.785013 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:58:45.787870 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:58:45.869458 kubelet[2261]: E0306 02:58:45.869421 2261 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:58:45.871882 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:58:45.871986 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:58:45.872432 systemd[1]: kubelet.service: Consumed 107ms CPU time, 107.5M memory peak. Mar 6 02:58:55.923984 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 6 02:58:55.925641 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:58:56.285246 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 6 02:58:56.287812 (kubelet)[2275]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:58:56.313185 kubelet[2275]: E0306 02:58:56.313149 2275 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:58:56.315053 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:58:56.315248 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:58:56.316262 systemd[1]: kubelet.service: Consumed 102ms CPU time, 105.3M memory peak. Mar 6 02:58:57.933078 chronyd[1845]: Selected source PHC0 Mar 6 02:59:00.190289 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 6 02:59:00.191158 systemd[1]: Started sshd@0-10.200.20.33:22-10.200.16.10:45254.service - OpenSSH per-connection server daemon (10.200.16.10:45254). Mar 6 02:59:00.589410 sshd[2283]: Accepted publickey for core from 10.200.16.10 port 45254 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:00.590338 sshd-session[2283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:00.594157 systemd-logind[1867]: New session 3 of user core. Mar 6 02:59:00.599279 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 6 02:59:00.861417 systemd[1]: Started sshd@1-10.200.20.33:22-10.200.16.10:45270.service - OpenSSH per-connection server daemon (10.200.16.10:45270). 
Mar 6 02:59:01.226973 sshd[2289]: Accepted publickey for core from 10.200.16.10 port 45270 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:01.227958 sshd-session[2289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:01.231327 systemd-logind[1867]: New session 4 of user core. Mar 6 02:59:01.242300 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 6 02:59:01.431215 sshd[2292]: Connection closed by 10.200.16.10 port 45270 Mar 6 02:59:01.431720 sshd-session[2289]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:01.434526 systemd[1]: sshd@1-10.200.20.33:22-10.200.16.10:45270.service: Deactivated successfully. Mar 6 02:59:01.435805 systemd[1]: session-4.scope: Deactivated successfully. Mar 6 02:59:01.436413 systemd-logind[1867]: Session 4 logged out. Waiting for processes to exit. Mar 6 02:59:01.437650 systemd-logind[1867]: Removed session 4. Mar 6 02:59:01.507352 systemd[1]: Started sshd@2-10.200.20.33:22-10.200.16.10:45278.service - OpenSSH per-connection server daemon (10.200.16.10:45278). Mar 6 02:59:01.873166 sshd[2298]: Accepted publickey for core from 10.200.16.10 port 45278 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:01.874021 sshd-session[2298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:01.877371 systemd-logind[1867]: New session 5 of user core. Mar 6 02:59:01.887297 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 6 02:59:02.074848 sshd[2301]: Connection closed by 10.200.16.10 port 45278 Mar 6 02:59:02.074760 sshd-session[2298]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:02.077943 systemd[1]: sshd@2-10.200.20.33:22-10.200.16.10:45278.service: Deactivated successfully. Mar 6 02:59:02.079385 systemd[1]: session-5.scope: Deactivated successfully. Mar 6 02:59:02.080000 systemd-logind[1867]: Session 5 logged out. Waiting for processes to exit. 
Mar 6 02:59:02.080957 systemd-logind[1867]: Removed session 5. Mar 6 02:59:02.148181 systemd[1]: Started sshd@3-10.200.20.33:22-10.200.16.10:45286.service - OpenSSH per-connection server daemon (10.200.16.10:45286). Mar 6 02:59:02.498682 sshd[2307]: Accepted publickey for core from 10.200.16.10 port 45286 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:02.499365 sshd-session[2307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:02.503139 systemd-logind[1867]: New session 6 of user core. Mar 6 02:59:02.510285 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 6 02:59:02.694254 sshd[2310]: Connection closed by 10.200.16.10 port 45286 Mar 6 02:59:02.694689 sshd-session[2307]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:02.697395 systemd[1]: sshd@3-10.200.20.33:22-10.200.16.10:45286.service: Deactivated successfully. Mar 6 02:59:02.698686 systemd[1]: session-6.scope: Deactivated successfully. Mar 6 02:59:02.699288 systemd-logind[1867]: Session 6 logged out. Waiting for processes to exit. Mar 6 02:59:02.700441 systemd-logind[1867]: Removed session 6. Mar 6 02:59:02.769471 systemd[1]: Started sshd@4-10.200.20.33:22-10.200.16.10:45294.service - OpenSSH per-connection server daemon (10.200.16.10:45294). Mar 6 02:59:03.127823 sshd[2316]: Accepted publickey for core from 10.200.16.10 port 45294 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 02:59:03.128899 sshd-session[2316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:03.132602 systemd-logind[1867]: New session 7 of user core. Mar 6 02:59:03.139297 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 6 02:59:03.288065 sudo[2320]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 6 02:59:03.288292 sudo[2320]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:59:03.302466 sudo[2320]: pam_unix(sudo:session): session closed for user root
Mar 6 02:59:03.366938 sshd[2319]: Connection closed by 10.200.16.10 port 45294
Mar 6 02:59:03.367521 sshd-session[2316]: pam_unix(sshd:session): session closed for user core
Mar 6 02:59:03.371034 systemd[1]: sshd@4-10.200.20.33:22-10.200.16.10:45294.service: Deactivated successfully.
Mar 6 02:59:03.372704 systemd[1]: session-7.scope: Deactivated successfully.
Mar 6 02:59:03.373533 systemd-logind[1867]: Session 7 logged out. Waiting for processes to exit.
Mar 6 02:59:03.374832 systemd-logind[1867]: Removed session 7.
Mar 6 02:59:03.447360 systemd[1]: Started sshd@5-10.200.20.33:22-10.200.16.10:45298.service - OpenSSH per-connection server daemon (10.200.16.10:45298).
Mar 6 02:59:03.814313 sshd[2326]: Accepted publickey for core from 10.200.16.10 port 45298 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:59:03.815413 sshd-session[2326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:59:03.818629 systemd-logind[1867]: New session 8 of user core.
Mar 6 02:59:03.825284 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 6 02:59:03.950528 sudo[2331]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 6 02:59:03.950721 sudo[2331]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:59:03.956700 sudo[2331]: pam_unix(sudo:session): session closed for user root
Mar 6 02:59:03.960066 sudo[2330]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 6 02:59:03.960498 sudo[2330]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:59:03.966571 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 6 02:59:03.996132 augenrules[2353]: No rules
Mar 6 02:59:03.997126 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 6 02:59:03.997542 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 6 02:59:03.998549 sudo[2330]: pam_unix(sudo:session): session closed for user root
Mar 6 02:59:04.065049 sshd[2329]: Connection closed by 10.200.16.10 port 45298
Mar 6 02:59:04.065589 sshd-session[2326]: pam_unix(sshd:session): session closed for user core
Mar 6 02:59:04.067931 systemd[1]: sshd@5-10.200.20.33:22-10.200.16.10:45298.service: Deactivated successfully.
Mar 6 02:59:04.069271 systemd[1]: session-8.scope: Deactivated successfully.
Mar 6 02:59:04.070469 systemd-logind[1867]: Session 8 logged out. Waiting for processes to exit.
Mar 6 02:59:04.071665 systemd-logind[1867]: Removed session 8.
Mar 6 02:59:04.141362 systemd[1]: Started sshd@6-10.200.20.33:22-10.200.16.10:45310.service - OpenSSH per-connection server daemon (10.200.16.10:45310).
Mar 6 02:59:04.506222 sshd[2362]: Accepted publickey for core from 10.200.16.10 port 45310 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 02:59:04.506985 sshd-session[2362]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 02:59:04.510832 systemd-logind[1867]: New session 9 of user core.
Mar 6 02:59:04.516302 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 6 02:59:04.642076 sudo[2366]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 6 02:59:04.642300 sudo[2366]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 02:59:06.423045 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 6 02:59:06.424321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:59:06.786500 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:59:06.796498 (kubelet)[2392]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:59:06.821271 kubelet[2392]: E0306 02:59:06.821221 2392 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:59:06.822842 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:59:06.822943 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:59:06.823344 systemd[1]: kubelet.service: Consumed 101ms CPU time, 108.9M memory peak.
Mar 6 02:59:07.106211 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 6 02:59:07.110443 (dockerd)[2399]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 6 02:59:07.339198 dockerd[2399]: time="2026-03-06T02:59:07.339028073Z" level=info msg="Starting up"
Mar 6 02:59:07.341598 dockerd[2399]: time="2026-03-06T02:59:07.341581536Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 6 02:59:07.349333 dockerd[2399]: time="2026-03-06T02:59:07.349252903Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 6 02:59:07.379043 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3028373602-merged.mount: Deactivated successfully.
Mar 6 02:59:07.498891 dockerd[2399]: time="2026-03-06T02:59:07.498736373Z" level=info msg="Loading containers: start."
Mar 6 02:59:07.517221 kernel: Initializing XFRM netlink socket
Mar 6 02:59:07.700626 systemd-networkd[1498]: docker0: Link UP
Mar 6 02:59:07.716776 dockerd[2399]: time="2026-03-06T02:59:07.716750163Z" level=info msg="Loading containers: done."
Mar 6 02:59:07.740679 dockerd[2399]: time="2026-03-06T02:59:07.740645748Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 6 02:59:07.740795 dockerd[2399]: time="2026-03-06T02:59:07.740704950Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 6 02:59:07.740795 dockerd[2399]: time="2026-03-06T02:59:07.740772296Z" level=info msg="Initializing buildkit"
Mar 6 02:59:07.788883 dockerd[2399]: time="2026-03-06T02:59:07.788849836Z" level=info msg="Completed buildkit initialization"
Mar 6 02:59:07.793786 dockerd[2399]: time="2026-03-06T02:59:07.793752427Z" level=info msg="Daemon has completed initialization"
Mar 6 02:59:07.794000 dockerd[2399]: time="2026-03-06T02:59:07.793851455Z" level=info msg="API listen on /run/docker.sock"
Mar 6 02:59:07.794118 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 6 02:59:08.071906 containerd[1884]: time="2026-03-06T02:59:08.071713994Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\""
Mar 6 02:59:08.377623 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4007656264-merged.mount: Deactivated successfully.
Mar 6 02:59:08.930669 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1958407470.mount: Deactivated successfully.
Mar 6 02:59:10.415497 containerd[1884]: time="2026-03-06T02:59:10.415444203Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:10.418648 containerd[1884]: time="2026-03-06T02:59:10.418620839Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=24701796"
Mar 6 02:59:10.421181 containerd[1884]: time="2026-03-06T02:59:10.421148974Z" level=info msg="ImageCreate event name:\"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:10.425531 containerd[1884]: time="2026-03-06T02:59:10.425505434Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:10.426593 containerd[1884]: time="2026-03-06T02:59:10.426566751Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"24698395\" in 2.354819364s"
Mar 6 02:59:10.426612 containerd[1884]: time="2026-03-06T02:59:10.426602688Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\""
Mar 6 02:59:10.427340 containerd[1884]: time="2026-03-06T02:59:10.427008878Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\""
Mar 6 02:59:11.744204 containerd[1884]: time="2026-03-06T02:59:11.743724591Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:11.747008 containerd[1884]: time="2026-03-06T02:59:11.746988111Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=19063039"
Mar 6 02:59:11.750156 containerd[1884]: time="2026-03-06T02:59:11.750140130Z" level=info msg="ImageCreate event name:\"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:11.756984 containerd[1884]: time="2026-03-06T02:59:11.756964300Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:11.757533 containerd[1884]: time="2026-03-06T02:59:11.757394106Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"20675140\" in 1.330359227s"
Mar 6 02:59:11.757533 containerd[1884]: time="2026-03-06T02:59:11.757423595Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\""
Mar 6 02:59:11.757792 containerd[1884]: time="2026-03-06T02:59:11.757771311Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\""
Mar 6 02:59:13.091987 containerd[1884]: time="2026-03-06T02:59:13.091930719Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:13.094539 containerd[1884]: time="2026-03-06T02:59:13.094515397Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=13797901"
Mar 6 02:59:13.097643 containerd[1884]: time="2026-03-06T02:59:13.097621018Z" level=info msg="ImageCreate event name:\"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:13.102702 containerd[1884]: time="2026-03-06T02:59:13.102216200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:13.102702 containerd[1884]: time="2026-03-06T02:59:13.102602282Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"15410020\" in 1.344805746s"
Mar 6 02:59:13.102702 containerd[1884]: time="2026-03-06T02:59:13.102627611Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\""
Mar 6 02:59:13.103338 containerd[1884]: time="2026-03-06T02:59:13.103315302Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\""
Mar 6 02:59:13.135923 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Mar 6 02:59:14.253902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2540002140.mount: Deactivated successfully.
Mar 6 02:59:14.436848 containerd[1884]: time="2026-03-06T02:59:14.436397269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:14.439483 containerd[1884]: time="2026-03-06T02:59:14.439446686Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=22329583"
Mar 6 02:59:14.443946 containerd[1884]: time="2026-03-06T02:59:14.443918615Z" level=info msg="ImageCreate event name:\"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:14.447537 containerd[1884]: time="2026-03-06T02:59:14.447494835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:14.448074 containerd[1884]: time="2026-03-06T02:59:14.447780777Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"22328602\" in 1.344439291s"
Mar 6 02:59:14.448074 containerd[1884]: time="2026-03-06T02:59:14.447807315Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\""
Mar 6 02:59:14.448301 containerd[1884]: time="2026-03-06T02:59:14.448266258Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Mar 6 02:59:14.993701 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3581358505.mount: Deactivated successfully.
Mar 6 02:59:16.275047 containerd[1884]: time="2026-03-06T02:59:16.274987591Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:16.278038 containerd[1884]: time="2026-03-06T02:59:16.278013455Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172211"
Mar 6 02:59:16.285199 containerd[1884]: time="2026-03-06T02:59:16.284570825Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:16.288983 containerd[1884]: time="2026-03-06T02:59:16.288944469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:16.289855 containerd[1884]: time="2026-03-06T02:59:16.289538331Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.84124624s"
Mar 6 02:59:16.289855 containerd[1884]: time="2026-03-06T02:59:16.289565692Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
Mar 6 02:59:16.289980 containerd[1884]: time="2026-03-06T02:59:16.289957440Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 6 02:59:16.826887 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 6 02:59:16.828315 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:59:16.832382 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1046680358.mount: Deactivated successfully.
Mar 6 02:59:16.860182 containerd[1884]: time="2026-03-06T02:59:16.860074571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:16.921016 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:59:16.923643 (kubelet)[2749]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 02:59:17.025190 kubelet[2749]: E0306 02:59:17.025111 2749 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 02:59:17.027165 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 02:59:17.027389 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 02:59:17.027885 systemd[1]: kubelet.service: Consumed 104ms CPU time, 105M memory peak.
Mar 6 02:59:17.250783 containerd[1884]: time="2026-03-06T02:59:17.250734914Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
Mar 6 02:59:17.364211 containerd[1884]: time="2026-03-06T02:59:17.363782937Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:17.367781 containerd[1884]: time="2026-03-06T02:59:17.367754625Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:17.368318 containerd[1884]: time="2026-03-06T02:59:17.368071033Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 1.078089143s"
Mar 6 02:59:17.368318 containerd[1884]: time="2026-03-06T02:59:17.368102338Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 6 02:59:17.368664 containerd[1884]: time="2026-03-06T02:59:17.368644318Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Mar 6 02:59:18.077213 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2450955984.mount: Deactivated successfully.
Mar 6 02:59:19.275026 update_engine[1871]: I20260306 02:59:19.274950 1871 update_attempter.cc:509] Updating boot flags...
Mar 6 02:59:20.124734 containerd[1884]: time="2026-03-06T02:59:20.124685253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:20.127700 containerd[1884]: time="2026-03-06T02:59:20.127550324Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21738165"
Mar 6 02:59:20.131093 containerd[1884]: time="2026-03-06T02:59:20.131068136Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:20.135384 containerd[1884]: time="2026-03-06T02:59:20.135360276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:20.136746 containerd[1884]: time="2026-03-06T02:59:20.136712597Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 2.768044446s"
Mar 6 02:59:20.139202 containerd[1884]: time="2026-03-06T02:59:20.136819352Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\""
Mar 6 02:59:21.380043 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:59:21.380154 systemd[1]: kubelet.service: Consumed 104ms CPU time, 105M memory peak.
Mar 6 02:59:21.385250 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:59:21.402649 systemd[1]: Reload requested from client PID 2958 ('systemctl') (unit session-9.scope)...
Mar 6 02:59:21.402661 systemd[1]: Reloading...
Mar 6 02:59:21.497203 zram_generator::config[3011]: No configuration found.
Mar 6 02:59:21.638738 systemd[1]: Reloading finished in 235 ms.
Mar 6 02:59:21.707716 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 6 02:59:21.707769 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 6 02:59:21.708611 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:59:21.708649 systemd[1]: kubelet.service: Consumed 72ms CPU time, 95.2M memory peak.
Mar 6 02:59:21.709683 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 02:59:21.936710 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 02:59:21.942527 (kubelet)[3072]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 02:59:22.062411 kubelet[3072]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 02:59:22.273632 kubelet[3072]: I0306 02:59:22.273443 3072 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 6 02:59:22.273632 kubelet[3072]: I0306 02:59:22.273480 3072 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 02:59:22.274537 kubelet[3072]: I0306 02:59:22.274522 3072 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 02:59:22.275194 kubelet[3072]: I0306 02:59:22.274602 3072 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 02:59:22.275194 kubelet[3072]: I0306 02:59:22.274796 3072 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 6 02:59:22.462823 kubelet[3072]: E0306 02:59:22.462782 3072 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.33:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.33:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 6 02:59:22.463238 kubelet[3072]: I0306 02:59:22.463120 3072 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 02:59:22.466726 kubelet[3072]: I0306 02:59:22.466657 3072 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 6 02:59:22.469530 kubelet[3072]: I0306 02:59:22.469319 3072 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 02:59:22.470076 kubelet[3072]: I0306 02:59:22.470046 3072 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 02:59:22.470287 kubelet[3072]: I0306 02:59:22.470136 3072 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.3-n-38e0d2a52a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 02:59:22.470660 kubelet[3072]: I0306 02:59:22.470408 3072 topology_manager.go:143] "Creating topology manager with none policy"
Mar 6 02:59:22.470660 kubelet[3072]: I0306 02:59:22.470422 3072 container_manager_linux.go:308] "Creating device plugin manager"
Mar 6 02:59:22.470660 kubelet[3072]: I0306 02:59:22.470501 3072 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 02:59:22.475217 kubelet[3072]: I0306 02:59:22.475200 3072 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 6 02:59:22.475412 kubelet[3072]: I0306 02:59:22.475400 3072 kubelet.go:482] "Attempting to sync node with API server"
Mar 6 02:59:22.475475 kubelet[3072]: I0306 02:59:22.475468 3072 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 02:59:22.475518 kubelet[3072]: I0306 02:59:22.475513 3072 kubelet.go:394] "Adding apiserver pod source"
Mar 6 02:59:22.475560 kubelet[3072]: I0306 02:59:22.475554 3072 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 02:59:22.479231 kubelet[3072]: I0306 02:59:22.478443 3072 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 6 02:59:22.479231 kubelet[3072]: I0306 02:59:22.479024 3072 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 02:59:22.479231 kubelet[3072]: I0306 02:59:22.479044 3072 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 02:59:22.479231 kubelet[3072]: W0306 02:59:22.479071 3072 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 6 02:59:22.480910 kubelet[3072]: I0306 02:59:22.480898 3072 server.go:1257] "Started kubelet"
Mar 6 02:59:22.482165 kubelet[3072]: I0306 02:59:22.482147 3072 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 6 02:59:22.484941 kubelet[3072]: E0306 02:59:22.484125 3072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.33:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.33:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.3-n-38e0d2a52a.189a214017935683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.3-n-38e0d2a52a,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.3-n-38e0d2a52a,},FirstTimestamp:2026-03-06 02:59:22.480866947 +0000 UTC m=+0.535881394,LastTimestamp:2026-03-06 02:59:22.480866947 +0000 UTC m=+0.535881394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.3-n-38e0d2a52a,}"
Mar 6 02:59:22.485042 kubelet[3072]: I0306 02:59:22.484949 3072 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 02:59:22.485754 kubelet[3072]: I0306 02:59:22.485727 3072 server.go:317] "Adding debug handlers to kubelet server"
Mar 6 02:59:22.487709 kubelet[3072]: I0306 02:59:22.487666 3072 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 02:59:22.487766 kubelet[3072]: I0306 02:59:22.487712 3072 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 02:59:22.487854 kubelet[3072]: I0306 02:59:22.487838 3072 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 02:59:22.488010 kubelet[3072]: I0306 02:59:22.487990 3072 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 02:59:22.489189 kubelet[3072]: I0306 02:59:22.489163 3072 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 6 02:59:22.489576 kubelet[3072]: E0306 02:59:22.489385 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found"
Mar 6 02:59:22.490133 kubelet[3072]: E0306 02:59:22.490107 3072 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-38e0d2a52a?timeout=10s\": dial tcp 10.200.20.33:6443: connect: connection refused" interval="200ms"
Mar 6 02:59:22.490316 kubelet[3072]: I0306 02:59:22.490298 3072 factory.go:223] Registration of the systemd container factory successfully
Mar 6 02:59:22.490375 kubelet[3072]: I0306 02:59:22.490363 3072 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 02:59:22.490669 kubelet[3072]: I0306 02:59:22.490651 3072 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 02:59:22.490741 kubelet[3072]: I0306 02:59:22.490731 3072 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 02:59:22.492437 kubelet[3072]: I0306 02:59:22.492423 3072 factory.go:223] Registration of the containerd container factory successfully
Mar 6 02:59:22.500576 kubelet[3072]: E0306 02:59:22.500418 3072 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 6 02:59:22.501562 kubelet[3072]: I0306 02:59:22.501543 3072 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 02:59:22.502821 kubelet[3072]: I0306 02:59:22.502641 3072 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 02:59:22.502821 kubelet[3072]: I0306 02:59:22.502665 3072 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 6 02:59:22.502821 kubelet[3072]: I0306 02:59:22.502684 3072 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 6 02:59:22.502821 kubelet[3072]: E0306 02:59:22.502713 3072 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 02:59:22.506317 kubelet[3072]: I0306 02:59:22.506305 3072 cpu_manager.go:225] "Starting" policy="none"
Mar 6 02:59:22.506388 kubelet[3072]: I0306 02:59:22.506380 3072 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 6 02:59:22.506452 kubelet[3072]: I0306 02:59:22.506444 3072 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 6 02:59:22.511334 kubelet[3072]: I0306 02:59:22.511320 3072 policy_none.go:50] "Start"
Mar 6 02:59:22.511411 kubelet[3072]: I0306 02:59:22.511403 3072 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 02:59:22.511446 kubelet[3072]: I0306 02:59:22.511439 3072 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 02:59:22.516189 kubelet[3072]: I0306 02:59:22.515998 3072 policy_none.go:44] "Start"
Mar 6 02:59:22.519930 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 6 02:59:22.527270 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 6 02:59:22.531034 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 6 02:59:22.538813 kubelet[3072]: E0306 02:59:22.538794 3072 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 6 02:59:22.538949 kubelet[3072]: I0306 02:59:22.538928 3072 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 6 02:59:22.538976 kubelet[3072]: I0306 02:59:22.538941 3072 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 6 02:59:22.539809 kubelet[3072]: I0306 02:59:22.539756 3072 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 6 02:59:22.540368 kubelet[3072]: E0306 02:59:22.540329 3072 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 6 02:59:22.540368 kubelet[3072]: E0306 02:59:22.540354 3072 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:22.615563 systemd[1]: Created slice kubepods-burstable-pod7e4214a24a30fe1c344021d350b4e3de.slice - libcontainer container kubepods-burstable-pod7e4214a24a30fe1c344021d350b4e3de.slice. Mar 6 02:59:22.624272 kubelet[3072]: E0306 02:59:22.624233 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.628080 systemd[1]: Created slice kubepods-burstable-pod0bcf97f5e6a483412ccd694281cb246e.slice - libcontainer container kubepods-burstable-pod0bcf97f5e6a483412ccd694281cb246e.slice. 
Mar 6 02:59:22.629668 kubelet[3072]: E0306 02:59:22.629647 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.631677 systemd[1]: Created slice kubepods-burstable-pode932240c0d9fe699741ef0217cb09ffc.slice - libcontainer container kubepods-burstable-pode932240c0d9fe699741ef0217cb09ffc.slice. Mar 6 02:59:22.632932 kubelet[3072]: E0306 02:59:22.632906 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.640037 kubelet[3072]: I0306 02:59:22.640015 3072 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.640280 kubelet[3072]: E0306 02:59:22.640255 3072 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.33:6443/api/v1/nodes\": dial tcp 10.200.20.33:6443: connect: connection refused" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.690891 kubelet[3072]: E0306 02:59:22.690864 3072 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-38e0d2a52a?timeout=10s\": dial tcp 10.200.20.33:6443: connect: connection refused" interval="400ms" Mar 6 02:59:22.691908 kubelet[3072]: I0306 02:59:22.691891 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.691953 kubelet[3072]: I0306 02:59:22.691910 3072 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.691953 kubelet[3072]: I0306 02:59:22.691924 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.691953 kubelet[3072]: I0306 02:59:22.691935 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.691953 kubelet[3072]: I0306 02:59:22.691944 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e932240c0d9fe699741ef0217cb09ffc-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-38e0d2a52a\" (UID: \"e932240c0d9fe699741ef0217cb09ffc\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.691953 kubelet[3072]: I0306 02:59:22.691954 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " 
pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.692036 kubelet[3072]: I0306 02:59:22.691964 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.692036 kubelet[3072]: I0306 02:59:22.691973 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.692036 kubelet[3072]: I0306 02:59:22.691981 3072 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.842460 kubelet[3072]: I0306 02:59:22.842361 3072 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.842983 kubelet[3072]: E0306 02:59:22.842954 3072 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.33:6443/api/v1/nodes\": dial tcp 10.200.20.33:6443: connect: connection refused" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:22.931908 containerd[1884]: time="2026-03-06T02:59:22.931871272Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-38e0d2a52a,Uid:7e4214a24a30fe1c344021d350b4e3de,Namespace:kube-system,Attempt:0,}" Mar 6 02:59:22.938655 containerd[1884]: time="2026-03-06T02:59:22.938526662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-38e0d2a52a,Uid:0bcf97f5e6a483412ccd694281cb246e,Namespace:kube-system,Attempt:0,}" Mar 6 02:59:22.943580 containerd[1884]: time="2026-03-06T02:59:22.943550968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-38e0d2a52a,Uid:e932240c0d9fe699741ef0217cb09ffc,Namespace:kube-system,Attempt:0,}" Mar 6 02:59:23.092190 kubelet[3072]: E0306 02:59:23.092140 3072 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-38e0d2a52a?timeout=10s\": dial tcp 10.200.20.33:6443: connect: connection refused" interval="800ms" Mar 6 02:59:23.244943 kubelet[3072]: I0306 02:59:23.244909 3072 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:23.245221 kubelet[3072]: E0306 02:59:23.245193 3072 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.200.20.33:6443/api/v1/nodes\": dial tcp 10.200.20.33:6443: connect: connection refused" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:23.561923 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1621394911.mount: Deactivated successfully. 
Mar 6 02:59:23.586424 containerd[1884]: time="2026-03-06T02:59:23.586383188Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 02:59:23.599441 containerd[1884]: time="2026-03-06T02:59:23.599406359Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Mar 6 02:59:23.602555 containerd[1884]: time="2026-03-06T02:59:23.602529188Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 02:59:23.606191 containerd[1884]: time="2026-03-06T02:59:23.605833550Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 02:59:23.611673 containerd[1884]: time="2026-03-06T02:59:23.611653289Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 6 02:59:23.614970 containerd[1884]: time="2026-03-06T02:59:23.614949939Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 02:59:23.618057 containerd[1884]: time="2026-03-06T02:59:23.618026975Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 02:59:23.618502 containerd[1884]: time="2026-03-06T02:59:23.618479709Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id 
\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 679.931615ms" Mar 6 02:59:23.621263 containerd[1884]: time="2026-03-06T02:59:23.621112058Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 6 02:59:23.625271 containerd[1884]: time="2026-03-06T02:59:23.625248631Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 673.217383ms" Mar 6 02:59:23.646877 containerd[1884]: time="2026-03-06T02:59:23.646843062Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 700.908866ms" Mar 6 02:59:23.674192 containerd[1884]: time="2026-03-06T02:59:23.673792826Z" level=info msg="connecting to shim ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132" address="unix:///run/containerd/s/4b0ad71d92fbe1af5b3df1a57ed001d9d9ad0fe9ef507dde6580a1fc47a105be" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:59:23.686542 containerd[1884]: time="2026-03-06T02:59:23.686519604Z" level=info msg="connecting to shim dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9" address="unix:///run/containerd/s/1c9cec5e53b30a73f70cb55e5eaa7059dee3959e672789e063590a7e3fb0d783" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:59:23.700314 systemd[1]: Started 
cri-containerd-ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132.scope - libcontainer container ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132. Mar 6 02:59:23.703666 systemd[1]: Started cri-containerd-dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9.scope - libcontainer container dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9. Mar 6 02:59:23.772055 containerd[1884]: time="2026-03-06T02:59:23.772024822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.3-n-38e0d2a52a,Uid:7e4214a24a30fe1c344021d350b4e3de,Namespace:kube-system,Attempt:0,} returns sandbox id \"ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132\"" Mar 6 02:59:23.773832 containerd[1884]: time="2026-03-06T02:59:23.773405690Z" level=info msg="connecting to shim 41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1" address="unix:///run/containerd/s/097f43094f6f43dd1fadbda367f3d9ecee6eda3153e86e86531deb4e88c20da3" namespace=k8s.io protocol=ttrpc version=3 Mar 6 02:59:23.775370 containerd[1884]: time="2026-03-06T02:59:23.775351681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.3-n-38e0d2a52a,Uid:e932240c0d9fe699741ef0217cb09ffc,Namespace:kube-system,Attempt:0,} returns sandbox id \"dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9\"" Mar 6 02:59:23.781994 containerd[1884]: time="2026-03-06T02:59:23.781973606Z" level=info msg="CreateContainer within sandbox \"ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 6 02:59:23.788930 containerd[1884]: time="2026-03-06T02:59:23.788538769Z" level=info msg="CreateContainer within sandbox \"dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 6 02:59:23.796316 systemd[1]: Started 
cri-containerd-41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1.scope - libcontainer container 41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1. Mar 6 02:59:23.806749 containerd[1884]: time="2026-03-06T02:59:23.806716387Z" level=info msg="Container 795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:59:23.814295 containerd[1884]: time="2026-03-06T02:59:23.814191587Z" level=info msg="Container 448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:59:23.829368 containerd[1884]: time="2026-03-06T02:59:23.829338971Z" level=info msg="CreateContainer within sandbox \"ad9c75b34981204ee41dca12fa72d43de0e10b71404736c66133cbe38b093132\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb\"" Mar 6 02:59:23.829943 containerd[1884]: time="2026-03-06T02:59:23.829922070Z" level=info msg="StartContainer for \"795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb\"" Mar 6 02:59:23.831005 containerd[1884]: time="2026-03-06T02:59:23.830752297Z" level=info msg="connecting to shim 795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb" address="unix:///run/containerd/s/4b0ad71d92fbe1af5b3df1a57ed001d9d9ad0fe9ef507dde6580a1fc47a105be" protocol=ttrpc version=3 Mar 6 02:59:23.832789 kubelet[3072]: E0306 02:59:23.832700 3072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.33:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.33:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.3-n-38e0d2a52a.189a214017935683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.3-n-38e0d2a52a,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.3-n-38e0d2a52a,},FirstTimestamp:2026-03-06 02:59:22.480866947 +0000 UTC m=+0.535881394,LastTimestamp:2026-03-06 02:59:22.480866947 +0000 UTC m=+0.535881394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.3-n-38e0d2a52a,}" Mar 6 02:59:23.840796 containerd[1884]: time="2026-03-06T02:59:23.840766443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.3-n-38e0d2a52a,Uid:0bcf97f5e6a483412ccd694281cb246e,Namespace:kube-system,Attempt:0,} returns sandbox id \"41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1\"" Mar 6 02:59:23.842315 systemd[1]: Started cri-containerd-795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb.scope - libcontainer container 795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb. Mar 6 02:59:23.844519 containerd[1884]: time="2026-03-06T02:59:23.844474115Z" level=info msg="CreateContainer within sandbox \"dda0a028a8181d94faae861fe756e16c9d8bbe8ce4593bde62b9571e3d55d1a9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c\"" Mar 6 02:59:23.846082 containerd[1884]: time="2026-03-06T02:59:23.844772100Z" level=info msg="StartContainer for \"448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c\"" Mar 6 02:59:23.846862 containerd[1884]: time="2026-03-06T02:59:23.846840135Z" level=info msg="connecting to shim 448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c" address="unix:///run/containerd/s/1c9cec5e53b30a73f70cb55e5eaa7059dee3959e672789e063590a7e3fb0d783" protocol=ttrpc version=3 Mar 6 02:59:23.849197 containerd[1884]: time="2026-03-06T02:59:23.848382881Z" level=info msg="CreateContainer within sandbox \"41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 6 02:59:23.866419 systemd[1]: Started cri-containerd-448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c.scope - libcontainer container 448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c. Mar 6 02:59:23.873450 containerd[1884]: time="2026-03-06T02:59:23.872826164Z" level=info msg="Container 759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:59:23.892089 containerd[1884]: time="2026-03-06T02:59:23.892058647Z" level=info msg="StartContainer for \"795b4fe0b9982d9966a9a04084797807939ef49114a0dfe09721e1a56f7b5bcb\" returns successfully" Mar 6 02:59:23.892615 kubelet[3072]: E0306 02:59:23.892581 3072 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.3-n-38e0d2a52a?timeout=10s\": dial tcp 10.200.20.33:6443: connect: connection refused" interval="1.6s" Mar 6 02:59:23.900671 containerd[1884]: time="2026-03-06T02:59:23.900537480Z" level=info msg="CreateContainer within sandbox \"41d59180794cb0ce516b720aea6844a64cb1fa6c4206883ae5a8ea90c34268f1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06\"" Mar 6 02:59:23.906225 containerd[1884]: time="2026-03-06T02:59:23.905562394Z" level=info msg="StartContainer for \"759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06\"" Mar 6 02:59:23.906696 containerd[1884]: time="2026-03-06T02:59:23.906677750Z" level=info msg="connecting to shim 759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06" address="unix:///run/containerd/s/097f43094f6f43dd1fadbda367f3d9ecee6eda3153e86e86531deb4e88c20da3" protocol=ttrpc version=3 Mar 6 02:59:23.922998 containerd[1884]: time="2026-03-06T02:59:23.922978771Z" level=info msg="StartContainer for 
\"448fe2f669811dc5f81ee9b6aa598f15715c9609d5cf1417fb03744bce26433c\" returns successfully" Mar 6 02:59:23.927316 systemd[1]: Started cri-containerd-759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06.scope - libcontainer container 759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06. Mar 6 02:59:23.973829 containerd[1884]: time="2026-03-06T02:59:23.973784039Z" level=info msg="StartContainer for \"759ab3ecfb138825e4f23e44719905233a7e098b521027ea3e9b2c1b11612a06\" returns successfully" Mar 6 02:59:24.050403 kubelet[3072]: I0306 02:59:24.050376 3072 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:24.514942 kubelet[3072]: E0306 02:59:24.514757 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:24.515694 kubelet[3072]: E0306 02:59:24.515633 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:24.517580 kubelet[3072]: E0306 02:59:24.517555 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:24.909043 kubelet[3072]: I0306 02:59:24.908848 3072 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:24.909043 kubelet[3072]: E0306 02:59:24.908877 3072 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4459.2.3-n-38e0d2a52a\": node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:24.999248 kubelet[3072]: E0306 02:59:24.999192 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.100225 
kubelet[3072]: E0306 02:59:25.100193 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.200720 kubelet[3072]: E0306 02:59:25.200693 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.301764 kubelet[3072]: E0306 02:59:25.301727 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.402216 kubelet[3072]: E0306 02:59:25.402182 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.502883 kubelet[3072]: E0306 02:59:25.502535 3072 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" Mar 6 02:59:25.519405 kubelet[3072]: E0306 02:59:25.519382 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.519705 kubelet[3072]: E0306 02:59:25.519676 3072 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.3-n-38e0d2a52a\" not found" node="ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.589979 kubelet[3072]: I0306 02:59:25.589952 3072 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.593459 kubelet[3072]: E0306 02:59:25.593431 3072 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.593459 kubelet[3072]: I0306 02:59:25.593453 3072 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.594635 kubelet[3072]: E0306 02:59:25.594598 3072 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.594635 kubelet[3072]: I0306 02:59:25.594631 3072 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:25.595942 kubelet[3072]: E0306 02:59:25.595920 3072 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-38e0d2a52a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:26.480149 kubelet[3072]: I0306 02:59:26.479936 3072 apiserver.go:52] "Watching apiserver" Mar 6 02:59:26.490947 kubelet[3072]: I0306 02:59:26.490913 3072 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 6 02:59:26.716277 kubelet[3072]: I0306 02:59:26.716251 3072 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:26.722409 kubelet[3072]: I0306 02:59:26.722359 3072 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 6 02:59:27.060284 kubelet[3072]: I0306 02:59:27.060256 3072 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" Mar 6 02:59:27.067735 kubelet[3072]: I0306 02:59:27.067586 3072 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 6 02:59:27.492922 systemd[1]: Reload 
requested from client PID 3352 ('systemctl') (unit session-9.scope)... Mar 6 02:59:27.492940 systemd[1]: Reloading... Mar 6 02:59:27.583203 zram_generator::config[3408]: No configuration found. Mar 6 02:59:27.728018 systemd[1]: Reloading finished in 234 ms. Mar 6 02:59:27.759420 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:59:27.774543 systemd[1]: kubelet.service: Deactivated successfully. Mar 6 02:59:27.774705 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:59:27.774738 systemd[1]: kubelet.service: Consumed 487ms CPU time, 120.5M memory peak. Mar 6 02:59:27.776437 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:59:27.870979 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:59:27.879608 (kubelet)[3463]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 6 02:59:27.906360 kubelet[3463]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 6 02:59:27.910891 kubelet[3463]: I0306 02:59:27.910847 3463 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 6 02:59:27.910891 kubelet[3463]: I0306 02:59:27.910885 3463 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 6 02:59:27.910975 kubelet[3463]: I0306 02:59:27.910902 3463 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 6 02:59:27.910975 kubelet[3463]: I0306 02:59:27.910907 3463 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 6 02:59:27.911087 kubelet[3463]: I0306 02:59:27.911071 3463 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 6 02:59:27.912045 kubelet[3463]: I0306 02:59:27.912025 3463 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 6 02:59:27.958514 kubelet[3463]: I0306 02:59:27.957289 3463 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 02:59:27.962697 kubelet[3463]: I0306 02:59:27.961823 3463 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 6 02:59:27.965482 kubelet[3463]: I0306 02:59:27.965442 3463 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 02:59:27.965624 kubelet[3463]: I0306 02:59:27.965600 3463 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 02:59:27.966748 kubelet[3463]: I0306 02:59:27.965619 3463 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.3-n-38e0d2a52a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 02:59:27.966748 kubelet[3463]: I0306 02:59:27.966622 3463 topology_manager.go:143] "Creating topology manager with none policy"
Mar 6 02:59:27.966748 kubelet[3463]: I0306 02:59:27.966632 3463 container_manager_linux.go:308] "Creating device plugin manager"
Mar 6 02:59:27.966748 kubelet[3463]: I0306 02:59:27.966653 3463 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 02:59:27.966904 kubelet[3463]: I0306 02:59:27.966791 3463 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 6 02:59:27.966921 kubelet[3463]: I0306 02:59:27.966906 3463 kubelet.go:482] "Attempting to sync node with API server"
Mar 6 02:59:27.966921 kubelet[3463]: I0306 02:59:27.966918 3463 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 02:59:27.966952 kubelet[3463]: I0306 02:59:27.966930 3463 kubelet.go:394] "Adding apiserver pod source"
Mar 6 02:59:27.966952 kubelet[3463]: I0306 02:59:27.966937 3463 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 02:59:27.969291 kubelet[3463]: I0306 02:59:27.969125 3463 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 6 02:59:27.971131 kubelet[3463]: I0306 02:59:27.970534 3463 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 02:59:27.971131 kubelet[3463]: I0306 02:59:27.970647 3463 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 02:59:27.974303 kubelet[3463]: I0306 02:59:27.974283 3463 server.go:1257] "Started kubelet"
Mar 6 02:59:27.978126 kubelet[3463]: I0306 02:59:27.978104 3463 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 6 02:59:27.989413 kubelet[3463]: I0306 02:59:27.988896 3463 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 02:59:27.990067 kubelet[3463]: I0306 02:59:27.989963 3463 server.go:317] "Adding debug handlers to kubelet server"
Mar 6 02:59:27.994939 kubelet[3463]: I0306 02:59:27.994908 3463 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 02:59:27.995121 kubelet[3463]: I0306 02:59:27.995107 3463 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 02:59:27.995495 kubelet[3463]: I0306 02:59:27.995481 3463 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 02:59:27.996536 kubelet[3463]: I0306 02:59:27.995891 3463 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 02:59:27.996597 kubelet[3463]: I0306 02:59:27.996568 3463 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 6 02:59:27.996700 kubelet[3463]: I0306 02:59:27.996619 3463 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 02:59:27.996700 kubelet[3463]: I0306 02:59:27.996698 3463 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 02:59:28.002235 kubelet[3463]: I0306 02:59:28.001373 3463 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 02:59:28.002297 kubelet[3463]: I0306 02:59:28.002284 3463 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 02:59:28.002316 kubelet[3463]: I0306 02:59:28.002298 3463 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 6 02:59:28.002316 kubelet[3463]: I0306 02:59:28.002312 3463 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 6 02:59:28.002353 kubelet[3463]: E0306 02:59:28.002340 3463 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 02:59:28.003915 kubelet[3463]: I0306 02:59:28.003888 3463 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 02:59:28.006676 kubelet[3463]: I0306 02:59:28.006657 3463 factory.go:223] Registration of the containerd container factory successfully
Mar 6 02:59:28.006792 kubelet[3463]: I0306 02:59:28.006781 3463 factory.go:223] Registration of the systemd container factory successfully
Mar 6 02:59:28.054226 kubelet[3463]: I0306 02:59:28.053635 3463 cpu_manager.go:225] "Starting" policy="none"
Mar 6 02:59:28.054421 kubelet[3463]: I0306 02:59:28.054353 3463 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 6 02:59:28.054527 kubelet[3463]: I0306 02:59:28.054516 3463 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 6 02:59:28.054694 kubelet[3463]: I0306 02:59:28.054681 3463 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Mar 6 02:59:28.054782 kubelet[3463]: I0306 02:59:28.054754 3463 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Mar 6 02:59:28.054843 kubelet[3463]: I0306 02:59:28.054836 3463 policy_none.go:50] "Start"
Mar 6 02:59:28.054888 kubelet[3463]: I0306 02:59:28.054881 3463 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 02:59:28.055284 kubelet[3463]: I0306 02:59:28.055220 3463 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 02:59:28.055438 kubelet[3463]: I0306 02:59:28.055417 3463 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 6 02:59:28.055486 kubelet[3463]: I0306 02:59:28.055481 3463 policy_none.go:44] "Start"
Mar 6 02:59:28.058860 kubelet[3463]: E0306 02:59:28.058840 3463 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 6 02:59:28.058997 kubelet[3463]: I0306 02:59:28.058980 3463 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 6 02:59:28.059056 kubelet[3463]: I0306 02:59:28.059007 3463 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 6 02:59:28.060338 kubelet[3463]: I0306 02:59:28.060262 3463 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 6 02:59:28.062809 kubelet[3463]: E0306 02:59:28.062787 3463 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 6 02:59:28.104872 kubelet[3463]: I0306 02:59:28.103259 3463 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.104872 kubelet[3463]: I0306 02:59:28.103595 3463 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.105218 kubelet[3463]: I0306 02:59:28.105203 3463 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.113001 kubelet[3463]: I0306 02:59:28.112981 3463 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:59:28.113216 kubelet[3463]: E0306 02:59:28.113203 3463 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" already exists" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.113399 kubelet[3463]: I0306 02:59:28.113124 3463 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:59:28.113784 kubelet[3463]: I0306 02:59:28.113734 3463 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:59:28.113944 kubelet[3463]: E0306 02:59:28.113932 3463 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-38e0d2a52a\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.162300 kubelet[3463]: I0306 02:59:28.162279 3463 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.177784 kubelet[3463]: I0306 02:59:28.177759 3463 kubelet_node_status.go:123] "Node was previously registered" node="ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.177973 kubelet[3463]: I0306 02:59:28.177911 3463 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198602 kubelet[3463]: I0306 02:59:28.198464 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198602 kubelet[3463]: I0306 02:59:28.198487 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-ca-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198602 kubelet[3463]: I0306 02:59:28.198501 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198602 kubelet[3463]: I0306 02:59:28.198510 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198602 kubelet[3463]: I0306 02:59:28.198519 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198726 kubelet[3463]: I0306 02:59:28.198528 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0bcf97f5e6a483412ccd694281cb246e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.3-n-38e0d2a52a\" (UID: \"0bcf97f5e6a483412ccd694281cb246e\") " pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198726 kubelet[3463]: I0306 02:59:28.198539 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e932240c0d9fe699741ef0217cb09ffc-kubeconfig\") pod \"kube-scheduler-ci-4459.2.3-n-38e0d2a52a\" (UID: \"e932240c0d9fe699741ef0217cb09ffc\") " pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198726 kubelet[3463]: I0306 02:59:28.198548 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-ca-certs\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.198726 kubelet[3463]: I0306 02:59:28.198556 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7e4214a24a30fe1c344021d350b4e3de-k8s-certs\") pod \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" (UID: \"7e4214a24a30fe1c344021d350b4e3de\") " pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:28.968197 kubelet[3463]: I0306 02:59:28.967770 3463 apiserver.go:52] "Watching apiserver"
Mar 6 02:59:28.996917 kubelet[3463]: I0306 02:59:28.996813 3463 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 02:59:29.044339 kubelet[3463]: I0306 02:59:29.044318 3463 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:29.044611 kubelet[3463]: I0306 02:59:29.044589 3463 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:29.056018 kubelet[3463]: I0306 02:59:29.055574 3463 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:59:29.056018 kubelet[3463]: E0306 02:59:29.055612 3463 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.3-n-38e0d2a52a\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:29.056211 kubelet[3463]: I0306 02:59:29.056193 3463 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 6 02:59:29.056258 kubelet[3463]: E0306 02:59:29.056225 3463 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.3-n-38e0d2a52a\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a"
Mar 6 02:59:29.453865 kubelet[3463]: I0306 02:59:29.453715 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.3-n-38e0d2a52a" podStartSLOduration=2.453703779 podStartE2EDuration="2.453703779s" podCreationTimestamp="2026-03-06 02:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:59:29.452928826 +0000 UTC m=+1.569902204" watchObservedRunningTime="2026-03-06 02:59:29.453703779 +0000 UTC m=+1.570677157"
Mar 6 02:59:29.473801 kubelet[3463]: I0306 02:59:29.473769 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.3-n-38e0d2a52a" podStartSLOduration=1.47376092 podStartE2EDuration="1.47376092s" podCreationTimestamp="2026-03-06 02:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:59:29.465538504 +0000 UTC m=+1.582511882" watchObservedRunningTime="2026-03-06 02:59:29.47376092 +0000 UTC m=+1.590734306"
Mar 6 02:59:29.744554 kubelet[3463]: I0306 02:59:29.744070 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.3-n-38e0d2a52a" podStartSLOduration=3.744061608 podStartE2EDuration="3.744061608s" podCreationTimestamp="2026-03-06 02:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:59:29.474915301 +0000 UTC m=+1.591888679" watchObservedRunningTime="2026-03-06 02:59:29.744061608 +0000 UTC m=+1.861035018"
Mar 6 02:59:33.500433 kubelet[3463]: I0306 02:59:33.500394 3463 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 6 02:59:33.501003 containerd[1884]: time="2026-03-06T02:59:33.500670066Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 6 02:59:33.501723 kubelet[3463]: I0306 02:59:33.501273 3463 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 6 02:59:34.395687 systemd[1]: Created slice kubepods-besteffort-pod324878c4_e2f4_4449_a9e7_690b4ab2fe6b.slice - libcontainer container kubepods-besteffort-pod324878c4_e2f4_4449_a9e7_690b4ab2fe6b.slice.
Mar 6 02:59:34.437708 kubelet[3463]: I0306 02:59:34.437633 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/324878c4-e2f4-4449-a9e7-690b4ab2fe6b-kube-proxy\") pod \"kube-proxy-8h6np\" (UID: \"324878c4-e2f4-4449-a9e7-690b4ab2fe6b\") " pod="kube-system/kube-proxy-8h6np"
Mar 6 02:59:34.437708 kubelet[3463]: I0306 02:59:34.437662 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/324878c4-e2f4-4449-a9e7-690b4ab2fe6b-lib-modules\") pod \"kube-proxy-8h6np\" (UID: \"324878c4-e2f4-4449-a9e7-690b4ab2fe6b\") " pod="kube-system/kube-proxy-8h6np"
Mar 6 02:59:34.437708 kubelet[3463]: I0306 02:59:34.437673 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/324878c4-e2f4-4449-a9e7-690b4ab2fe6b-xtables-lock\") pod \"kube-proxy-8h6np\" (UID: \"324878c4-e2f4-4449-a9e7-690b4ab2fe6b\") " pod="kube-system/kube-proxy-8h6np"
Mar 6 02:59:34.437708 kubelet[3463]: I0306 02:59:34.437683 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc84d\" (UniqueName: \"kubernetes.io/projected/324878c4-e2f4-4449-a9e7-690b4ab2fe6b-kube-api-access-xc84d\") pod \"kube-proxy-8h6np\" (UID: \"324878c4-e2f4-4449-a9e7-690b4ab2fe6b\") " pod="kube-system/kube-proxy-8h6np"
Mar 6 02:59:34.710068 containerd[1884]: time="2026-03-06T02:59:34.709991728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8h6np,Uid:324878c4-e2f4-4449-a9e7-690b4ab2fe6b,Namespace:kube-system,Attempt:0,}"
Mar 6 02:59:34.748773 containerd[1884]: time="2026-03-06T02:59:34.748719208Z" level=info msg="connecting to shim a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d" address="unix:///run/containerd/s/35a338655566774912eefaaedbadd8bf1e9ca89f4de4bf89b4eeba9ddec458f1" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:59:34.766971 systemd[1]: Started cri-containerd-a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d.scope - libcontainer container a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d.
Mar 6 02:59:34.793990 containerd[1884]: time="2026-03-06T02:59:34.793826012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8h6np,Uid:324878c4-e2f4-4449-a9e7-690b4ab2fe6b,Namespace:kube-system,Attempt:0,} returns sandbox id \"a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d\""
Mar 6 02:59:34.807151 containerd[1884]: time="2026-03-06T02:59:34.807051037Z" level=info msg="CreateContainer within sandbox \"a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 6 02:59:34.816409 systemd[1]: Created slice kubepods-besteffort-pod02c5282a_0d88_4123_a985_3e3296b9bebe.slice - libcontainer container kubepods-besteffort-pod02c5282a_0d88_4123_a985_3e3296b9bebe.slice.
Mar 6 02:59:34.834654 containerd[1884]: time="2026-03-06T02:59:34.834627998Z" level=info msg="Container 7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:59:34.836785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2316168684.mount: Deactivated successfully.
Mar 6 02:59:34.840478 kubelet[3463]: I0306 02:59:34.840449 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/02c5282a-0d88-4123-a985-3e3296b9bebe-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-lrq5c\" (UID: \"02c5282a-0d88-4123-a985-3e3296b9bebe\") " pod="tigera-operator/tigera-operator-6cf4cccc57-lrq5c"
Mar 6 02:59:34.840692 kubelet[3463]: I0306 02:59:34.840483 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf8bv\" (UniqueName: \"kubernetes.io/projected/02c5282a-0d88-4123-a985-3e3296b9bebe-kube-api-access-cf8bv\") pod \"tigera-operator-6cf4cccc57-lrq5c\" (UID: \"02c5282a-0d88-4123-a985-3e3296b9bebe\") " pod="tigera-operator/tigera-operator-6cf4cccc57-lrq5c"
Mar 6 02:59:34.851206 containerd[1884]: time="2026-03-06T02:59:34.851156501Z" level=info msg="CreateContainer within sandbox \"a872094699fda14711f41cc3f3a1c75369bea0e2c4817f4f58d55df8138eea8d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4\""
Mar 6 02:59:34.851707 containerd[1884]: time="2026-03-06T02:59:34.851684647Z" level=info msg="StartContainer for \"7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4\""
Mar 6 02:59:34.852828 containerd[1884]: time="2026-03-06T02:59:34.852803992Z" level=info msg="connecting to shim 7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4" address="unix:///run/containerd/s/35a338655566774912eefaaedbadd8bf1e9ca89f4de4bf89b4eeba9ddec458f1" protocol=ttrpc version=3
Mar 6 02:59:34.863282 systemd[1]: Started cri-containerd-7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4.scope - libcontainer container 7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4.
Mar 6 02:59:34.924842 containerd[1884]: time="2026-03-06T02:59:34.924813205Z" level=info msg="StartContainer for \"7e218e3086dcee9f204624a45c7a971032d99a11b638e66f0969d1762285f2f4\" returns successfully"
Mar 6 02:59:35.065674 kubelet[3463]: I0306 02:59:35.065570 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-8h6np" podStartSLOduration=1.06556161 podStartE2EDuration="1.06556161s" podCreationTimestamp="2026-03-06 02:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 02:59:35.064602416 +0000 UTC m=+7.181575794" watchObservedRunningTime="2026-03-06 02:59:35.06556161 +0000 UTC m=+7.182534988"
Mar 6 02:59:35.125825 containerd[1884]: time="2026-03-06T02:59:35.125774986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-lrq5c,Uid:02c5282a-0d88-4123-a985-3e3296b9bebe,Namespace:tigera-operator,Attempt:0,}"
Mar 6 02:59:35.161217 containerd[1884]: time="2026-03-06T02:59:35.161192659Z" level=info msg="connecting to shim a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd" address="unix:///run/containerd/s/b69b65ae5d562bf33f1d572cc4ca5634112f34bc0610fce973284e453e5abf58" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:59:35.177282 systemd[1]: Started cri-containerd-a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd.scope - libcontainer container a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd.
Mar 6 02:59:35.205186 containerd[1884]: time="2026-03-06T02:59:35.205146886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-lrq5c,Uid:02c5282a-0d88-4123-a985-3e3296b9bebe,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd\""
Mar 6 02:59:35.207346 containerd[1884]: time="2026-03-06T02:59:35.207320268Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 6 02:59:37.189547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1244591575.mount: Deactivated successfully.
Mar 6 02:59:38.596857 containerd[1884]: time="2026-03-06T02:59:38.596387933Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:38.600587 containerd[1884]: time="2026-03-06T02:59:38.600562596Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565"
Mar 6 02:59:38.603328 containerd[1884]: time="2026-03-06T02:59:38.603308147Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:38.607412 containerd[1884]: time="2026-03-06T02:59:38.607376839Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:38.607788 containerd[1884]: time="2026-03-06T02:59:38.607672623Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 3.40032597s"
Mar 6 02:59:38.607788 containerd[1884]: time="2026-03-06T02:59:38.607699504Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\""
Mar 6 02:59:38.614782 containerd[1884]: time="2026-03-06T02:59:38.614760698Z" level=info msg="CreateContainer within sandbox \"a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 6 02:59:38.639461 containerd[1884]: time="2026-03-06T02:59:38.638473599Z" level=info msg="Container db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:59:38.652715 containerd[1884]: time="2026-03-06T02:59:38.652633307Z" level=info msg="CreateContainer within sandbox \"a969c3b43fb6362bdae46ba4b604e8f0e0fd063d5d8eab05c34dbf8824d3c5bd\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88\""
Mar 6 02:59:38.653228 containerd[1884]: time="2026-03-06T02:59:38.653207220Z" level=info msg="StartContainer for \"db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88\""
Mar 6 02:59:38.653904 containerd[1884]: time="2026-03-06T02:59:38.653879335Z" level=info msg="connecting to shim db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88" address="unix:///run/containerd/s/b69b65ae5d562bf33f1d572cc4ca5634112f34bc0610fce973284e453e5abf58" protocol=ttrpc version=3
Mar 6 02:59:38.675286 systemd[1]: Started cri-containerd-db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88.scope - libcontainer container db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88.
Mar 6 02:59:38.698541 containerd[1884]: time="2026-03-06T02:59:38.698467488Z" level=info msg="StartContainer for \"db5f8ee619824e8da380e892e9a60b3734c928f595cb8877dc531628c14b5a88\" returns successfully"
Mar 6 02:59:39.452992 kubelet[3463]: I0306 02:59:39.452919 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-lrq5c" podStartSLOduration=2.05082652 podStartE2EDuration="5.45289613s" podCreationTimestamp="2026-03-06 02:59:34 +0000 UTC" firstStartedPulling="2026-03-06 02:59:35.206448621 +0000 UTC m=+7.323421999" lastFinishedPulling="2026-03-06 02:59:38.608518231 +0000 UTC m=+10.725491609" observedRunningTime="2026-03-06 02:59:39.07394166 +0000 UTC m=+11.190915070" watchObservedRunningTime="2026-03-06 02:59:39.45289613 +0000 UTC m=+11.569869540"
Mar 6 02:59:43.583506 sudo[2366]: pam_unix(sudo:session): session closed for user root
Mar 6 02:59:43.649928 sshd[2365]: Connection closed by 10.200.16.10 port 45310
Mar 6 02:59:43.650337 sshd-session[2362]: pam_unix(sshd:session): session closed for user core
Mar 6 02:59:43.654805 systemd[1]: sshd@6-10.200.20.33:22-10.200.16.10:45310.service: Deactivated successfully.
Mar 6 02:59:43.658681 systemd[1]: session-9.scope: Deactivated successfully.
Mar 6 02:59:43.661312 systemd[1]: session-9.scope: Consumed 2.361s CPU time, 220.1M memory peak.
Mar 6 02:59:43.662687 systemd-logind[1867]: Session 9 logged out. Waiting for processes to exit.
Mar 6 02:59:43.663774 systemd-logind[1867]: Removed session 9.
Mar 6 02:59:46.852189 systemd[1]: Created slice kubepods-besteffort-podc979d213_92a9_4301_984c_17ea54caadd5.slice - libcontainer container kubepods-besteffort-podc979d213_92a9_4301_984c_17ea54caadd5.slice.
Mar 6 02:59:46.914625 kubelet[3463]: I0306 02:59:46.914531 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8zm\" (UniqueName: \"kubernetes.io/projected/c979d213-92a9-4301-984c-17ea54caadd5-kube-api-access-dc8zm\") pod \"calico-typha-844dcfbf67-rs68m\" (UID: \"c979d213-92a9-4301-984c-17ea54caadd5\") " pod="calico-system/calico-typha-844dcfbf67-rs68m"
Mar 6 02:59:46.914625 kubelet[3463]: I0306 02:59:46.914565 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c979d213-92a9-4301-984c-17ea54caadd5-tigera-ca-bundle\") pod \"calico-typha-844dcfbf67-rs68m\" (UID: \"c979d213-92a9-4301-984c-17ea54caadd5\") " pod="calico-system/calico-typha-844dcfbf67-rs68m"
Mar 6 02:59:46.914625 kubelet[3463]: I0306 02:59:46.914577 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c979d213-92a9-4301-984c-17ea54caadd5-typha-certs\") pod \"calico-typha-844dcfbf67-rs68m\" (UID: \"c979d213-92a9-4301-984c-17ea54caadd5\") " pod="calico-system/calico-typha-844dcfbf67-rs68m"
Mar 6 02:59:46.921420 systemd[1]: Created slice kubepods-besteffort-poda11c6a34_5699_4e00_9c20_5907293ba921.slice - libcontainer container kubepods-besteffort-poda11c6a34_5699_4e00_9c20_5907293ba921.slice.
Mar 6 02:59:47.015678 kubelet[3463]: I0306 02:59:47.015647 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a11c6a34-5699-4e00-9c20-5907293ba921-node-certs\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.015678 kubelet[3463]: I0306 02:59:47.015674 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-sys-fs\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.015801 kubelet[3463]: I0306 02:59:47.015685 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxjk5\" (UniqueName: \"kubernetes.io/projected/a11c6a34-5699-4e00-9c20-5907293ba921-kube-api-access-rxjk5\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.015801 kubelet[3463]: I0306 02:59:47.015696 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-cni-net-dir\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.015801 kubelet[3463]: I0306 02:59:47.015707 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-nodeproc\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.015801 kubelet[3463]: I0306 02:59:47.015725 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-lib-modules\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.019948 kubelet[3463]: I0306 02:59:47.017246 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-policysync\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.019948 kubelet[3463]: I0306 02:59:47.017285 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11c6a34-5699-4e00-9c20-5907293ba921-tigera-ca-bundle\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.019948 kubelet[3463]: I0306 02:59:47.017300 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-cni-bin-dir\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.019948 kubelet[3463]: I0306 02:59:47.017309 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-cni-log-dir\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.019948 kubelet[3463]: I0306 02:59:47.017333 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-flexvol-driver-host\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.020122 kubelet[3463]: I0306 02:59:47.017344 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-xtables-lock\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.020122 kubelet[3463]: I0306 02:59:47.017359 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-bpffs\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.020122 kubelet[3463]: I0306 02:59:47.017369 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-var-lib-calico\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.020122 kubelet[3463]: I0306 02:59:47.017377 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a11c6a34-5699-4e00-9c20-5907293ba921-var-run-calico\") pod \"calico-node-bsrgz\" (UID: \"a11c6a34-5699-4e00-9c20-5907293ba921\") " pod="calico-system/calico-node-bsrgz"
Mar 6 02:59:47.033472 kubelet[3463]: E0306 02:59:47.033445 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 02:59:47.118556 kubelet[3463]: I0306 02:59:47.118058 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/34238614-4e43-4e4c-a383-1e98f49409d2-socket-dir\") pod \"csi-node-driver-vgvhp\" (UID: \"34238614-4e43-4e4c-a383-1e98f49409d2\") " pod="calico-system/csi-node-driver-vgvhp" Mar 6 02:59:47.118712 kubelet[3463]: I0306 02:59:47.118684 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkq4\" (UniqueName: \"kubernetes.io/projected/34238614-4e43-4e4c-a383-1e98f49409d2-kube-api-access-fqkq4\") pod \"csi-node-driver-vgvhp\" (UID: \"34238614-4e43-4e4c-a383-1e98f49409d2\") " pod="calico-system/csi-node-driver-vgvhp" Mar 6 02:59:47.118828 kubelet[3463]: I0306 02:59:47.118816 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/34238614-4e43-4e4c-a383-1e98f49409d2-varrun\") pod \"csi-node-driver-vgvhp\" (UID: \"34238614-4e43-4e4c-a383-1e98f49409d2\") " pod="calico-system/csi-node-driver-vgvhp" Mar 6 02:59:47.118964 kubelet[3463]: I0306 02:59:47.118953 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34238614-4e43-4e4c-a383-1e98f49409d2-kubelet-dir\") pod \"csi-node-driver-vgvhp\" (UID: \"34238614-4e43-4e4c-a383-1e98f49409d2\") " pod="calico-system/csi-node-driver-vgvhp" Mar 6 02:59:47.119067 kubelet[3463]: I0306 02:59:47.119056 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/34238614-4e43-4e4c-a383-1e98f49409d2-registration-dir\") pod 
\"csi-node-driver-vgvhp\" (UID: \"34238614-4e43-4e4c-a383-1e98f49409d2\") " pod="calico-system/csi-node-driver-vgvhp"
Mar 6 02:59:47.121902 kubelet[3463]: E0306 02:59:47.121877 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:47.122025 kubelet[3463]: W0306 02:59:47.121927 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:47.122025 kubelet[3463]: E0306 02:59:47.121946 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 6 02:59:47.161605 containerd[1884]: time="2026-03-06T02:59:47.161572589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-844dcfbf67-rs68m,Uid:c979d213-92a9-4301-984c-17ea54caadd5,Namespace:calico-system,Attempt:0,}"
Mar 6 02:59:47.306408 containerd[1884]: time="2026-03-06T02:59:47.306373610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bsrgz,Uid:a11c6a34-5699-4e00-9c20-5907293ba921,Namespace:calico-system,Attempt:0,}"
Mar 6 02:59:47.615990 containerd[1884]: time="2026-03-06T02:59:47.615807119Z" level=info msg="connecting to shim d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4" address="unix:///run/containerd/s/3e683a31e033bdb1fb3ea1082ba64b3e8d93ab0cc196313bb4aa58d778cff24e" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:59:47.644299 systemd[1]: Started cri-containerd-d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4.scope - libcontainer container d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4.
Mar 6 02:59:47.765425 containerd[1884]: time="2026-03-06T02:59:47.765276478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-844dcfbf67-rs68m,Uid:c979d213-92a9-4301-984c-17ea54caadd5,Namespace:calico-system,Attempt:0,} returns sandbox id \"d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4\""
Mar 6 02:59:47.767434 containerd[1884]: time="2026-03-06T02:59:47.767409392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 6 02:59:47.863524 containerd[1884]: time="2026-03-06T02:59:47.863273716Z" level=info msg="connecting to shim 98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046" address="unix:///run/containerd/s/a848697741f4917bbf6c307ddf06d7e909ae7097dae48ecce018f6168710a773" namespace=k8s.io protocol=ttrpc version=3
Mar 6 02:59:47.878350 systemd[1]: Started cri-containerd-98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046.scope - libcontainer container 98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046.
Mar 6 02:59:47.910147 containerd[1884]: time="2026-03-06T02:59:47.910113556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bsrgz,Uid:a11c6a34-5699-4e00-9c20-5907293ba921,Namespace:calico-system,Attempt:0,} returns sandbox id \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\""
Mar 6 02:59:49.002940 kubelet[3463]: E0306 02:59:49.002902 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 02:59:51.003039 kubelet[3463]: E0306 02:59:51.002978 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 02:59:51.078832 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1946225500.mount: Deactivated successfully.
Mar 6 02:59:52.053505 containerd[1884]: time="2026-03-06T02:59:52.053422428Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:52.114645 containerd[1884]: time="2026-03-06T02:59:52.114611484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174"
Mar 6 02:59:52.118145 containerd[1884]: time="2026-03-06T02:59:52.118097577Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:52.167552 containerd[1884]: time="2026-03-06T02:59:52.167517714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 02:59:52.168135 containerd[1884]: time="2026-03-06T02:59:52.167776274Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 4.400180523s"
Mar 6 02:59:52.168135 containerd[1884]: time="2026-03-06T02:59:52.167797810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\""
Mar 6 02:59:52.168813 containerd[1884]: time="2026-03-06T02:59:52.168791945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 6 02:59:52.211852 containerd[1884]: time="2026-03-06T02:59:52.211809434Z" level=info msg="CreateContainer within sandbox \"d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 6 02:59:52.414552 containerd[1884]: time="2026-03-06T02:59:52.414456878Z" level=info msg="Container 183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff: CDI devices from CRI Config.CDIDevices: []"
Mar 6 02:59:52.559598 containerd[1884]: time="2026-03-06T02:59:52.559547388Z" level=info msg="CreateContainer within sandbox \"d96af315b70a0843a7905b0850a53846142807d1b8a4d53ccbf51cef149c3fd4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff\""
Mar 6 02:59:52.560490 containerd[1884]: time="2026-03-06T02:59:52.559992026Z" level=info msg="StartContainer for \"183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff\""
Mar 6 02:59:52.563783 containerd[1884]: time="2026-03-06T02:59:52.563761904Z" level=info msg="connecting to shim 183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff" address="unix:///run/containerd/s/3e683a31e033bdb1fb3ea1082ba64b3e8d93ab0cc196313bb4aa58d778cff24e" protocol=ttrpc version=3
Mar 6 02:59:52.580275 systemd[1]: Started cri-containerd-183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff.scope - libcontainer container 183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff.
Mar 6 02:59:52.656478 containerd[1884]: time="2026-03-06T02:59:52.656448472Z" level=info msg="StartContainer for \"183352ddaaa99462edb7b2bf34c1fa43c2c125ac9e0f44ac232163e529ceceff\" returns successfully"
Mar 6 02:59:53.003554 kubelet[3463]: E0306 02:59:53.003511 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 02:59:53.104969 kubelet[3463]: I0306 02:59:53.104285 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-844dcfbf67-rs68m" podStartSLOduration=2.702798302 podStartE2EDuration="7.104276594s" podCreationTimestamp="2026-03-06 02:59:46 +0000 UTC" firstStartedPulling="2026-03-06 02:59:47.767205586 +0000 UTC m=+19.884178972" lastFinishedPulling="2026-03-06 02:59:52.168683886 +0000 UTC m=+24.285657264" observedRunningTime="2026-03-06 02:59:53.104058195 +0000 UTC m=+25.221031581" watchObservedRunningTime="2026-03-06 02:59:53.104276594 +0000 UTC m=+25.221249980"
Mar 6 02:59:53.145477 kubelet[3463]: E0306 02:59:53.145454 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.145577 kubelet[3463]: W0306 02:59:53.145484 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.145577 kubelet[3463]: E0306 02:59:53.145516 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.145685 kubelet[3463]: E0306 02:59:53.145673 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.145685 kubelet[3463]: W0306 02:59:53.145683 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.145725 kubelet[3463]: E0306 02:59:53.145691 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.145828 kubelet[3463]: E0306 02:59:53.145816 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.145828 kubelet[3463]: W0306 02:59:53.145825 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.145872 kubelet[3463]: E0306 02:59:53.145832 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.145979 kubelet[3463]: E0306 02:59:53.145964 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.145979 kubelet[3463]: W0306 02:59:53.145974 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146033 kubelet[3463]: E0306 02:59:53.145981 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146104 kubelet[3463]: E0306 02:59:53.146093 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146104 kubelet[3463]: W0306 02:59:53.146101 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146157 kubelet[3463]: E0306 02:59:53.146120 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146240 kubelet[3463]: E0306 02:59:53.146227 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146240 kubelet[3463]: W0306 02:59:53.146236 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146288 kubelet[3463]: E0306 02:59:53.146242 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146360 kubelet[3463]: E0306 02:59:53.146348 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146360 kubelet[3463]: W0306 02:59:53.146357 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146410 kubelet[3463]: E0306 02:59:53.146363 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146532 kubelet[3463]: E0306 02:59:53.146519 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146532 kubelet[3463]: W0306 02:59:53.146528 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146571 kubelet[3463]: E0306 02:59:53.146535 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146656 kubelet[3463]: E0306 02:59:53.146645 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146656 kubelet[3463]: W0306 02:59:53.146653 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146701 kubelet[3463]: E0306 02:59:53.146659 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146756 kubelet[3463]: E0306 02:59:53.146747 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146756 kubelet[3463]: W0306 02:59:53.146753 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146796 kubelet[3463]: E0306 02:59:53.146758 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146856 kubelet[3463]: E0306 02:59:53.146839 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146856 kubelet[3463]: W0306 02:59:53.146852 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146856 kubelet[3463]: E0306 02:59:53.146857 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.146943 kubelet[3463]: E0306 02:59:53.146935 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.146943 kubelet[3463]: W0306 02:59:53.146941 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.146985 kubelet[3463]: E0306 02:59:53.146946 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.147043 kubelet[3463]: E0306 02:59:53.147034 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.147043 kubelet[3463]: W0306 02:59:53.147040 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.147087 kubelet[3463]: E0306 02:59:53.147045 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.147133 kubelet[3463]: E0306 02:59:53.147125 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.147133 kubelet[3463]: W0306 02:59:53.147131 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.147181 kubelet[3463]: E0306 02:59:53.147136 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.147232 kubelet[3463]: E0306 02:59:53.147224 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.147232 kubelet[3463]: W0306 02:59:53.147230 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.147274 kubelet[3463]: E0306 02:59:53.147235 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.164044 kubelet[3463]: E0306 02:59:53.163735 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.164044 kubelet[3463]: W0306 02:59:53.163750 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.164044 kubelet[3463]: E0306 02:59:53.163762 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.164264 kubelet[3463]: E0306 02:59:53.164252 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.164444 kubelet[3463]: W0306 02:59:53.164318 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.164444 kubelet[3463]: E0306 02:59:53.164333 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.164893 kubelet[3463]: E0306 02:59:53.164829 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.164893 kubelet[3463]: W0306 02:59:53.164841 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.164893 kubelet[3463]: E0306 02:59:53.164850 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.165460 kubelet[3463]: E0306 02:59:53.165436 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.165460 kubelet[3463]: W0306 02:59:53.165456 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.165539 kubelet[3463]: E0306 02:59:53.165467 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.165614 kubelet[3463]: E0306 02:59:53.165597 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.165614 kubelet[3463]: W0306 02:59:53.165608 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.165614 kubelet[3463]: E0306 02:59:53.165615 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.165879 kubelet[3463]: E0306 02:59:53.165836 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.165879 kubelet[3463]: W0306 02:59:53.165847 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.165879 kubelet[3463]: E0306 02:59:53.165856 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166043 kubelet[3463]: E0306 02:59:53.166029 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166043 kubelet[3463]: W0306 02:59:53.166039 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166086 kubelet[3463]: E0306 02:59:53.166046 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166188 kubelet[3463]: E0306 02:59:53.166176 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166188 kubelet[3463]: W0306 02:59:53.166185 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166246 kubelet[3463]: E0306 02:59:53.166191 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166324 kubelet[3463]: E0306 02:59:53.166313 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166324 kubelet[3463]: W0306 02:59:53.166321 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166366 kubelet[3463]: E0306 02:59:53.166327 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166518 kubelet[3463]: E0306 02:59:53.166505 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166518 kubelet[3463]: W0306 02:59:53.166514 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166571 kubelet[3463]: E0306 02:59:53.166521 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166641 kubelet[3463]: E0306 02:59:53.166627 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166641 kubelet[3463]: W0306 02:59:53.166638 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166677 kubelet[3463]: E0306 02:59:53.166644 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.166799 kubelet[3463]: E0306 02:59:53.166785 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.166799 kubelet[3463]: W0306 02:59:53.166795 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.166851 kubelet[3463]: E0306 02:59:53.166802 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167057 kubelet[3463]: E0306 02:59:53.167043 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167057 kubelet[3463]: W0306 02:59:53.167053 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167103 kubelet[3463]: E0306 02:59:53.167060 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167209 kubelet[3463]: E0306 02:59:53.167197 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167209 kubelet[3463]: W0306 02:59:53.167207 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167262 kubelet[3463]: E0306 02:59:53.167213 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167335 kubelet[3463]: E0306 02:59:53.167322 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167335 kubelet[3463]: W0306 02:59:53.167329 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167335 kubelet[3463]: E0306 02:59:53.167335 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167440 kubelet[3463]: E0306 02:59:53.167421 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167440 kubelet[3463]: W0306 02:59:53.167436 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167487 kubelet[3463]: E0306 02:59:53.167441 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167548 kubelet[3463]: E0306 02:59:53.167537 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167548 kubelet[3463]: W0306 02:59:53.167544 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167580 kubelet[3463]: E0306 02:59:53.167549 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:53.167916 kubelet[3463]: E0306 02:59:53.167902 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:53.167916 kubelet[3463]: W0306 02:59:53.167913 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:53.167971 kubelet[3463]: E0306 02:59:53.167920 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.092252 kubelet[3463]: I0306 02:59:54.091805 3463 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153217 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205784 kubelet[3463]: W0306 02:59:54.153234 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153251 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153376 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205784 kubelet[3463]: W0306 02:59:54.153382 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153389 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153524 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205784 kubelet[3463]: W0306 02:59:54.153531 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153537 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205784 kubelet[3463]: E0306 02:59:54.153648 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205993 kubelet[3463]: W0306 02:59:54.153654 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.153659 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.153771 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205993 kubelet[3463]: W0306 02:59:54.153776 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.153782 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.153908 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205993 kubelet[3463]: W0306 02:59:54.153916 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.153924 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.205993 kubelet[3463]: E0306 02:59:54.154042 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.205993 kubelet[3463]: W0306 02:59:54.154048 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154053 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154157 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 6 02:59:54.206154 kubelet[3463]: W0306 02:59:54.154162 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154167 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154317 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206154 kubelet[3463]: W0306 02:59:54.154322 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154329 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154443 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206154 kubelet[3463]: W0306 02:59:54.154448 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206154 kubelet[3463]: E0306 02:59:54.154454 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154554 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206302 kubelet[3463]: W0306 02:59:54.154559 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154564 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154667 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206302 kubelet[3463]: W0306 02:59:54.154672 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154678 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154785 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206302 kubelet[3463]: W0306 02:59:54.154790 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154795 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206302 kubelet[3463]: E0306 02:59:54.154901 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206435 kubelet[3463]: W0306 02:59:54.154906 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.154911 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.155280 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206435 kubelet[3463]: W0306 02:59:54.155290 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.155298 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.171489 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206435 kubelet[3463]: W0306 02:59:54.171502 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.171511 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206435 kubelet[3463]: E0306 02:59:54.171664 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206435 kubelet[3463]: W0306 02:59:54.171670 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.171678 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.171859 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206563 kubelet[3463]: W0306 02:59:54.171871 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.171882 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.172008 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206563 kubelet[3463]: W0306 02:59:54.172015 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.172023 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.172136 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206563 kubelet[3463]: W0306 02:59:54.172141 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206563 kubelet[3463]: E0306 02:59:54.172147 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172292 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206695 kubelet[3463]: W0306 02:59:54.172298 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172305 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172482 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206695 kubelet[3463]: W0306 02:59:54.172490 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172498 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172646 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206695 kubelet[3463]: W0306 02:59:54.172655 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172661 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206695 kubelet[3463]: E0306 02:59:54.172773 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206836 kubelet[3463]: W0306 02:59:54.172780 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.172787 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.172887 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206836 kubelet[3463]: W0306 02:59:54.172892 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.172897 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.173030 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206836 kubelet[3463]: W0306 02:59:54.173037 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.173043 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206836 kubelet[3463]: E0306 02:59:54.173267 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206836 kubelet[3463]: W0306 02:59:54.173276 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173284 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173446 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206965 kubelet[3463]: W0306 02:59:54.173453 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173460 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173577 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206965 kubelet[3463]: W0306 02:59:54.173582 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173590 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173691 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.206965 kubelet[3463]: W0306 02:59:54.173697 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.206965 kubelet[3463]: E0306 02:59:54.173702 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.173818 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.207099 kubelet[3463]: W0306 02:59:54.173824 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.173829 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.174003 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.207099 kubelet[3463]: W0306 02:59:54.174010 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.174018 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.174445 3463 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 02:59:54.207099 kubelet[3463]: W0306 02:59:54.174454 3463 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 02:59:54.207099 kubelet[3463]: E0306 02:59:54.174462 3463 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 02:59:55.002951 kubelet[3463]: E0306 02:59:55.002556 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 02:59:57.003537 kubelet[3463]: E0306 02:59:57.003499 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 02:59:57.607945 containerd[1884]: time="2026-03-06T02:59:57.607898880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.656223 containerd[1884]: time="2026-03-06T02:59:57.656154646Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 6 02:59:57.659283 containerd[1884]: time="2026-03-06T02:59:57.659252038Z" level=info 
msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.703458 containerd[1884]: time="2026-03-06T02:59:57.702922413Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.703458 containerd[1884]: time="2026-03-06T02:59:57.703370979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 5.534290785s" Mar 6 02:59:57.703458 containerd[1884]: time="2026-03-06T02:59:57.703391132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 6 02:59:57.768243 containerd[1884]: time="2026-03-06T02:59:57.768211045Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 6 02:59:57.956419 containerd[1884]: time="2026-03-06T02:59:57.956387686Z" level=info msg="Container 481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564: CDI devices from CRI Config.CDIDevices: []" Mar 6 02:59:57.959207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2734891668.mount: Deactivated successfully. 
Mar 6 02:59:58.073025 containerd[1884]: time="2026-03-06T02:59:58.072984755Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564\"" Mar 6 02:59:58.073602 containerd[1884]: time="2026-03-06T02:59:58.073396680Z" level=info msg="StartContainer for \"481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564\"" Mar 6 02:59:58.075289 containerd[1884]: time="2026-03-06T02:59:58.075264498Z" level=info msg="connecting to shim 481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564" address="unix:///run/containerd/s/a848697741f4917bbf6c307ddf06d7e909ae7097dae48ecce018f6168710a773" protocol=ttrpc version=3 Mar 6 02:59:58.094380 systemd[1]: Started cri-containerd-481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564.scope - libcontainer container 481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564. Mar 6 02:59:58.154700 systemd[1]: cri-containerd-481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564.scope: Deactivated successfully. Mar 6 02:59:58.161536 containerd[1884]: time="2026-03-06T02:59:58.161503526Z" level=info msg="StartContainer for \"481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564\" returns successfully" Mar 6 02:59:58.162003 containerd[1884]: time="2026-03-06T02:59:58.161878370Z" level=info msg="received container exit event container_id:\"481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564\" id:\"481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564\" pid:4128 exited_at:{seconds:1772765998 nanos:160562409}" Mar 6 02:59:58.178923 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-481833fb4368cdff4e2d77e1e95611dca7f797007367ba2150174ca1c7741564-rootfs.mount: Deactivated successfully. 
Mar 6 02:59:59.002591 kubelet[3463]: E0306 02:59:59.002557 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:01.001972 kubelet[3463]: I0306 03:00:01.001709 3463 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 6 03:00:01.003489 kubelet[3463]: E0306 03:00:01.003011 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:01.110714 containerd[1884]: time="2026-03-06T03:00:01.110649694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 6 03:00:03.002788 kubelet[3463]: E0306 03:00:03.002730 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:05.003453 kubelet[3463]: E0306 03:00:05.003364 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:07.002927 kubelet[3463]: E0306 03:00:07.002828 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:09.003210 kubelet[3463]: E0306 03:00:09.002973 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:11.003203 kubelet[3463]: E0306 03:00:11.002882 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:13.002833 kubelet[3463]: E0306 03:00:13.002774 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:15.003453 kubelet[3463]: E0306 03:00:15.003399 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:17.002844 kubelet[3463]: E0306 03:00:17.002707 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni 
plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:19.003700 kubelet[3463]: E0306 03:00:19.003567 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2" Mar 6 03:00:19.432608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2076097124.mount: Deactivated successfully. Mar 6 03:00:19.934656 containerd[1884]: time="2026-03-06T03:00:19.934606227Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:19.939196 containerd[1884]: time="2026-03-06T03:00:19.939119588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 6 03:00:19.943214 containerd[1884]: time="2026-03-06T03:00:19.943146903Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:19.947384 containerd[1884]: time="2026-03-06T03:00:19.947329238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:19.947891 containerd[1884]: time="2026-03-06T03:00:19.947578966Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 
18.836892647s" Mar 6 03:00:19.947891 containerd[1884]: time="2026-03-06T03:00:19.947607671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 6 03:00:19.954668 containerd[1884]: time="2026-03-06T03:00:19.954636685Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 6 03:00:19.976576 containerd[1884]: time="2026-03-06T03:00:19.976539135Z" level=info msg="Container 880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:19.994094 containerd[1884]: time="2026-03-06T03:00:19.993980458Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641\"" Mar 6 03:00:19.997386 containerd[1884]: time="2026-03-06T03:00:19.997352409Z" level=info msg="StartContainer for \"880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641\"" Mar 6 03:00:19.998851 containerd[1884]: time="2026-03-06T03:00:19.998789397Z" level=info msg="connecting to shim 880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641" address="unix:///run/containerd/s/a848697741f4917bbf6c307ddf06d7e909ae7097dae48ecce018f6168710a773" protocol=ttrpc version=3 Mar 6 03:00:20.016308 systemd[1]: Started cri-containerd-880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641.scope - libcontainer container 880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641. 
Mar 6 03:00:20.077893 containerd[1884]: time="2026-03-06T03:00:20.077853691Z" level=info msg="StartContainer for \"880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641\" returns successfully"
Mar 6 03:00:20.103835 systemd[1]: cri-containerd-880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641.scope: Deactivated successfully.
Mar 6 03:00:20.106082 containerd[1884]: time="2026-03-06T03:00:20.105755620Z" level=info msg="received container exit event container_id:\"880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641\" id:\"880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641\" pid:4187 exited_at:{seconds:1772766020 nanos:105138505}"
Mar 6 03:00:20.122714 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-880835eb45596c2fb9eefd55ea246fc19e31958ac331ba5045e78d894edad641-rootfs.mount: Deactivated successfully.
Mar 6 03:00:21.003091 kubelet[3463]: E0306 03:00:21.003031 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:22.149941 containerd[1884]: time="2026-03-06T03:00:22.149857810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 6 03:00:23.003026 kubelet[3463]: E0306 03:00:23.002974 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:25.002993 kubelet[3463]: E0306 03:00:25.002939 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:27.002614 kubelet[3463]: E0306 03:00:27.002566 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:29.002928 kubelet[3463]: E0306 03:00:29.002614 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:31.003151 kubelet[3463]: E0306 03:00:31.002707 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:33.003493 kubelet[3463]: E0306 03:00:33.003435 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:35.003204 kubelet[3463]: E0306 03:00:35.002855 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:37.003278 kubelet[3463]: E0306 03:00:37.003063 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:39.003142 kubelet[3463]: E0306 03:00:39.003097 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:41.003460 kubelet[3463]: E0306 03:00:41.003414 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:43.003344 kubelet[3463]: E0306 03:00:43.003190 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:45.003153 kubelet[3463]: E0306 03:00:45.003101 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:47.003358 kubelet[3463]: E0306 03:00:47.003307 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:48.891058 containerd[1884]: time="2026-03-06T03:00:48.890508460Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:48.895007 containerd[1884]: time="2026-03-06T03:00:48.894976421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216"
Mar 6 03:00:48.900056 containerd[1884]: time="2026-03-06T03:00:48.900027088Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:48.904471 containerd[1884]: time="2026-03-06T03:00:48.903974794Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:48.904471 containerd[1884]: time="2026-03-06T03:00:48.904360214Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 26.754449002s"
Mar 6 03:00:48.904471 containerd[1884]: time="2026-03-06T03:00:48.904385886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\""
Mar 6 03:00:48.924482 containerd[1884]: time="2026-03-06T03:00:48.924443159Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 6 03:00:48.956432 containerd[1884]: time="2026-03-06T03:00:48.956392646Z" level=info msg="Container 92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:00:48.978899 containerd[1884]: time="2026-03-06T03:00:48.978753093Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee\""
Mar 6 03:00:48.980220 containerd[1884]: time="2026-03-06T03:00:48.979601863Z" level=info msg="StartContainer for \"92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee\""
Mar 6 03:00:48.981306 containerd[1884]: time="2026-03-06T03:00:48.981280883Z" level=info msg="connecting to shim 92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee" address="unix:///run/containerd/s/a848697741f4917bbf6c307ddf06d7e909ae7097dae48ecce018f6168710a773" protocol=ttrpc version=3
Mar 6 03:00:49.002867 kubelet[3463]: E0306 03:00:49.002826 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:49.005347 systemd[1]: Started cri-containerd-92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee.scope - libcontainer container 92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee.
Mar 6 03:00:49.078752 containerd[1884]: time="2026-03-06T03:00:49.078669134Z" level=info msg="StartContainer for \"92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee\" returns successfully"
Mar 6 03:00:50.342904 containerd[1884]: time="2026-03-06T03:00:50.342860178Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 6 03:00:50.346495 systemd[1]: cri-containerd-92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee.scope: Deactivated successfully.
Mar 6 03:00:50.347015 systemd[1]: cri-containerd-92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee.scope: Consumed 363ms CPU time, 189M memory peak, 171.3M written to disk.
Mar 6 03:00:50.348645 containerd[1884]: time="2026-03-06T03:00:50.348615429Z" level=info msg="received container exit event container_id:\"92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee\" id:\"92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee\" pid:4250 exited_at:{seconds:1772766050 nanos:348321916}"
Mar 6 03:00:50.365098 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-92fcbac998e6fa6da91cd11fc3dc1b8e2504b5f31ba8d7254fed3b9954b827ee-rootfs.mount: Deactivated successfully.
Mar 6 03:00:50.399001 kubelet[3463]: I0306 03:00:50.398969 3463 kubelet_node_status.go:427] "Fast updating node status as it just became ready"
Mar 6 03:00:51.219213 systemd[1]: Created slice kubepods-besteffort-pod34238614_4e43_4e4c_a383_1e98f49409d2.slice - libcontainer container kubepods-besteffort-pod34238614_4e43_4e4c_a383_1e98f49409d2.slice.
Mar 6 03:00:51.227747 systemd[1]: Created slice kubepods-burstable-pod7fee49c5_1cc8_4ecf_adca_b62999dce111.slice - libcontainer container kubepods-burstable-pod7fee49c5_1cc8_4ecf_adca_b62999dce111.slice.
Mar 6 03:00:51.235858 systemd[1]: Created slice kubepods-besteffort-pode4484eb5_8708_4838_b3a7_3f82f9b3c273.slice - libcontainer container kubepods-besteffort-pode4484eb5_8708_4838_b3a7_3f82f9b3c273.slice.
Mar 6 03:00:51.240277 systemd[1]: Created slice kubepods-besteffort-pod9e4a41da_673a_45a8_bf0f_7d9710818af6.slice - libcontainer container kubepods-besteffort-pod9e4a41da_673a_45a8_bf0f_7d9710818af6.slice.
Mar 6 03:00:51.258058 containerd[1884]: time="2026-03-06T03:00:51.257694857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vgvhp,Uid:34238614-4e43-4e4c-a383-1e98f49409d2,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.263940 systemd[1]: Created slice kubepods-besteffort-podf4223e88_de52_4ec0_bd0f_e57546bb2a16.slice - libcontainer container kubepods-besteffort-podf4223e88_de52_4ec0_bd0f_e57546bb2a16.slice.
Mar 6 03:00:51.279978 systemd[1]: Created slice kubepods-burstable-pod2354fc17_8c81_41a6_b3cd_47dc713195e5.slice - libcontainer container kubepods-burstable-pod2354fc17_8c81_41a6_b3cd_47dc713195e5.slice.
Mar 6 03:00:51.286979 systemd[1]: Created slice kubepods-besteffort-podaf2cd53b_20a0_4423_8418_fd70d3b27c95.slice - libcontainer container kubepods-besteffort-podaf2cd53b_20a0_4423_8418_fd70d3b27c95.slice.
Mar 6 03:00:51.299709 systemd[1]: Created slice kubepods-besteffort-pod3ff27126_6010_4931_ac8e_f991d4289c41.slice - libcontainer container kubepods-besteffort-pod3ff27126_6010_4931_ac8e_f991d4289c41.slice.
Mar 6 03:00:51.307043 kubelet[3463]: I0306 03:00:51.306974 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-backend-key-pair\") pod \"whisker-76895bf79b-6blbt\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " pod="calico-system/whisker-76895bf79b-6blbt"
Mar 6 03:00:51.307224 kubelet[3463]: I0306 03:00:51.307087 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-ca-bundle\") pod \"whisker-76895bf79b-6blbt\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " pod="calico-system/whisker-76895bf79b-6blbt"
Mar 6 03:00:51.307224 kubelet[3463]: I0306 03:00:51.307103 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4x2\" (UniqueName: \"kubernetes.io/projected/3ff27126-6010-4931-ac8e-f991d4289c41-kube-api-access-6h4x2\") pod \"whisker-76895bf79b-6blbt\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " pod="calico-system/whisker-76895bf79b-6blbt"
Mar 6 03:00:51.307224 kubelet[3463]: I0306 03:00:51.307115 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67696\" (UniqueName: \"kubernetes.io/projected/9e4a41da-673a-45a8-bf0f-7d9710818af6-kube-api-access-67696\") pod \"calico-kube-controllers-59b45bc7fd-tv5p4\" (UID: \"9e4a41da-673a-45a8-bf0f-7d9710818af6\") " pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4"
Mar 6 03:00:51.307376 kubelet[3463]: I0306 03:00:51.307126 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhkt\" (UniqueName: \"kubernetes.io/projected/7fee49c5-1cc8-4ecf-adca-b62999dce111-kube-api-access-rxhkt\") pod \"coredns-7d764666f9-6tqr6\" (UID: \"7fee49c5-1cc8-4ecf-adca-b62999dce111\") " pod="kube-system/coredns-7d764666f9-6tqr6"
Mar 6 03:00:51.307461 kubelet[3463]: I0306 03:00:51.307445 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8f6z\" (UniqueName: \"kubernetes.io/projected/2354fc17-8c81-41a6-b3cd-47dc713195e5-kube-api-access-s8f6z\") pod \"coredns-7d764666f9-prjf5\" (UID: \"2354fc17-8c81-41a6-b3cd-47dc713195e5\") " pod="kube-system/coredns-7d764666f9-prjf5"
Mar 6 03:00:51.307581 kubelet[3463]: I0306 03:00:51.307545 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e4484eb5-8708-4838-b3a7-3f82f9b3c273-calico-apiserver-certs\") pod \"calico-apiserver-6b94f7659d-7htj6\" (UID: \"e4484eb5-8708-4838-b3a7-3f82f9b3c273\") " pod="calico-system/calico-apiserver-6b94f7659d-7htj6"
Mar 6 03:00:51.307581 kubelet[3463]: I0306 03:00:51.307560 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f4223e88-de52-4ec0-bd0f-e57546bb2a16-goldmane-key-pair\") pod \"goldmane-9f7667bb8-mmcmx\" (UID: \"f4223e88-de52-4ec0-bd0f-e57546bb2a16\") " pod="calico-system/goldmane-9f7667bb8-mmcmx"
Mar 6 03:00:51.307668 kubelet[3463]: I0306 03:00:51.307573 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-nginx-config\") pod \"whisker-76895bf79b-6blbt\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " pod="calico-system/whisker-76895bf79b-6blbt"
Mar 6 03:00:51.307792 kubelet[3463]: I0306 03:00:51.307743 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fee49c5-1cc8-4ecf-adca-b62999dce111-config-volume\") pod \"coredns-7d764666f9-6tqr6\" (UID: \"7fee49c5-1cc8-4ecf-adca-b62999dce111\") " pod="kube-system/coredns-7d764666f9-6tqr6"
Mar 6 03:00:51.307792 kubelet[3463]: I0306 03:00:51.307760 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/af2cd53b-20a0-4423-8418-fd70d3b27c95-calico-apiserver-certs\") pod \"calico-apiserver-6b94f7659d-l66t8\" (UID: \"af2cd53b-20a0-4423-8418-fd70d3b27c95\") " pod="calico-system/calico-apiserver-6b94f7659d-l66t8"
Mar 6 03:00:51.307792 kubelet[3463]: I0306 03:00:51.307771 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4a41da-673a-45a8-bf0f-7d9710818af6-tigera-ca-bundle\") pod \"calico-kube-controllers-59b45bc7fd-tv5p4\" (UID: \"9e4a41da-673a-45a8-bf0f-7d9710818af6\") " pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4"
Mar 6 03:00:51.307937 kubelet[3463]: I0306 03:00:51.307895 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2354fc17-8c81-41a6-b3cd-47dc713195e5-config-volume\") pod \"coredns-7d764666f9-prjf5\" (UID: \"2354fc17-8c81-41a6-b3cd-47dc713195e5\") " pod="kube-system/coredns-7d764666f9-prjf5"
Mar 6 03:00:51.307937 kubelet[3463]: I0306 03:00:51.307913 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wql2r\" (UniqueName: \"kubernetes.io/projected/e4484eb5-8708-4838-b3a7-3f82f9b3c273-kube-api-access-wql2r\") pod \"calico-apiserver-6b94f7659d-7htj6\" (UID: \"e4484eb5-8708-4838-b3a7-3f82f9b3c273\") " pod="calico-system/calico-apiserver-6b94f7659d-7htj6"
Mar 6 03:00:51.307937 kubelet[3463]: I0306 03:00:51.307923 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4223e88-de52-4ec0-bd0f-e57546bb2a16-config\") pod \"goldmane-9f7667bb8-mmcmx\" (UID: \"f4223e88-de52-4ec0-bd0f-e57546bb2a16\") " pod="calico-system/goldmane-9f7667bb8-mmcmx"
Mar 6 03:00:51.308051 kubelet[3463]: I0306 03:00:51.308041 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46s8f\" (UniqueName: \"kubernetes.io/projected/af2cd53b-20a0-4423-8418-fd70d3b27c95-kube-api-access-46s8f\") pod \"calico-apiserver-6b94f7659d-l66t8\" (UID: \"af2cd53b-20a0-4423-8418-fd70d3b27c95\") " pod="calico-system/calico-apiserver-6b94f7659d-l66t8"
Mar 6 03:00:51.308423 kubelet[3463]: I0306 03:00:51.308370 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4223e88-de52-4ec0-bd0f-e57546bb2a16-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-mmcmx\" (UID: \"f4223e88-de52-4ec0-bd0f-e57546bb2a16\") " pod="calico-system/goldmane-9f7667bb8-mmcmx"
Mar 6 03:00:51.308423 kubelet[3463]: I0306 03:00:51.308386 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hslc4\" (UniqueName: \"kubernetes.io/projected/f4223e88-de52-4ec0-bd0f-e57546bb2a16-kube-api-access-hslc4\") pod \"goldmane-9f7667bb8-mmcmx\" (UID: \"f4223e88-de52-4ec0-bd0f-e57546bb2a16\") " pod="calico-system/goldmane-9f7667bb8-mmcmx"
Mar 6 03:00:51.335527 containerd[1884]: time="2026-03-06T03:00:51.335482842Z" level=error msg="Failed to destroy network for sandbox \"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.337092 systemd[1]: run-netns-cni\x2d1be6f9eb\x2dff12\x2d1ed2\x2d6523\x2db1783b27168d.mount: Deactivated successfully.
Mar 6 03:00:51.342465 containerd[1884]: time="2026-03-06T03:00:51.342415226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vgvhp,Uid:34238614-4e43-4e4c-a383-1e98f49409d2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.342804 kubelet[3463]: E0306 03:00:51.342766 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.342863 kubelet[3463]: E0306 03:00:51.342828 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vgvhp"
Mar 6 03:00:51.342863 kubelet[3463]: E0306 03:00:51.342846 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vgvhp"
Mar 6 03:00:51.342938 kubelet[3463]: E0306 03:00:51.342901 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vgvhp_calico-system(34238614-4e43-4e4c-a383-1e98f49409d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vgvhp_calico-system(34238614-4e43-4e4c-a383-1e98f49409d2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae29de1db7f9a0f6699570c2f3d33a137f74344448d6591852e1f56d9a866723\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vgvhp" podUID="34238614-4e43-4e4c-a383-1e98f49409d2"
Mar 6 03:00:51.563493 containerd[1884]: time="2026-03-06T03:00:51.563364580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b45bc7fd-tv5p4,Uid:9e4a41da-673a-45a8-bf0f-7d9710818af6,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.568686 containerd[1884]: time="2026-03-06T03:00:51.568439554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-6tqr6,Uid:7fee49c5-1cc8-4ecf-adca-b62999dce111,Namespace:kube-system,Attempt:0,}"
Mar 6 03:00:51.573247 containerd[1884]: time="2026-03-06T03:00:51.573212743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-7htj6,Uid:e4484eb5-8708-4838-b3a7-3f82f9b3c273,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.580037 containerd[1884]: time="2026-03-06T03:00:51.580007946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-mmcmx,Uid:f4223e88-de52-4ec0-bd0f-e57546bb2a16,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.591439 containerd[1884]: time="2026-03-06T03:00:51.591401500Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-prjf5,Uid:2354fc17-8c81-41a6-b3cd-47dc713195e5,Namespace:kube-system,Attempt:0,}"
Mar 6 03:00:51.603681 containerd[1884]: time="2026-03-06T03:00:51.603363840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-l66t8,Uid:af2cd53b-20a0-4423-8418-fd70d3b27c95,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.612690 containerd[1884]: time="2026-03-06T03:00:51.612658144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76895bf79b-6blbt,Uid:3ff27126-6010-4931-ac8e-f991d4289c41,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:51.670959 containerd[1884]: time="2026-03-06T03:00:51.670902843Z" level=error msg="Failed to destroy network for sandbox \"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.680632 containerd[1884]: time="2026-03-06T03:00:51.680569311Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-6tqr6,Uid:7fee49c5-1cc8-4ecf-adca-b62999dce111,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.681118 kubelet[3463]: E0306 03:00:51.681065 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.681118 kubelet[3463]: E0306 03:00:51.681122 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-6tqr6"
Mar 6 03:00:51.681803 kubelet[3463]: E0306 03:00:51.681137 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-6tqr6"
Mar 6 03:00:51.682893 kubelet[3463]: E0306 03:00:51.682829 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-6tqr6_kube-system(7fee49c5-1cc8-4ecf-adca-b62999dce111)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-6tqr6_kube-system(7fee49c5-1cc8-4ecf-adca-b62999dce111)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05901fcca95619990706b3167bef75d2b28955592352b5914a1288a3cec42d92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-6tqr6" podUID="7fee49c5-1cc8-4ecf-adca-b62999dce111"
Mar 6 03:00:51.693343 containerd[1884]: time="2026-03-06T03:00:51.693290554Z" level=error msg="Failed to destroy network for sandbox \"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.700589 containerd[1884]: time="2026-03-06T03:00:51.700536892Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b45bc7fd-tv5p4,Uid:9e4a41da-673a-45a8-bf0f-7d9710818af6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.700854 kubelet[3463]: E0306 03:00:51.700765 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.700854 kubelet[3463]: E0306 03:00:51.700814 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4"
Mar 6 03:00:51.700854 kubelet[3463]: E0306 03:00:51.700828 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4"
Mar 6 03:00:51.700990 kubelet[3463]: E0306 03:00:51.700881 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59b45bc7fd-tv5p4_calico-system(9e4a41da-673a-45a8-bf0f-7d9710818af6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59b45bc7fd-tv5p4_calico-system(9e4a41da-673a-45a8-bf0f-7d9710818af6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0365f8cfa4045bf4fb114b6caf1060aa9155e1f5a370b877e8f3814dc8e58e81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4" podUID="9e4a41da-673a-45a8-bf0f-7d9710818af6"
Mar 6 03:00:51.726226 containerd[1884]: time="2026-03-06T03:00:51.726120279Z" level=error msg="Failed to destroy network for sandbox \"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.732620 containerd[1884]: time="2026-03-06T03:00:51.732571727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-7htj6,Uid:e4484eb5-8708-4838-b3a7-3f82f9b3c273,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.733230 kubelet[3463]: E0306 03:00:51.732791 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:51.733230 kubelet[3463]: E0306 03:00:51.732882 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6b94f7659d-7htj6"
Mar 6 03:00:51.733230 kubelet[3463]: E0306 03:00:51.732900 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6b94f7659d-7htj6"
Mar 6 03:00:51.733349 kubelet[3463]: E0306 03:00:51.733058 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b94f7659d-7htj6_calico-system(e4484eb5-8708-4838-b3a7-3f82f9b3c273)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b94f7659d-7htj6_calico-system(e4484eb5-8708-4838-b3a7-3f82f9b3c273)\\\": rpc error: code =
Unknown desc = failed to setup network for sandbox \\\"bca21c662a717d22826adf4be5ed6d865ce35f400e2d6494db3d17f0e5b262f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6b94f7659d-7htj6" podUID="e4484eb5-8708-4838-b3a7-3f82f9b3c273" Mar 6 03:00:51.743942 containerd[1884]: time="2026-03-06T03:00:51.743902255Z" level=error msg="Failed to destroy network for sandbox \"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.750097 containerd[1884]: time="2026-03-06T03:00:51.749990581Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-mmcmx,Uid:f4223e88-de52-4ec0-bd0f-e57546bb2a16,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.750664 kubelet[3463]: E0306 03:00:51.750339 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.750664 kubelet[3463]: E0306 03:00:51.750391 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-mmcmx" Mar 6 03:00:51.750664 kubelet[3463]: E0306 03:00:51.750407 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-mmcmx" Mar 6 03:00:51.750772 kubelet[3463]: E0306 03:00:51.750456 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-mmcmx_calico-system(f4223e88-de52-4ec0-bd0f-e57546bb2a16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-mmcmx_calico-system(f4223e88-de52-4ec0-bd0f-e57546bb2a16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9993f851f64ed7d1ff283196f6a48f32c3ee5481705538f59106aa73f3e87f50\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-mmcmx" podUID="f4223e88-de52-4ec0-bd0f-e57546bb2a16" Mar 6 03:00:51.761821 containerd[1884]: time="2026-03-06T03:00:51.761782443Z" level=error msg="Failed to destroy network for sandbox \"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Mar 6 03:00:51.761952 containerd[1884]: time="2026-03-06T03:00:51.761907191Z" level=error msg="Failed to destroy network for sandbox \"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.762679 containerd[1884]: time="2026-03-06T03:00:51.762649142Z" level=error msg="Failed to destroy network for sandbox \"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.765594 containerd[1884]: time="2026-03-06T03:00:51.765564433Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76895bf79b-6blbt,Uid:3ff27126-6010-4931-ac8e-f991d4289c41,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.765903 kubelet[3463]: E0306 03:00:51.765859 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.765962 kubelet[3463]: E0306 03:00:51.765912 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76895bf79b-6blbt" Mar 6 03:00:51.765962 kubelet[3463]: E0306 03:00:51.765927 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76895bf79b-6blbt" Mar 6 03:00:51.766009 kubelet[3463]: E0306 03:00:51.765972 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76895bf79b-6blbt_calico-system(3ff27126-6010-4931-ac8e-f991d4289c41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76895bf79b-6blbt_calico-system(3ff27126-6010-4931-ac8e-f991d4289c41)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6352596eb573aec2d674a43b2061cd9e1b807fc5fbfe8a46f61bcd34d90285e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76895bf79b-6blbt" podUID="3ff27126-6010-4931-ac8e-f991d4289c41" Mar 6 03:00:51.769497 containerd[1884]: time="2026-03-06T03:00:51.769458834Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-l66t8,Uid:af2cd53b-20a0-4423-8418-fd70d3b27c95,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.769822 kubelet[3463]: E0306 03:00:51.769706 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.769822 kubelet[3463]: E0306 03:00:51.769740 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6b94f7659d-l66t8" Mar 6 03:00:51.769822 kubelet[3463]: E0306 03:00:51.769752 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6b94f7659d-l66t8" Mar 6 03:00:51.769919 kubelet[3463]: E0306 03:00:51.769787 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b94f7659d-l66t8_calico-system(af2cd53b-20a0-4423-8418-fd70d3b27c95)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-6b94f7659d-l66t8_calico-system(af2cd53b-20a0-4423-8418-fd70d3b27c95)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d8bc32b4f954761e957f285c97137bd172968356b89f6b518ede1b64d1ed4f7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6b94f7659d-l66t8" podUID="af2cd53b-20a0-4423-8418-fd70d3b27c95" Mar 6 03:00:51.774078 containerd[1884]: time="2026-03-06T03:00:51.774029456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-prjf5,Uid:2354fc17-8c81-41a6-b3cd-47dc713195e5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.775742 kubelet[3463]: E0306 03:00:51.775157 3463 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 6 03:00:51.775742 kubelet[3463]: E0306 03:00:51.775231 3463 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7d764666f9-prjf5" Mar 6 03:00:51.775742 kubelet[3463]: E0306 03:00:51.775245 3463 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-prjf5" Mar 6 03:00:51.775850 kubelet[3463]: E0306 03:00:51.775316 3463 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-prjf5_kube-system(2354fc17-8c81-41a6-b3cd-47dc713195e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-prjf5_kube-system(2354fc17-8c81-41a6-b3cd-47dc713195e5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"837ba373d732efa8dff92a07c28225e5c8e8b22a6ddc9b78c6eca9cb2489941a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-prjf5" podUID="2354fc17-8c81-41a6-b3cd-47dc713195e5" Mar 6 03:00:52.220687 containerd[1884]: time="2026-03-06T03:00:52.220633016Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 6 03:00:52.244932 containerd[1884]: time="2026-03-06T03:00:52.244885905Z" level=info msg="Container 2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:52.261715 containerd[1884]: time="2026-03-06T03:00:52.261665267Z" level=info msg="CreateContainer within sandbox \"98ef9fbb324710f1dd137da161b8c36895afce89b82111de53ec96e709479046\" for 
&ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c\"" Mar 6 03:00:52.262602 containerd[1884]: time="2026-03-06T03:00:52.262408074Z" level=info msg="StartContainer for \"2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c\"" Mar 6 03:00:52.264651 containerd[1884]: time="2026-03-06T03:00:52.264622119Z" level=info msg="connecting to shim 2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c" address="unix:///run/containerd/s/a848697741f4917bbf6c307ddf06d7e909ae7097dae48ecce018f6168710a773" protocol=ttrpc version=3 Mar 6 03:00:52.282348 systemd[1]: Started cri-containerd-2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c.scope - libcontainer container 2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c. Mar 6 03:00:52.349562 containerd[1884]: time="2026-03-06T03:00:52.349510821Z" level=info msg="StartContainer for \"2740dac05edcf86e638a3a9ab184f49bd5f5d7ea623a1b347a7f74e3d3d8480c\" returns successfully" Mar 6 03:00:52.616259 kubelet[3463]: I0306 03:00:52.615938 3463 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-nginx-config\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-nginx-config\") pod \"3ff27126-6010-4931-ac8e-f991d4289c41\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " Mar 6 03:00:52.616259 kubelet[3463]: I0306 03:00:52.616118 3463 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/3ff27126-6010-4931-ac8e-f991d4289c41-kube-api-access-6h4x2\" (UniqueName: \"kubernetes.io/projected/3ff27126-6010-4931-ac8e-f991d4289c41-kube-api-access-6h4x2\") pod \"3ff27126-6010-4931-ac8e-f991d4289c41\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " Mar 6 03:00:52.616259 kubelet[3463]: I0306 03:00:52.616140 3463 reconciler_common.go:163] 
"operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-backend-key-pair\") pod \"3ff27126-6010-4931-ac8e-f991d4289c41\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " Mar 6 03:00:52.616259 kubelet[3463]: I0306 03:00:52.616157 3463 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-ca-bundle\") pod \"3ff27126-6010-4931-ac8e-f991d4289c41\" (UID: \"3ff27126-6010-4931-ac8e-f991d4289c41\") " Mar 6 03:00:52.616430 kubelet[3463]: I0306 03:00:52.616396 3463 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-nginx-config" pod "3ff27126-6010-4931-ac8e-f991d4289c41" (UID: "3ff27126-6010-4931-ac8e-f991d4289c41"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 03:00:52.616790 kubelet[3463]: I0306 03:00:52.616764 3463 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-nginx-config\") on node \"ci-4459.2.3-n-38e0d2a52a\" DevicePath \"\"" Mar 6 03:00:52.617003 kubelet[3463]: I0306 03:00:52.616980 3463 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-ca-bundle" pod "3ff27126-6010-4931-ac8e-f991d4289c41" (UID: "3ff27126-6010-4931-ac8e-f991d4289c41"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 03:00:52.619952 systemd[1]: var-lib-kubelet-pods-3ff27126\x2d6010\x2d4931\x2dac8e\x2df991d4289c41-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6h4x2.mount: Deactivated successfully. Mar 6 03:00:52.620621 kubelet[3463]: I0306 03:00:52.620591 3463 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff27126-6010-4931-ac8e-f991d4289c41-kube-api-access-6h4x2" pod "3ff27126-6010-4931-ac8e-f991d4289c41" (UID: "3ff27126-6010-4931-ac8e-f991d4289c41"). InnerVolumeSpecName "kube-api-access-6h4x2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 6 03:00:52.623151 kubelet[3463]: I0306 03:00:52.623110 3463 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-backend-key-pair" pod "3ff27126-6010-4931-ac8e-f991d4289c41" (UID: "3ff27126-6010-4931-ac8e-f991d4289c41"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 6 03:00:52.623376 systemd[1]: var-lib-kubelet-pods-3ff27126\x2d6010\x2d4931\x2dac8e\x2df991d4289c41-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 6 03:00:52.717637 kubelet[3463]: I0306 03:00:52.717589 3463 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6h4x2\" (UniqueName: \"kubernetes.io/projected/3ff27126-6010-4931-ac8e-f991d4289c41-kube-api-access-6h4x2\") on node \"ci-4459.2.3-n-38e0d2a52a\" DevicePath \"\"" Mar 6 03:00:52.717637 kubelet[3463]: I0306 03:00:52.717627 3463 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-backend-key-pair\") on node \"ci-4459.2.3-n-38e0d2a52a\" DevicePath \"\"" Mar 6 03:00:52.717637 kubelet[3463]: I0306 03:00:52.717634 3463 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ff27126-6010-4931-ac8e-f991d4289c41-whisker-ca-bundle\") on node \"ci-4459.2.3-n-38e0d2a52a\" DevicePath \"\"" Mar 6 03:00:53.220601 systemd[1]: Removed slice kubepods-besteffort-pod3ff27126_6010_4931_ac8e_f991d4289c41.slice - libcontainer container kubepods-besteffort-pod3ff27126_6010_4931_ac8e_f991d4289c41.slice. Mar 6 03:00:53.247061 kubelet[3463]: I0306 03:00:53.244825 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-bsrgz" podStartSLOduration=2.949728418 podStartE2EDuration="1m7.24481323s" podCreationTimestamp="2026-03-06 02:59:46 +0000 UTC" firstStartedPulling="2026-03-06 02:59:47.911627539 +0000 UTC m=+20.028600917" lastFinishedPulling="2026-03-06 03:00:52.206712351 +0000 UTC m=+84.323685729" observedRunningTime="2026-03-06 03:00:53.231680646 +0000 UTC m=+85.348654024" watchObservedRunningTime="2026-03-06 03:00:53.24481323 +0000 UTC m=+85.361786608" Mar 6 03:00:53.317040 systemd[1]: Created slice kubepods-besteffort-pod53d58938_075e_4940_bbd8_a68c2e015c05.slice - libcontainer container kubepods-besteffort-pod53d58938_075e_4940_bbd8_a68c2e015c05.slice. 
Mar 6 03:00:53.423154 kubelet[3463]: I0306 03:00:53.422982 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53d58938-075e-4940-bbd8-a68c2e015c05-whisker-ca-bundle\") pod \"whisker-5c89594cd5-cjvrd\" (UID: \"53d58938-075e-4940-bbd8-a68c2e015c05\") " pod="calico-system/whisker-5c89594cd5-cjvrd" Mar 6 03:00:53.423154 kubelet[3463]: I0306 03:00:53.423030 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/53d58938-075e-4940-bbd8-a68c2e015c05-whisker-backend-key-pair\") pod \"whisker-5c89594cd5-cjvrd\" (UID: \"53d58938-075e-4940-bbd8-a68c2e015c05\") " pod="calico-system/whisker-5c89594cd5-cjvrd" Mar 6 03:00:53.423154 kubelet[3463]: I0306 03:00:53.423056 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/53d58938-075e-4940-bbd8-a68c2e015c05-nginx-config\") pod \"whisker-5c89594cd5-cjvrd\" (UID: \"53d58938-075e-4940-bbd8-a68c2e015c05\") " pod="calico-system/whisker-5c89594cd5-cjvrd" Mar 6 03:00:53.423154 kubelet[3463]: I0306 03:00:53.423070 3463 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pcbb\" (UniqueName: \"kubernetes.io/projected/53d58938-075e-4940-bbd8-a68c2e015c05-kube-api-access-2pcbb\") pod \"whisker-5c89594cd5-cjvrd\" (UID: \"53d58938-075e-4940-bbd8-a68c2e015c05\") " pod="calico-system/whisker-5c89594cd5-cjvrd" Mar 6 03:00:55.106978 kubelet[3463]: I0306 03:00:55.106459 3463 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="3ff27126-6010-4931-ac8e-f991d4289c41" path="/var/lib/kubelet/pods/3ff27126-6010-4931-ac8e-f991d4289c41/volumes" Mar 6 03:00:55.106978 kubelet[3463]: E0306 03:00:55.106929 3463 kubelet.go:2691] "Housekeeping took longer than 
expected" err="housekeeping took too long" expected="1s" actual="1.104s" Mar 6 03:00:55.613356 containerd[1884]: time="2026-03-06T03:00:55.613022270Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c89594cd5-cjvrd,Uid:53d58938-075e-4940-bbd8-a68c2e015c05,Namespace:calico-system,Attempt:0,}" Mar 6 03:00:56.257324 systemd-networkd[1498]: vxlan.calico: Link UP Mar 6 03:00:56.257332 systemd-networkd[1498]: vxlan.calico: Gained carrier Mar 6 03:00:56.358601 systemd-networkd[1498]: cali8a69d246667: Link UP Mar 6 03:00:56.358766 systemd-networkd[1498]: cali8a69d246667: Gained carrier Mar 6 03:00:56.379199 containerd[1884]: 2026-03-06 03:00:56.209 [INFO][4764] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0 whisker-5c89594cd5- calico-system 53d58938-075e-4940-bbd8-a68c2e015c05 993 0 2026-03-06 03:00:53 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5c89594cd5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a whisker-5c89594cd5-cjvrd eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali8a69d246667 [] [] }} ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-" Mar 6 03:00:56.379199 containerd[1884]: 2026-03-06 03:00:56.209 [INFO][4764] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379199 containerd[1884]: 2026-03-06 03:00:56.228 [INFO][4775] ipam/ipam_plugin.go 235: Calico CNI 
IPAM request count IPv4=1 IPv6=0 ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" HandleID="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.233 [INFO][4775] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" HandleID="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbdd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"whisker-5c89594cd5-cjvrd", "timestamp":"2026-03-06 03:00:56.228286463 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003018c0)} Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.233 [INFO][4775] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.233 [INFO][4775] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.233 [INFO][4775] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.235 [INFO][4775] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.239 [INFO][4775] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.242 [INFO][4775] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.243 [INFO][4775] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379379 containerd[1884]: 2026-03-06 03:00:56.245 [INFO][4775] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.245 [INFO][4775] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.246 [INFO][4775] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.252 [INFO][4775] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.260 [INFO][4775] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.81.1/26] block=192.168.81.0/26 handle="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.260 [INFO][4775] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.1/26] handle="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.261 [INFO][4775] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 03:00:56.379563 containerd[1884]: 2026-03-06 03:00:56.261 [INFO][4775] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.1/26] IPv6=[] ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" HandleID="k8s-pod-network.ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379661 containerd[1884]: 2026-03-06 03:00:56.267 [INFO][4764] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0", GenerateName:"whisker-5c89594cd5-", Namespace:"calico-system", SelfLink:"", UID:"53d58938-075e-4940-bbd8-a68c2e015c05", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c89594cd5", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"whisker-5c89594cd5-cjvrd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.81.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8a69d246667", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:00:56.379661 containerd[1884]: 2026-03-06 03:00:56.267 [INFO][4764] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.1/32] ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379739 containerd[1884]: 2026-03-06 03:00:56.267 [INFO][4764] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a69d246667 ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379739 containerd[1884]: 2026-03-06 03:00:56.358 [INFO][4764] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.379769 containerd[1884]: 2026-03-06 03:00:56.359 [INFO][4764] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0", GenerateName:"whisker-5c89594cd5-", Namespace:"calico-system", SelfLink:"", UID:"53d58938-075e-4940-bbd8-a68c2e015c05", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c89594cd5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f", Pod:"whisker-5c89594cd5-cjvrd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.81.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8a69d246667", MAC:"2e:ed:56:30:cb:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:00:56.379803 containerd[1884]: 2026-03-06 03:00:56.372 [INFO][4764] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" Namespace="calico-system" Pod="whisker-5c89594cd5-cjvrd" 
WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-whisker--5c89594cd5--cjvrd-eth0" Mar 6 03:00:56.713961 containerd[1884]: time="2026-03-06T03:00:56.713915059Z" level=info msg="connecting to shim ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f" address="unix:///run/containerd/s/e5e68b59e704a627fe8bffe52cec8b9cc3bad8752f2c4624adb97ed97978dcbd" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:56.732315 systemd[1]: Started cri-containerd-ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f.scope - libcontainer container ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f. Mar 6 03:00:56.768603 containerd[1884]: time="2026-03-06T03:00:56.768437394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c89594cd5-cjvrd,Uid:53d58938-075e-4940-bbd8-a68c2e015c05,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f\"" Mar 6 03:00:56.775550 containerd[1884]: time="2026-03-06T03:00:56.775492901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 6 03:00:58.174336 systemd-networkd[1498]: vxlan.calico: Gained IPv6LL Mar 6 03:00:58.238403 systemd-networkd[1498]: cali8a69d246667: Gained IPv6LL Mar 6 03:01:04.010379 containerd[1884]: time="2026-03-06T03:01:04.010340293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-mmcmx,Uid:f4223e88-de52-4ec0-bd0f-e57546bb2a16,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:04.067413 containerd[1884]: time="2026-03-06T03:01:04.067370306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-l66t8,Uid:af2cd53b-20a0-4423-8418-fd70d3b27c95,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:04.259186 systemd-networkd[1498]: calic45f81e1a79: Link UP Mar 6 03:01:04.261315 systemd-networkd[1498]: calic45f81e1a79: Gained carrier Mar 6 03:01:04.283690 containerd[1884]: 2026-03-06 03:01:04.188 [INFO][4933] cni-plugin/plugin.go 342: Calico 
CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0 goldmane-9f7667bb8- calico-system f4223e88-de52-4ec0-bd0f-e57546bb2a16 937 0 2026-03-06 02:59:46 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a goldmane-9f7667bb8-mmcmx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic45f81e1a79 [] [] }} ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-" Mar 6 03:01:04.283690 containerd[1884]: 2026-03-06 03:01:04.188 [INFO][4933] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.283690 containerd[1884]: 2026-03-06 03:01:04.205 [INFO][4944] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" HandleID="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.210 [INFO][4944] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" HandleID="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x40002fb270), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"goldmane-9f7667bb8-mmcmx", "timestamp":"2026-03-06 03:01:04.205943027 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000305080)} Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.211 [INFO][4944] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.211 [INFO][4944] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.211 [INFO][4944] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.214 [INFO][4944] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.220 [INFO][4944] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.225 [INFO][4944] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.229 [INFO][4944] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284273 containerd[1884]: 2026-03-06 03:01:04.231 [INFO][4944] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.231 [INFO][4944] ipam/ipam.go 1245: Attempting to assign 
1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.234 [INFO][4944] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.239 [INFO][4944] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.249 [INFO][4944] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.2/26] block=192.168.81.0/26 handle="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.249 [INFO][4944] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.2/26] handle="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.249 [INFO][4944] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:04.284430 containerd[1884]: 2026-03-06 03:01:04.249 [INFO][4944] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.2/26] IPv6=[] ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" HandleID="k8s-pod-network.17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.284527 containerd[1884]: 2026-03-06 03:01:04.252 [INFO][4933] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"f4223e88-de52-4ec0-bd0f-e57546bb2a16", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"goldmane-9f7667bb8-mmcmx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.81.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"calic45f81e1a79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:04.284527 containerd[1884]: 2026-03-06 03:01:04.253 [INFO][4933] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.2/32] ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.284586 containerd[1884]: 2026-03-06 03:01:04.253 [INFO][4933] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic45f81e1a79 ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.284586 containerd[1884]: 2026-03-06 03:01:04.263 [INFO][4933] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.284617 containerd[1884]: 2026-03-06 03:01:04.263 [INFO][4933] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"f4223e88-de52-4ec0-bd0f-e57546bb2a16", 
ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c", Pod:"goldmane-9f7667bb8-mmcmx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.81.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic45f81e1a79", MAC:"12:a5:63:5d:97:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:04.284649 containerd[1884]: 2026-03-06 03:01:04.279 [INFO][4933] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" Namespace="calico-system" Pod="goldmane-9f7667bb8-mmcmx" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-goldmane--9f7667bb8--mmcmx-eth0" Mar 6 03:01:04.347474 systemd-networkd[1498]: calibaac65f1f23: Link UP Mar 6 03:01:04.348135 systemd-networkd[1498]: calibaac65f1f23: Gained carrier Mar 6 03:01:04.365536 containerd[1884]: 2026-03-06 03:01:04.254 [INFO][4953] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0 calico-apiserver-6b94f7659d- calico-system 
af2cd53b-20a0-4423-8418-fd70d3b27c95 938 0 2026-03-06 02:59:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b94f7659d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a calico-apiserver-6b94f7659d-l66t8 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calibaac65f1f23 [] [] }} ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-" Mar 6 03:01:04.365536 containerd[1884]: 2026-03-06 03:01:04.254 [INFO][4953] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.365536 containerd[1884]: 2026-03-06 03:01:04.289 [INFO][4965] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" HandleID="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.296 [INFO][4965] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" HandleID="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe80), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"calico-apiserver-6b94f7659d-l66t8", "timestamp":"2026-03-06 03:01:04.28901731 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001862c0)} Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.296 [INFO][4965] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.296 [INFO][4965] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.296 [INFO][4965] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.314 [INFO][4965] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.320 [INFO][4965] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.326 [INFO][4965] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.327 [INFO][4965] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.365692 containerd[1884]: 2026-03-06 03:01:04.329 [INFO][4965] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.329 [INFO][4965] ipam/ipam.go 1245: Attempting to assign 1 addresses from block 
block=192.168.81.0/26 handle="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.330 [INFO][4965] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.336 [INFO][4965] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.343 [INFO][4965] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.3/26] block=192.168.81.0/26 handle="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.343 [INFO][4965] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.3/26] handle="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.343 [INFO][4965] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:04.366262 containerd[1884]: 2026-03-06 03:01:04.343 [INFO][4965] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.3/26] IPv6=[] ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" HandleID="k8s-pod-network.01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.366794 containerd[1884]: 2026-03-06 03:01:04.345 [INFO][4953] cni-plugin/k8s.go 418: Populated endpoint ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0", GenerateName:"calico-apiserver-6b94f7659d-", Namespace:"calico-system", SelfLink:"", UID:"af2cd53b-20a0-4423-8418-fd70d3b27c95", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b94f7659d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"calico-apiserver-6b94f7659d-l66t8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.81.3/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibaac65f1f23", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:04.366850 containerd[1884]: 2026-03-06 03:01:04.345 [INFO][4953] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.3/32] ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.366850 containerd[1884]: 2026-03-06 03:01:04.345 [INFO][4953] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibaac65f1f23 ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.366850 containerd[1884]: 2026-03-06 03:01:04.348 [INFO][4953] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.366896 containerd[1884]: 2026-03-06 03:01:04.348 [INFO][4953] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0", GenerateName:"calico-apiserver-6b94f7659d-", Namespace:"calico-system", SelfLink:"", UID:"af2cd53b-20a0-4423-8418-fd70d3b27c95", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b94f7659d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b", Pod:"calico-apiserver-6b94f7659d-l66t8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.81.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibaac65f1f23", MAC:"4a:13:f2:10:4a:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:04.366931 containerd[1884]: 2026-03-06 03:01:04.362 [INFO][4953] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-l66t8" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--l66t8-eth0" Mar 6 03:01:04.773577 containerd[1884]: time="2026-03-06T03:01:04.773525713Z" level=info msg="connecting to shim 
17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c" address="unix:///run/containerd/s/506f3d9197b87a4d690be737f0d71246cf43ab5328833a11f326edd9f8a6a6a5" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:04.791320 systemd[1]: Started cri-containerd-17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c.scope - libcontainer container 17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c. Mar 6 03:01:04.909939 containerd[1884]: time="2026-03-06T03:01:04.909885478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-mmcmx,Uid:f4223e88-de52-4ec0-bd0f-e57546bb2a16,Namespace:calico-system,Attempt:0,} returns sandbox id \"17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c\"" Mar 6 03:01:04.977897 containerd[1884]: time="2026-03-06T03:01:04.977847325Z" level=info msg="connecting to shim 01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b" address="unix:///run/containerd/s/e4db46707fc67e39feec1924f58615d542a7fde72c7ae7bb177a926af1187610" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:04.995302 systemd[1]: Started cri-containerd-01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b.scope - libcontainer container 01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b. 
Mar 6 03:01:05.025103 containerd[1884]: time="2026-03-06T03:01:05.024947263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-l66t8,Uid:af2cd53b-20a0-4423-8418-fd70d3b27c95,Namespace:calico-system,Attempt:0,} returns sandbox id \"01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b\"" Mar 6 03:01:05.406384 systemd-networkd[1498]: calibaac65f1f23: Gained IPv6LL Mar 6 03:01:05.790315 systemd-networkd[1498]: calic45f81e1a79: Gained IPv6LL Mar 6 03:01:06.012385 containerd[1884]: time="2026-03-06T03:01:06.012341606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vgvhp,Uid:34238614-4e43-4e4c-a383-1e98f49409d2,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:06.016378 containerd[1884]: time="2026-03-06T03:01:06.016338370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-prjf5,Uid:2354fc17-8c81-41a6-b3cd-47dc713195e5,Namespace:kube-system,Attempt:0,}" Mar 6 03:01:06.062416 containerd[1884]: time="2026-03-06T03:01:06.062308681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b45bc7fd-tv5p4,Uid:9e4a41da-673a-45a8-bf0f-7d9710818af6,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:06.110746 containerd[1884]: time="2026-03-06T03:01:06.110657947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-7htj6,Uid:e4484eb5-8708-4838-b3a7-3f82f9b3c273,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:06.303324 systemd-networkd[1498]: cali01edcc38c2e: Link UP Mar 6 03:01:06.304246 systemd-networkd[1498]: cali01edcc38c2e: Gained carrier Mar 6 03:01:06.332221 containerd[1884]: 2026-03-06 03:01:06.182 [INFO][5121] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0 csi-node-driver- calico-system 34238614-4e43-4e4c-a383-1e98f49409d2 690 0 2026-03-06 02:59:47 +0000 UTC 
map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a csi-node-driver-vgvhp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali01edcc38c2e [] [] }} ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-" Mar 6 03:01:06.332221 containerd[1884]: 2026-03-06 03:01:06.182 [INFO][5121] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332221 containerd[1884]: 2026-03-06 03:01:06.202 [INFO][5134] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" HandleID="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.208 [INFO][5134] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" HandleID="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002e3e80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"csi-node-driver-vgvhp", "timestamp":"2026-03-06 
03:01:06.202039759 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400055cdc0)} Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.208 [INFO][5134] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.208 [INFO][5134] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.208 [INFO][5134] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.210 [INFO][5134] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.214 [INFO][5134] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.223 [INFO][5134] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.225 [INFO][5134] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332453 containerd[1884]: 2026-03-06 03:01:06.265 [INFO][5134] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.265 [INFO][5134] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 
03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.268 [INFO][5134] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070 Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.279 [INFO][5134] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.289 [INFO][5134] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.4/26] block=192.168.81.0/26 handle="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.289 [INFO][5134] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.4/26] handle="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.289 [INFO][5134] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:06.332624 containerd[1884]: 2026-03-06 03:01:06.289 [INFO][5134] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.4/26] IPv6=[] ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" HandleID="k8s-pod-network.8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332719 containerd[1884]: 2026-03-06 03:01:06.293 [INFO][5121] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"34238614-4e43-4e4c-a383-1e98f49409d2", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"csi-node-driver-vgvhp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.81.4/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali01edcc38c2e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.332759 containerd[1884]: 2026-03-06 03:01:06.295 [INFO][5121] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.4/32] ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332759 containerd[1884]: 2026-03-06 03:01:06.295 [INFO][5121] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali01edcc38c2e ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332759 containerd[1884]: 2026-03-06 03:01:06.301 [INFO][5121] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.332804 containerd[1884]: 2026-03-06 03:01:06.302 [INFO][5121] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"34238614-4e43-4e4c-a383-1e98f49409d2", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070", Pod:"csi-node-driver-vgvhp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.81.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali01edcc38c2e", MAC:"16:62:85:97:9c:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.332838 containerd[1884]: 2026-03-06 03:01:06.322 [INFO][5121] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" Namespace="calico-system" Pod="csi-node-driver-vgvhp" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-csi--node--driver--vgvhp-eth0" Mar 6 03:01:06.436590 systemd-networkd[1498]: cali354b8af1014: Link UP Mar 6 03:01:06.438539 systemd-networkd[1498]: cali354b8af1014: Gained carrier Mar 6 03:01:06.463396 containerd[1884]: 2026-03-06 03:01:06.326 [INFO][5141] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0 coredns-7d764666f9- kube-system 2354fc17-8c81-41a6-b3cd-47dc713195e5 940 0 2026-03-06 02:59:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a coredns-7d764666f9-prjf5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali354b8af1014 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-" Mar 6 03:01:06.463396 containerd[1884]: 2026-03-06 03:01:06.326 [INFO][5141] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.463396 containerd[1884]: 2026-03-06 03:01:06.362 [INFO][5175] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" HandleID="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.371 [INFO][5175] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" HandleID="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0x40002fb3e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"coredns-7d764666f9-prjf5", "timestamp":"2026-03-06 03:01:06.36227354 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000286f20)} Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.371 [INFO][5175] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.372 [INFO][5175] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.372 [INFO][5175] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.375 [INFO][5175] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.379 [INFO][5175] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.385 [INFO][5175] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.388 [INFO][5175] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.463865 containerd[1884]: 2026-03-06 03:01:06.393 [INFO][5175] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.393 [INFO][5175] ipam/ipam.go 1245: Attempting to 
assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.396 [INFO][5175] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68 Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.406 [INFO][5175] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.420 [INFO][5175] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.5/26] block=192.168.81.0/26 handle="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.421 [INFO][5175] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.5/26] handle="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.421 [INFO][5175] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:06.464943 containerd[1884]: 2026-03-06 03:01:06.421 [INFO][5175] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.5/26] IPv6=[] ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" HandleID="k8s-pod-network.751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.465054 containerd[1884]: 2026-03-06 03:01:06.429 [INFO][5141] cni-plugin/k8s.go 418: Populated endpoint ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"2354fc17-8c81-41a6-b3cd-47dc713195e5", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"coredns-7d764666f9-prjf5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.81.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali354b8af1014", 
MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.465054 containerd[1884]: 2026-03-06 03:01:06.429 [INFO][5141] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.5/32] ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.465054 containerd[1884]: 2026-03-06 03:01:06.429 [INFO][5141] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali354b8af1014 ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.465054 containerd[1884]: 2026-03-06 03:01:06.438 [INFO][5141] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.465054 containerd[1884]: 2026-03-06 
03:01:06.440 [INFO][5141] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"2354fc17-8c81-41a6-b3cd-47dc713195e5", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68", Pod:"coredns-7d764666f9-prjf5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.81.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali354b8af1014", MAC:"26:5c:0a:0c:48:3a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.465918 containerd[1884]: 2026-03-06 03:01:06.458 [INFO][5141] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" Namespace="kube-system" Pod="coredns-7d764666f9-prjf5" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--prjf5-eth0" Mar 6 03:01:06.560324 systemd-networkd[1498]: calic2dbc06b82d: Link UP Mar 6 03:01:06.561513 systemd-networkd[1498]: calic2dbc06b82d: Gained carrier Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.377 [INFO][5155] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0 calico-kube-controllers-59b45bc7fd- calico-system 9e4a41da-673a-45a8-bf0f-7d9710818af6 936 0 2026-03-06 02:59:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59b45bc7fd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a calico-kube-controllers-59b45bc7fd-tv5p4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic2dbc06b82d [] [] }} ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" 
Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.377 [INFO][5155] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.436 [INFO][5197] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" HandleID="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.457 [INFO][5197] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" HandleID="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400036bdd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"calico-kube-controllers-59b45bc7fd-tv5p4", "timestamp":"2026-03-06 03:01:06.436108431 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40005058c0)} Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.458 [INFO][5197] ipam/ipam_plugin.go 438: About to acquire 
host-wide IPAM lock. Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.458 [INFO][5197] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.459 [INFO][5197] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.475 [INFO][5197] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.524 [INFO][5197] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.528 [INFO][5197] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.530 [INFO][5197] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.532 [INFO][5197] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.532 [INFO][5197] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.533 [INFO][5197] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.543 [INFO][5197] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" 
host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.552 [INFO][5197] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.6/26] block=192.168.81.0/26 handle="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.552 [INFO][5197] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.6/26] handle="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.553 [INFO][5197] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 03:01:06.582710 containerd[1884]: 2026-03-06 03:01:06.554 [INFO][5197] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.6/26] IPv6=[] ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" HandleID="k8s-pod-network.1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.556 [INFO][5155] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0", GenerateName:"calico-kube-controllers-59b45bc7fd-", Namespace:"calico-system", SelfLink:"", UID:"9e4a41da-673a-45a8-bf0f-7d9710818af6", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 47, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b45bc7fd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"calico-kube-controllers-59b45bc7fd-tv5p4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.81.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic2dbc06b82d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.556 [INFO][5155] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.6/32] ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.556 [INFO][5155] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2dbc06b82d ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.562 [INFO][5155] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.563 [INFO][5155] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0", GenerateName:"calico-kube-controllers-59b45bc7fd-", Namespace:"calico-system", SelfLink:"", UID:"9e4a41da-673a-45a8-bf0f-7d9710818af6", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b45bc7fd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca", Pod:"calico-kube-controllers-59b45bc7fd-tv5p4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.81.6/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic2dbc06b82d", MAC:"1a:13:c6:17:8e:e0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.584509 containerd[1884]: 2026-03-06 03:01:06.575 [INFO][5155] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" Namespace="calico-system" Pod="calico-kube-controllers-59b45bc7fd-tv5p4" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--kube--controllers--59b45bc7fd--tv5p4-eth0" Mar 6 03:01:06.656083 systemd-networkd[1498]: cali70b5f259534: Link UP Mar 6 03:01:06.658007 systemd-networkd[1498]: cali70b5f259534: Gained carrier Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.422 [INFO][5186] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0 calico-apiserver-6b94f7659d- calico-system e4484eb5-8708-4838-b3a7-3f82f9b3c273 935 0 2026-03-06 02:59:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b94f7659d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a calico-apiserver-6b94f7659d-7htj6 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali70b5f259534 [] [] }} ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.422 [INFO][5186] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.477 [INFO][5209] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" HandleID="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.521 [INFO][5209] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" HandleID="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400041e570), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"calico-apiserver-6b94f7659d-7htj6", "timestamp":"2026-03-06 03:01:06.477884687 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002fa000)} Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.521 [INFO][5209] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.553 [INFO][5209] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.553 [INFO][5209] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.580 [INFO][5209] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.627 [INFO][5209] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.631 [INFO][5209] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.633 [INFO][5209] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.635 [INFO][5209] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.635 [INFO][5209] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.637 [INFO][5209] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.641 [INFO][5209] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.649 [INFO][5209] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.81.7/26] block=192.168.81.0/26 handle="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.649 [INFO][5209] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.7/26] handle="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.649 [INFO][5209] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 03:01:06.680440 containerd[1884]: 2026-03-06 03:01:06.649 [INFO][5209] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.7/26] IPv6=[] ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" HandleID="k8s-pod-network.e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.652 [INFO][5186] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0", GenerateName:"calico-apiserver-6b94f7659d-", Namespace:"calico-system", SelfLink:"", UID:"e4484eb5-8708-4838-b3a7-3f82f9b3c273", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"6b94f7659d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"calico-apiserver-6b94f7659d-7htj6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.81.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali70b5f259534", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.652 [INFO][5186] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.7/32] ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.652 [INFO][5186] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali70b5f259534 ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.658 [INFO][5186] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" 
WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.661 [INFO][5186] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0", GenerateName:"calico-apiserver-6b94f7659d-", Namespace:"calico-system", SelfLink:"", UID:"e4484eb5-8708-4838-b3a7-3f82f9b3c273", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b94f7659d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde", Pod:"calico-apiserver-6b94f7659d-7htj6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.81.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali70b5f259534", MAC:"da:0e:8f:38:73:1e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:06.680898 containerd[1884]: 2026-03-06 03:01:06.678 [INFO][5186] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" Namespace="calico-system" Pod="calico-apiserver-6b94f7659d-7htj6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-calico--apiserver--6b94f7659d--7htj6-eth0" Mar 6 03:01:06.969706 containerd[1884]: time="2026-03-06T03:01:06.969616876Z" level=info msg="connecting to shim 8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070" address="unix:///run/containerd/s/9eaaad0aeb4174728099d7794286ba7a3aa2fdfeee37e64c552af40e23bf1f43" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:06.991316 systemd[1]: Started cri-containerd-8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070.scope - libcontainer container 8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070. Mar 6 03:01:07.068902 containerd[1884]: time="2026-03-06T03:01:07.068861939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-6tqr6,Uid:7fee49c5-1cc8-4ecf-adca-b62999dce111,Namespace:kube-system,Attempt:0,}" Mar 6 03:01:07.265931 containerd[1884]: time="2026-03-06T03:01:07.265637966Z" level=info msg="connecting to shim 751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68" address="unix:///run/containerd/s/ce145b9c4a620b2b9492caecf183cb2bae992920868b843a5bd9e6162e417188" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:07.292384 systemd[1]: Started cri-containerd-751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68.scope - libcontainer container 751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68. 
Mar 6 03:01:07.367346 containerd[1884]: time="2026-03-06T03:01:07.367302801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vgvhp,Uid:34238614-4e43-4e4c-a383-1e98f49409d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070\"" Mar 6 03:01:07.511635 containerd[1884]: time="2026-03-06T03:01:07.511511429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-prjf5,Uid:2354fc17-8c81-41a6-b3cd-47dc713195e5,Namespace:kube-system,Attempt:0,} returns sandbox id \"751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68\"" Mar 6 03:01:07.560599 containerd[1884]: time="2026-03-06T03:01:07.560435171Z" level=info msg="CreateContainer within sandbox \"751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 03:01:07.773790 containerd[1884]: time="2026-03-06T03:01:07.773315593Z" level=info msg="connecting to shim 1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca" address="unix:///run/containerd/s/bd737955b84f21ed9d40b67c4a4e04e302cf5b1997ecde3aa4924621254dd0f9" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:07.817362 systemd-networkd[1498]: cali015687cd680: Link UP Mar 6 03:01:07.823153 systemd-networkd[1498]: cali015687cd680: Gained carrier Mar 6 03:01:07.852437 systemd[1]: Started cri-containerd-1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca.scope - libcontainer container 1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca. 
Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.694 [INFO][5387] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0 coredns-7d764666f9- kube-system 7fee49c5-1cc8-4ecf-adca-b62999dce111 934 0 2026-03-06 02:59:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.3-n-38e0d2a52a coredns-7d764666f9-6tqr6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali015687cd680 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.695 [INFO][5387] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.715 [INFO][5399] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" HandleID="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.722 [INFO][5399] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" 
HandleID="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f34b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.3-n-38e0d2a52a", "pod":"coredns-7d764666f9-6tqr6", "timestamp":"2026-03-06 03:01:07.715837082 +0000 UTC"}, Hostname:"ci-4459.2.3-n-38e0d2a52a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003a8f20)} Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.722 [INFO][5399] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.722 [INFO][5399] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.722 [INFO][5399] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.3-n-38e0d2a52a' Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.724 [INFO][5399] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.729 [INFO][5399] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.734 [INFO][5399] ipam/ipam.go 526: Trying affinity for 192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.736 [INFO][5399] ipam/ipam.go 160: Attempting to load block cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.739 [INFO][5399] ipam/ipam.go 237: 
Affinity is confirmed and block has been loaded cidr=192.168.81.0/26 host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.739 [INFO][5399] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.81.0/26 handle="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.740 [INFO][5399] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.745 [INFO][5399] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.81.0/26 handle="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.766 [INFO][5399] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.81.8/26] block=192.168.81.0/26 handle="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.766 [INFO][5399] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.81.8/26] handle="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" host="ci-4459.2.3-n-38e0d2a52a" Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.766 [INFO][5399] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:07.854580 containerd[1884]: 2026-03-06 03:01:07.766 [INFO][5399] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.81.8/26] IPv6=[] ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" HandleID="k8s-pod-network.0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Workload="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.855564 containerd[1884]: 2026-03-06 03:01:07.770 [INFO][5387] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7fee49c5-1cc8-4ecf-adca-b62999dce111", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"", Pod:"coredns-7d764666f9-6tqr6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.81.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali015687cd680", 
MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:07.855564 containerd[1884]: 2026-03-06 03:01:07.810 [INFO][5387] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.81.8/32] ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.855564 containerd[1884]: 2026-03-06 03:01:07.810 [INFO][5387] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali015687cd680 ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.855564 containerd[1884]: 2026-03-06 03:01:07.828 [INFO][5387] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.855564 containerd[1884]: 2026-03-06 
03:01:07.829 [INFO][5387] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7fee49c5-1cc8-4ecf-adca-b62999dce111", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 2, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.3-n-38e0d2a52a", ContainerID:"0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e", Pod:"coredns-7d764666f9-6tqr6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.81.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali015687cd680", MAC:"26:61:7f:3c:bc:8d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:07.856003 containerd[1884]: 2026-03-06 03:01:07.850 [INFO][5387] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" Namespace="kube-system" Pod="coredns-7d764666f9-6tqr6" WorkloadEndpoint="ci--4459.2.3--n--38e0d2a52a-k8s-coredns--7d764666f9--6tqr6-eth0" Mar 6 03:01:07.921645 containerd[1884]: time="2026-03-06T03:01:07.921517113Z" level=info msg="connecting to shim e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde" address="unix:///run/containerd/s/02a392ed8207bf7836e77c4e23377e186c19ff4f5c3fd02527d63093da559bb4" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:07.944365 systemd[1]: Started cri-containerd-e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde.scope - libcontainer container e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde. 
Mar 6 03:01:07.958699 containerd[1884]: time="2026-03-06T03:01:07.958651337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b45bc7fd-tv5p4,Uid:9e4a41da-673a-45a8-bf0f-7d9710818af6,Namespace:calico-system,Attempt:0,} returns sandbox id \"1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca\"" Mar 6 03:01:07.966354 systemd-networkd[1498]: cali01edcc38c2e: Gained IPv6LL Mar 6 03:01:08.073084 containerd[1884]: time="2026-03-06T03:01:08.071974382Z" level=info msg="Container b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:08.086085 containerd[1884]: time="2026-03-06T03:01:08.086045474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b94f7659d-7htj6,Uid:e4484eb5-8708-4838-b3a7-3f82f9b3c273,Namespace:calico-system,Attempt:0,} returns sandbox id \"e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde\"" Mar 6 03:01:08.094490 systemd-networkd[1498]: cali354b8af1014: Gained IPv6LL Mar 6 03:01:08.103543 containerd[1884]: time="2026-03-06T03:01:08.103491304Z" level=info msg="CreateContainer within sandbox \"751487ca8700eb214b6714b0d9e90a8dad760e9a33cb2cabcdeedb76cec3ae68\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0\"" Mar 6 03:01:08.104718 containerd[1884]: time="2026-03-06T03:01:08.104624867Z" level=info msg="StartContainer for \"b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0\"" Mar 6 03:01:08.105890 containerd[1884]: time="2026-03-06T03:01:08.105832888Z" level=info msg="connecting to shim b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0" address="unix:///run/containerd/s/ce145b9c4a620b2b9492caecf183cb2bae992920868b843a5bd9e6162e417188" protocol=ttrpc version=3 Mar 6 03:01:08.129148 containerd[1884]: time="2026-03-06T03:01:08.128722399Z" level=info msg="connecting to shim 
0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e" address="unix:///run/containerd/s/583382e1511c78ef00f3cd4a7764a4cfc75f108f00903fd5191dfe3486b26d72" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:01:08.129379 systemd[1]: Started cri-containerd-b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0.scope - libcontainer container b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0.
Mar 6 03:01:08.163447 systemd[1]: Started cri-containerd-0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e.scope - libcontainer container 0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e.
Mar 6 03:01:08.173238 containerd[1884]: time="2026-03-06T03:01:08.173072567Z" level=info msg="StartContainer for \"b3c910e806df4d1ae1177adc253592ec0182ef3cac9320881fbec26e4d530db0\" returns successfully"
Mar 6 03:01:08.215279 containerd[1884]: time="2026-03-06T03:01:08.215142960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-6tqr6,Uid:7fee49c5-1cc8-4ecf-adca-b62999dce111,Namespace:kube-system,Attempt:0,} returns sandbox id \"0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e\""
Mar 6 03:01:08.222296 systemd-networkd[1498]: calic2dbc06b82d: Gained IPv6LL
Mar 6 03:01:08.227888 containerd[1884]: time="2026-03-06T03:01:08.227836066Z" level=info msg="CreateContainer within sandbox \"0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 6 03:01:08.251193 containerd[1884]: time="2026-03-06T03:01:08.251105124Z" level=info msg="Container 8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:08.271405 containerd[1884]: time="2026-03-06T03:01:08.271357945Z" level=info msg="CreateContainer within sandbox \"0dff1f3847d2994965947d8876a66bf5718d62bef6b387e502ebf5296749d57e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7\""
Mar 6 03:01:08.274432 containerd[1884]: time="2026-03-06T03:01:08.274135191Z" level=info msg="StartContainer for \"8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7\""
Mar 6 03:01:08.277555 containerd[1884]: time="2026-03-06T03:01:08.277513048Z" level=info msg="connecting to shim 8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7" address="unix:///run/containerd/s/583382e1511c78ef00f3cd4a7764a4cfc75f108f00903fd5191dfe3486b26d72" protocol=ttrpc version=3
Mar 6 03:01:08.294779 kubelet[3463]: I0306 03:01:08.293524 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-prjf5" podStartSLOduration=94.293512945 podStartE2EDuration="1m34.293512945s" podCreationTimestamp="2026-03-06 02:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:01:08.291366086 +0000 UTC m=+100.408339464" watchObservedRunningTime="2026-03-06 03:01:08.293512945 +0000 UTC m=+100.410486323"
Mar 6 03:01:08.309356 systemd[1]: Started cri-containerd-8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7.scope - libcontainer container 8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7.
Mar 6 03:01:08.356219 containerd[1884]: time="2026-03-06T03:01:08.354911994Z" level=info msg="StartContainer for \"8145194542b02c9b9551b633323f768e0356cb18e64852f6bf1378a3157362a7\" returns successfully"
Mar 6 03:01:08.542391 systemd-networkd[1498]: cali70b5f259534: Gained IPv6LL
Mar 6 03:01:09.312081 kubelet[3463]: I0306 03:01:09.311913 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-6tqr6" podStartSLOduration=95.311901644 podStartE2EDuration="1m35.311901644s" podCreationTimestamp="2026-03-06 02:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:01:09.310153213 +0000 UTC m=+101.427126591" watchObservedRunningTime="2026-03-06 03:01:09.311901644 +0000 UTC m=+101.428875022"
Mar 6 03:01:09.694340 systemd-networkd[1498]: cali015687cd680: Gained IPv6LL
Mar 6 03:01:09.862223 containerd[1884]: time="2026-03-06T03:01:09.861599902Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:09.865479 containerd[1884]: time="2026-03-06T03:01:09.865204494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804"
Mar 6 03:01:09.869892 containerd[1884]: time="2026-03-06T03:01:09.869814005Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:09.875321 containerd[1884]: time="2026-03-06T03:01:09.875283135Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:09.875781 containerd[1884]: time="2026-03-06T03:01:09.875657611Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 13.100136988s"
Mar 6 03:01:09.875781 containerd[1884]: time="2026-03-06T03:01:09.875688180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\""
Mar 6 03:01:09.877752 containerd[1884]: time="2026-03-06T03:01:09.877677569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\""
Mar 6 03:01:09.885276 containerd[1884]: time="2026-03-06T03:01:09.885240532Z" level=info msg="CreateContainer within sandbox \"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 6 03:01:09.911451 containerd[1884]: time="2026-03-06T03:01:09.910828518Z" level=info msg="Container 8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:09.928704 containerd[1884]: time="2026-03-06T03:01:09.928663439Z" level=info msg="CreateContainer within sandbox \"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e\""
Mar 6 03:01:09.929500 containerd[1884]: time="2026-03-06T03:01:09.929471713Z" level=info msg="StartContainer for \"8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e\""
Mar 6 03:01:09.931080 containerd[1884]: time="2026-03-06T03:01:09.931051914Z" level=info msg="connecting to shim 8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e" address="unix:///run/containerd/s/e5e68b59e704a627fe8bffe52cec8b9cc3bad8752f2c4624adb97ed97978dcbd" protocol=ttrpc version=3
Mar 6 03:01:09.951324 systemd[1]: Started cri-containerd-8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e.scope - libcontainer container 8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e.
Mar 6 03:01:09.990403 containerd[1884]: time="2026-03-06T03:01:09.990359970Z" level=info msg="StartContainer for \"8e37e515e3ea02afa789d7f572c0a07425ddb42ac3f63d5e0e631477cabf5a9e\" returns successfully"
Mar 6 03:01:10.369524 systemd[1]: Started sshd@7-10.200.20.33:22-10.200.16.10:48560.service - OpenSSH per-connection server daemon (10.200.16.10:48560).
Mar 6 03:01:10.766017 sshd[5708]: Accepted publickey for core from 10.200.16.10 port 48560 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:10.768418 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:10.772908 systemd-logind[1867]: New session 10 of user core.
Mar 6 03:01:10.779340 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 6 03:01:11.018634 sshd[5711]: Connection closed by 10.200.16.10 port 48560
Mar 6 03:01:11.017857 sshd-session[5708]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:11.021955 systemd[1]: sshd@7-10.200.20.33:22-10.200.16.10:48560.service: Deactivated successfully.
Mar 6 03:01:11.024025 systemd[1]: session-10.scope: Deactivated successfully.
Mar 6 03:01:11.025344 systemd-logind[1867]: Session 10 logged out. Waiting for processes to exit.
Mar 6 03:01:11.027197 systemd-logind[1867]: Removed session 10.
Mar 6 03:01:15.743292 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2647426949.mount: Deactivated successfully.
Mar 6 03:01:16.044770 containerd[1884]: time="2026-03-06T03:01:16.044238796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.047990 containerd[1884]: time="2026-03-06T03:01:16.047957599Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980"
Mar 6 03:01:16.052762 containerd[1884]: time="2026-03-06T03:01:16.052712218Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.061032 containerd[1884]: time="2026-03-06T03:01:16.060977779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.061701 containerd[1884]: time="2026-03-06T03:01:16.061478482Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 6.183769736s"
Mar 6 03:01:16.061701 containerd[1884]: time="2026-03-06T03:01:16.061510059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\""
Mar 6 03:01:16.063513 containerd[1884]: time="2026-03-06T03:01:16.063496345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 6 03:01:16.069494 containerd[1884]: time="2026-03-06T03:01:16.069464738Z" level=info msg="CreateContainer within sandbox \"17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 6 03:01:16.088472 containerd[1884]: time="2026-03-06T03:01:16.087044451Z" level=info msg="Container 96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:16.100818 systemd[1]: Started sshd@8-10.200.20.33:22-10.200.16.10:48570.service - OpenSSH per-connection server daemon (10.200.16.10:48570).
Mar 6 03:01:16.119264 containerd[1884]: time="2026-03-06T03:01:16.119218256Z" level=info msg="CreateContainer within sandbox \"17029746e640591216d5641da01ef41d4e32e6718eeea1f43e65abefecdffa9c\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656\""
Mar 6 03:01:16.120661 containerd[1884]: time="2026-03-06T03:01:16.120590083Z" level=info msg="StartContainer for \"96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656\""
Mar 6 03:01:16.121624 containerd[1884]: time="2026-03-06T03:01:16.121577745Z" level=info msg="connecting to shim 96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656" address="unix:///run/containerd/s/506f3d9197b87a4d690be737f0d71246cf43ab5328833a11f326edd9f8a6a6a5" protocol=ttrpc version=3
Mar 6 03:01:16.153331 systemd[1]: Started cri-containerd-96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656.scope - libcontainer container 96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656.
Mar 6 03:01:16.191864 containerd[1884]: time="2026-03-06T03:01:16.191825220Z" level=info msg="StartContainer for \"96ec5e6085c984c71944a4042075bffa2917a1f2fba0f0ff0a28ce7b90e9f656\" returns successfully"
Mar 6 03:01:16.332892 kubelet[3463]: I0306 03:01:16.332518 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-mmcmx" podStartSLOduration=79.180977195 podStartE2EDuration="1m30.332242077s" podCreationTimestamp="2026-03-06 02:59:46 +0000 UTC" firstStartedPulling="2026-03-06 03:01:04.911262849 +0000 UTC m=+97.028236227" lastFinishedPulling="2026-03-06 03:01:16.062527731 +0000 UTC m=+108.179501109" observedRunningTime="2026-03-06 03:01:16.332004822 +0000 UTC m=+108.448978200" watchObservedRunningTime="2026-03-06 03:01:16.332242077 +0000 UTC m=+108.449215455"
Mar 6 03:01:16.488801 sshd[5744]: Accepted publickey for core from 10.200.16.10 port 48570 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:16.559560 sshd-session[5744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:16.564234 systemd-logind[1867]: New session 11 of user core.
Mar 6 03:01:16.570311 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 6 03:01:16.779639 sshd[5802]: Connection closed by 10.200.16.10 port 48570
Mar 6 03:01:16.780417 sshd-session[5744]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:16.784860 systemd-logind[1867]: Session 11 logged out. Waiting for processes to exit.
Mar 6 03:01:16.785514 systemd[1]: sshd@8-10.200.20.33:22-10.200.16.10:48570.service: Deactivated successfully.
Mar 6 03:01:16.787755 systemd[1]: session-11.scope: Deactivated successfully.
Mar 6 03:01:16.790564 systemd-logind[1867]: Removed session 11.
Mar 6 03:01:21.880419 systemd[1]: Started sshd@9-10.200.20.33:22-10.200.16.10:48638.service - OpenSSH per-connection server daemon (10.200.16.10:48638).
Mar 6 03:01:22.326987 sshd[5887]: Accepted publickey for core from 10.200.16.10 port 48638 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:22.328503 sshd-session[5887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:22.332355 systemd-logind[1867]: New session 12 of user core.
Mar 6 03:01:22.337304 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 6 03:01:22.615744 sshd[5890]: Connection closed by 10.200.16.10 port 48638
Mar 6 03:01:22.616245 sshd-session[5887]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:22.619785 systemd-logind[1867]: Session 12 logged out. Waiting for processes to exit.
Mar 6 03:01:22.620200 systemd[1]: sshd@9-10.200.20.33:22-10.200.16.10:48638.service: Deactivated successfully.
Mar 6 03:01:22.622586 systemd[1]: session-12.scope: Deactivated successfully.
Mar 6 03:01:22.624743 systemd-logind[1867]: Removed session 12.
Mar 6 03:01:27.692715 systemd[1]: Started sshd@10-10.200.20.33:22-10.200.16.10:48654.service - OpenSSH per-connection server daemon (10.200.16.10:48654).
Mar 6 03:01:28.097678 sshd[5929]: Accepted publickey for core from 10.200.16.10 port 48654 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:28.098522 sshd-session[5929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:28.102371 systemd-logind[1867]: New session 13 of user core.
Mar 6 03:01:28.111299 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 6 03:01:28.369193 sshd[5940]: Connection closed by 10.200.16.10 port 48654
Mar 6 03:01:28.368990 sshd-session[5929]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:28.374032 systemd[1]: sshd@10-10.200.20.33:22-10.200.16.10:48654.service: Deactivated successfully.
Mar 6 03:01:28.376051 systemd[1]: session-13.scope: Deactivated successfully.
Mar 6 03:01:28.376972 systemd-logind[1867]: Session 13 logged out. Waiting for processes to exit.
Mar 6 03:01:28.378822 systemd-logind[1867]: Removed session 13.
Mar 6 03:01:33.444200 systemd[1]: Started sshd@11-10.200.20.33:22-10.200.16.10:55090.service - OpenSSH per-connection server daemon (10.200.16.10:55090).
Mar 6 03:01:33.801305 sshd[5953]: Accepted publickey for core from 10.200.16.10 port 55090 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:33.802251 sshd-session[5953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:33.806229 systemd-logind[1867]: New session 14 of user core.
Mar 6 03:01:33.813299 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 6 03:01:34.048580 sshd[5956]: Connection closed by 10.200.16.10 port 55090
Mar 6 03:01:34.048091 sshd-session[5953]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:34.052201 systemd-logind[1867]: Session 14 logged out. Waiting for processes to exit.
Mar 6 03:01:34.054459 systemd[1]: sshd@11-10.200.20.33:22-10.200.16.10:55090.service: Deactivated successfully.
Mar 6 03:01:34.056304 systemd[1]: session-14.scope: Deactivated successfully.
Mar 6 03:01:34.057930 systemd-logind[1867]: Removed session 14.
Mar 6 03:01:37.731210 containerd[1884]: time="2026-03-06T03:01:37.731064073Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:37.734210 containerd[1884]: time="2026-03-06T03:01:37.734044166Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315"
Mar 6 03:01:37.737910 containerd[1884]: time="2026-03-06T03:01:37.737877189Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:37.742183 containerd[1884]: time="2026-03-06T03:01:37.742146897Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:37.742948 containerd[1884]: time="2026-03-06T03:01:37.742596807Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 21.678530549s"
Mar 6 03:01:37.742948 containerd[1884]: time="2026-03-06T03:01:37.742623864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 6 03:01:37.743658 containerd[1884]: time="2026-03-06T03:01:37.743600134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\""
Mar 6 03:01:37.759137 containerd[1884]: time="2026-03-06T03:01:37.759110928Z" level=info msg="CreateContainer within sandbox \"01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 03:01:37.782443 containerd[1884]: time="2026-03-06T03:01:37.782394442Z" level=info msg="Container 792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:37.786329 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount608889118.mount: Deactivated successfully.
Mar 6 03:01:37.798439 containerd[1884]: time="2026-03-06T03:01:37.798406843Z" level=info msg="CreateContainer within sandbox \"01dcd1f8e8b10920b7fa16c917cbf73d86d95bb4878cf741a30dafa1bd5bc27b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562\""
Mar 6 03:01:37.799130 containerd[1884]: time="2026-03-06T03:01:37.798881570Z" level=info msg="StartContainer for \"792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562\""
Mar 6 03:01:37.799983 containerd[1884]: time="2026-03-06T03:01:37.799924090Z" level=info msg="connecting to shim 792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562" address="unix:///run/containerd/s/e4db46707fc67e39feec1924f58615d542a7fde72c7ae7bb177a926af1187610" protocol=ttrpc version=3
Mar 6 03:01:37.819296 systemd[1]: Started cri-containerd-792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562.scope - libcontainer container 792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562.
Mar 6 03:01:37.855385 containerd[1884]: time="2026-03-06T03:01:37.855349411Z" level=info msg="StartContainer for \"792b0de83fdb775fb91700aa208bd8df307cdb84ecc41b7a0de25b8167816562\" returns successfully"
Mar 6 03:01:38.379582 kubelet[3463]: I0306 03:01:38.379491 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6b94f7659d-l66t8" podStartSLOduration=79.663026645 podStartE2EDuration="1m52.379480616s" podCreationTimestamp="2026-03-06 02:59:46 +0000 UTC" firstStartedPulling="2026-03-06 03:01:05.026830657 +0000 UTC m=+97.143804035" lastFinishedPulling="2026-03-06 03:01:37.74328462 +0000 UTC m=+129.860258006" observedRunningTime="2026-03-06 03:01:38.378683672 +0000 UTC m=+130.495657050" watchObservedRunningTime="2026-03-06 03:01:38.379480616 +0000 UTC m=+130.496453994"
Mar 6 03:01:39.124707 systemd[1]: Started sshd@12-10.200.20.33:22-10.200.16.10:55094.service - OpenSSH per-connection server daemon (10.200.16.10:55094).
Mar 6 03:01:39.280375 containerd[1884]: time="2026-03-06T03:01:39.280328116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:39.284569 containerd[1884]: time="2026-03-06T03:01:39.284517421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497"
Mar 6 03:01:39.288488 containerd[1884]: time="2026-03-06T03:01:39.287670381Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:39.293458 containerd[1884]: time="2026-03-06T03:01:39.293429838Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.549799751s"
Mar 6 03:01:39.293704 containerd[1884]: time="2026-03-06T03:01:39.293678349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\""
Mar 6 03:01:39.295407 containerd[1884]: time="2026-03-06T03:01:39.295384258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:39.295820 containerd[1884]: time="2026-03-06T03:01:39.295797062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 6 03:01:39.304924 containerd[1884]: time="2026-03-06T03:01:39.304900677Z" level=info msg="CreateContainer within sandbox \"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 6 03:01:39.326198 containerd[1884]: time="2026-03-06T03:01:39.325747212Z" level=info msg="Container 3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:39.343445 containerd[1884]: time="2026-03-06T03:01:39.343404433Z" level=info msg="CreateContainer within sandbox \"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e\""
Mar 6 03:01:39.344352 containerd[1884]: time="2026-03-06T03:01:39.344311821Z" level=info msg="StartContainer for \"3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e\""
Mar 6 03:01:39.346103 containerd[1884]: time="2026-03-06T03:01:39.346069530Z" level=info msg="connecting to shim 3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e" address="unix:///run/containerd/s/9eaaad0aeb4174728099d7794286ba7a3aa2fdfeee37e64c552af40e23bf1f43" protocol=ttrpc version=3
Mar 6 03:01:39.365340 systemd[1]: Started cri-containerd-3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e.scope - libcontainer container 3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e.
Mar 6 03:01:39.424725 containerd[1884]: time="2026-03-06T03:01:39.424684483Z" level=info msg="StartContainer for \"3732c042bd77c450f727fecc8f632bcaeaf1db1bec38c2bb23a8ba79eb01392e\" returns successfully"
Mar 6 03:01:39.513960 sshd[6030]: Accepted publickey for core from 10.200.16.10 port 55094 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:39.515807 sshd-session[6030]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:39.520340 systemd-logind[1867]: New session 15 of user core.
Mar 6 03:01:39.526279 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 6 03:01:39.757427 sshd[6068]: Connection closed by 10.200.16.10 port 55094
Mar 6 03:01:39.758102 sshd-session[6030]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:39.761891 systemd-logind[1867]: Session 15 logged out. Waiting for processes to exit.
Mar 6 03:01:39.762252 systemd[1]: sshd@12-10.200.20.33:22-10.200.16.10:55094.service: Deactivated successfully.
Mar 6 03:01:39.765115 systemd[1]: session-15.scope: Deactivated successfully.
Mar 6 03:01:39.767329 systemd-logind[1867]: Removed session 15.
Mar 6 03:01:42.135732 containerd[1884]: time="2026-03-06T03:01:42.135683155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:42.138521 containerd[1884]: time="2026-03-06T03:01:42.138451992Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955"
Mar 6 03:01:42.144493 containerd[1884]: time="2026-03-06T03:01:42.143592693Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:42.148053 containerd[1884]: time="2026-03-06T03:01:42.147974604Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:42.148581 containerd[1884]: time="2026-03-06T03:01:42.148548653Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 2.852725406s"
Mar 6 03:01:42.148677 containerd[1884]: time="2026-03-06T03:01:42.148662521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\""
Mar 6 03:01:42.151394 containerd[1884]: time="2026-03-06T03:01:42.151367099Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 6 03:01:42.171583 containerd[1884]: time="2026-03-06T03:01:42.171563310Z" level=info msg="CreateContainer within sandbox \"1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 6 03:01:42.195062 containerd[1884]: time="2026-03-06T03:01:42.195033941Z" level=info msg="Container 5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:42.215050 containerd[1884]: time="2026-03-06T03:01:42.215024682Z" level=info msg="CreateContainer within sandbox \"1b457f52bf69db148a1abfb2bfba8ec72f3745bcd11ad6332d03e72b03f757ca\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde\""
Mar 6 03:01:42.216214 containerd[1884]: time="2026-03-06T03:01:42.216187397Z" level=info msg="StartContainer for \"5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde\""
Mar 6 03:01:42.217778 containerd[1884]: time="2026-03-06T03:01:42.217753709Z" level=info msg="connecting to shim 5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde" address="unix:///run/containerd/s/bd737955b84f21ed9d40b67c4a4e04e302cf5b1997ecde3aa4924621254dd0f9" protocol=ttrpc version=3
Mar 6 03:01:42.242326 systemd[1]: Started cri-containerd-5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde.scope - libcontainer container 5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde.
Mar 6 03:01:42.275493 containerd[1884]: time="2026-03-06T03:01:42.275459973Z" level=info msg="StartContainer for \"5793fa7fc35211c2d8f464ca0ccbe2f0df5f66724d11138a33650b1f453c8bde\" returns successfully"
Mar 6 03:01:42.479143 kubelet[3463]: I0306 03:01:42.479078 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59b45bc7fd-tv5p4" podStartSLOduration=81.29167979 podStartE2EDuration="1m55.478946407s" podCreationTimestamp="2026-03-06 02:59:47 +0000 UTC" firstStartedPulling="2026-03-06 03:01:07.963087211 +0000 UTC m=+100.080060597" lastFinishedPulling="2026-03-06 03:01:42.150353836 +0000 UTC m=+134.267327214" observedRunningTime="2026-03-06 03:01:42.442335405 +0000 UTC m=+134.559308791" watchObservedRunningTime="2026-03-06 03:01:42.478946407 +0000 UTC m=+134.595919793"
Mar 6 03:01:42.511024 containerd[1884]: time="2026-03-06T03:01:42.510977652Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:42.514307 containerd[1884]: time="2026-03-06T03:01:42.514279265Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 6 03:01:42.515289 containerd[1884]: time="2026-03-06T03:01:42.515264071Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 363.869363ms"
Mar 6 03:01:42.515355 containerd[1884]: time="2026-03-06T03:01:42.515292968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 6 03:01:42.516403 containerd[1884]: time="2026-03-06T03:01:42.516376569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\""
Mar 6 03:01:42.522688 containerd[1884]: time="2026-03-06T03:01:42.522659146Z" level=info msg="CreateContainer within sandbox \"e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 03:01:42.540933 containerd[1884]: time="2026-03-06T03:01:42.540342839Z" level=info msg="Container bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:42.560952 containerd[1884]: time="2026-03-06T03:01:42.560923134Z" level=info msg="CreateContainer within sandbox \"e55531600d4e642a1855ff49ed149066939999265f82f0a37d1ef3e8cb074cde\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd\""
Mar 6 03:01:42.563961 containerd[1884]: time="2026-03-06T03:01:42.563929586Z" level=info msg="StartContainer for \"bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd\""
Mar 6 03:01:42.565166 containerd[1884]: time="2026-03-06T03:01:42.565118534Z" level=info msg="connecting to shim bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd" address="unix:///run/containerd/s/02a392ed8207bf7836e77c4e23377e186c19ff4f5c3fd02527d63093da559bb4" protocol=ttrpc version=3
Mar 6 03:01:42.582371 systemd[1]: Started cri-containerd-bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd.scope - libcontainer container bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd.
Mar 6 03:01:42.615645 containerd[1884]: time="2026-03-06T03:01:42.615610545Z" level=info msg="StartContainer for \"bc743d3597d78068054b10714e28ab0706a69a542993b6f1eff31df63b7e26fd\" returns successfully"
Mar 6 03:01:43.442815 kubelet[3463]: I0306 03:01:43.442539 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6b94f7659d-7htj6" podStartSLOduration=83.014536607 podStartE2EDuration="1m57.442527581s" podCreationTimestamp="2026-03-06 02:59:46 +0000 UTC" firstStartedPulling="2026-03-06 03:01:08.087925485 +0000 UTC m=+100.204898863" lastFinishedPulling="2026-03-06 03:01:42.515916459 +0000 UTC m=+134.632889837" observedRunningTime="2026-03-06 03:01:43.44183744 +0000 UTC m=+135.558810818" watchObservedRunningTime="2026-03-06 03:01:43.442527581 +0000 UTC m=+135.559500959"
Mar 6 03:01:44.859359 systemd[1]: Started sshd@13-10.200.20.33:22-10.200.16.10:44470.service - OpenSSH per-connection server daemon (10.200.16.10:44470).
Mar 6 03:01:45.260939 sshd[6210]: Accepted publickey for core from 10.200.16.10 port 44470 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:45.262369 sshd-session[6210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:45.265845 systemd-logind[1867]: New session 16 of user core.
Mar 6 03:01:45.274466 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 6 03:01:45.541748 sshd[6213]: Connection closed by 10.200.16.10 port 44470
Mar 6 03:01:45.542286 sshd-session[6210]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:45.545333 systemd[1]: sshd@13-10.200.20.33:22-10.200.16.10:44470.service: Deactivated successfully.
Mar 6 03:01:45.547211 systemd[1]: session-16.scope: Deactivated successfully.
Mar 6 03:01:45.548032 systemd-logind[1867]: Session 16 logged out. Waiting for processes to exit.
Mar 6 03:01:45.549734 systemd-logind[1867]: Removed session 16.
Mar 6 03:01:45.610530 systemd[1]: Started sshd@14-10.200.20.33:22-10.200.16.10:44478.service - OpenSSH per-connection server daemon (10.200.16.10:44478).
Mar 6 03:01:45.966610 sshd[6225]: Accepted publickey for core from 10.200.16.10 port 44478 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:45.967695 sshd-session[6225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:45.971792 systemd-logind[1867]: New session 17 of user core.
Mar 6 03:01:45.976300 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 6 03:01:46.243873 sshd[6228]: Connection closed by 10.200.16.10 port 44478
Mar 6 03:01:46.244486 sshd-session[6225]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:46.247600 systemd[1]: sshd@14-10.200.20.33:22-10.200.16.10:44478.service: Deactivated successfully.
Mar 6 03:01:46.249102 systemd[1]: session-17.scope: Deactivated successfully.
Mar 6 03:01:46.249797 systemd-logind[1867]: Session 17 logged out. Waiting for processes to exit.
Mar 6 03:01:46.251080 systemd-logind[1867]: Removed session 17.
Mar 6 03:01:46.321358 systemd[1]: Started sshd@15-10.200.20.33:22-10.200.16.10:44488.service - OpenSSH per-connection server daemon (10.200.16.10:44488).
Mar 6 03:01:46.682909 sshd[6238]: Accepted publickey for core from 10.200.16.10 port 44488 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:46.684514 sshd-session[6238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:46.688068 systemd-logind[1867]: New session 18 of user core.
Mar 6 03:01:46.696281 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 6 03:01:46.927964 sshd[6241]: Connection closed by 10.200.16.10 port 44488
Mar 6 03:01:46.928507 sshd-session[6238]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:46.932033 systemd[1]: sshd@15-10.200.20.33:22-10.200.16.10:44488.service: Deactivated successfully.
Mar 6 03:01:46.933721 systemd[1]: session-18.scope: Deactivated successfully.
Mar 6 03:01:46.934550 systemd-logind[1867]: Session 18 logged out. Waiting for processes to exit.
Mar 6 03:01:46.936214 systemd-logind[1867]: Removed session 18.
Mar 6 03:01:52.009553 systemd[1]: Started sshd@16-10.200.20.33:22-10.200.16.10:60562.service - OpenSSH per-connection server daemon (10.200.16.10:60562).
Mar 6 03:01:52.378617 sshd[6313]: Accepted publickey for core from 10.200.16.10 port 60562 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4
Mar 6 03:01:52.379857 sshd-session[6313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:52.383561 systemd-logind[1867]: New session 19 of user core.
Mar 6 03:01:52.397303 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 6 03:01:52.632895 sshd[6316]: Connection closed by 10.200.16.10 port 60562
Mar 6 03:01:52.637002 systemd[1]: sshd@16-10.200.20.33:22-10.200.16.10:60562.service: Deactivated successfully.
Mar 6 03:01:52.633331 sshd-session[6313]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:52.639979 systemd[1]: session-19.scope: Deactivated successfully.
Mar 6 03:01:52.640827 systemd-logind[1867]: Session 19 logged out. Waiting for processes to exit.
Mar 6 03:01:52.642592 systemd-logind[1867]: Removed session 19.
Mar 6 03:01:55.037453 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount913835846.mount: Deactivated successfully.
Mar 6 03:01:55.097123 containerd[1884]: time="2026-03-06T03:01:55.097084329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:55.099807 containerd[1884]: time="2026-03-06T03:01:55.099780525Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 6 03:01:55.103464 containerd[1884]: time="2026-03-06T03:01:55.103405061Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:55.108541 containerd[1884]: time="2026-03-06T03:01:55.108498883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:55.109118 containerd[1884]: time="2026-03-06T03:01:55.108792765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 12.59159221s" Mar 6 03:01:55.109118 containerd[1884]: time="2026-03-06T03:01:55.108820589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 6 03:01:55.111409 containerd[1884]: time="2026-03-06T03:01:55.111360380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 6 03:01:55.118572 containerd[1884]: time="2026-03-06T03:01:55.118548787Z" level=info msg="CreateContainer within sandbox 
\"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 6 03:01:55.138836 containerd[1884]: time="2026-03-06T03:01:55.138146683Z" level=info msg="Container a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:55.156085 containerd[1884]: time="2026-03-06T03:01:55.156056519Z" level=info msg="CreateContainer within sandbox \"ca97ea8107b029ecff60215cc9118ea06400b609b36d6278b42079c99a559b3f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d\"" Mar 6 03:01:55.156617 containerd[1884]: time="2026-03-06T03:01:55.156595648Z" level=info msg="StartContainer for \"a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d\"" Mar 6 03:01:55.158460 containerd[1884]: time="2026-03-06T03:01:55.158434465Z" level=info msg="connecting to shim a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d" address="unix:///run/containerd/s/e5e68b59e704a627fe8bffe52cec8b9cc3bad8752f2c4624adb97ed97978dcbd" protocol=ttrpc version=3 Mar 6 03:01:55.178294 systemd[1]: Started cri-containerd-a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d.scope - libcontainer container a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d. 
Mar 6 03:01:55.215878 containerd[1884]: time="2026-03-06T03:01:55.215851446Z" level=info msg="StartContainer for \"a2ef8deeb3c8a6dab2b533b85626fdedbd2f6359068e0fc4fe40fb5ce460616d\" returns successfully" Mar 6 03:01:55.464193 kubelet[3463]: I0306 03:01:55.463149 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-5c89594cd5-cjvrd" podStartSLOduration=4.127340601 podStartE2EDuration="1m2.463139974s" podCreationTimestamp="2026-03-06 03:00:53 +0000 UTC" firstStartedPulling="2026-03-06 03:00:56.773789024 +0000 UTC m=+88.890762402" lastFinishedPulling="2026-03-06 03:01:55.109588397 +0000 UTC m=+147.226561775" observedRunningTime="2026-03-06 03:01:55.462325677 +0000 UTC m=+147.579299055" watchObservedRunningTime="2026-03-06 03:01:55.463139974 +0000 UTC m=+147.580113352" Mar 6 03:01:56.846600 containerd[1884]: time="2026-03-06T03:01:56.846552596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:56.853402 containerd[1884]: time="2026-03-06T03:01:56.853372920Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 6 03:01:56.856314 containerd[1884]: time="2026-03-06T03:01:56.856284026Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:56.860907 containerd[1884]: time="2026-03-06T03:01:56.860877449Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:56.861341 containerd[1884]: time="2026-03-06T03:01:56.861227875Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id 
\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.749805181s" Mar 6 03:01:56.861341 containerd[1884]: time="2026-03-06T03:01:56.861259300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 6 03:01:56.869306 containerd[1884]: time="2026-03-06T03:01:56.869220939Z" level=info msg="CreateContainer within sandbox \"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 6 03:01:56.893937 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount23242854.mount: Deactivated successfully. Mar 6 03:01:56.895892 containerd[1884]: time="2026-03-06T03:01:56.895390351Z" level=info msg="Container 9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:56.913238 containerd[1884]: time="2026-03-06T03:01:56.913208776Z" level=info msg="CreateContainer within sandbox \"8fe3eb775e3650ef0e37124315f06611b1dd6e3da687031067d25c8fde9c6070\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57\"" Mar 6 03:01:56.914204 containerd[1884]: time="2026-03-06T03:01:56.913769305Z" level=info msg="StartContainer for \"9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57\"" Mar 6 03:01:56.916867 containerd[1884]: time="2026-03-06T03:01:56.916844505Z" level=info msg="connecting to shim 9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57" address="unix:///run/containerd/s/9eaaad0aeb4174728099d7794286ba7a3aa2fdfeee37e64c552af40e23bf1f43" 
protocol=ttrpc version=3 Mar 6 03:01:56.936291 systemd[1]: Started cri-containerd-9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57.scope - libcontainer container 9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57. Mar 6 03:01:56.991848 containerd[1884]: time="2026-03-06T03:01:56.991821151Z" level=info msg="StartContainer for \"9d3e2465a24e79206f683ef130665a9498f0b7d54e13ec99bf4963e30d2c1c57\" returns successfully" Mar 6 03:01:57.154159 kubelet[3463]: I0306 03:01:57.154008 3463 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 6 03:01:57.154159 kubelet[3463]: I0306 03:01:57.154046 3463 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 6 03:01:57.472018 kubelet[3463]: I0306 03:01:57.471969 3463 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-vgvhp" podStartSLOduration=80.979271234 podStartE2EDuration="2m10.471956782s" podCreationTimestamp="2026-03-06 02:59:47 +0000 UTC" firstStartedPulling="2026-03-06 03:01:07.36938821 +0000 UTC m=+99.486361588" lastFinishedPulling="2026-03-06 03:01:56.862073758 +0000 UTC m=+148.979047136" observedRunningTime="2026-03-06 03:01:57.471632356 +0000 UTC m=+149.588605750" watchObservedRunningTime="2026-03-06 03:01:57.471956782 +0000 UTC m=+149.588930160" Mar 6 03:01:57.722343 systemd[1]: Started sshd@17-10.200.20.33:22-10.200.16.10:60576.service - OpenSSH per-connection server daemon (10.200.16.10:60576). 
Mar 6 03:01:58.138911 sshd[6430]: Accepted publickey for core from 10.200.16.10 port 60576 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:01:58.140231 sshd-session[6430]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:01:58.143835 systemd-logind[1867]: New session 20 of user core. Mar 6 03:01:58.147397 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 6 03:01:58.418878 sshd[6433]: Connection closed by 10.200.16.10 port 60576 Mar 6 03:01:58.419348 sshd-session[6430]: pam_unix(sshd:session): session closed for user core Mar 6 03:01:58.422403 systemd[1]: sshd@17-10.200.20.33:22-10.200.16.10:60576.service: Deactivated successfully. Mar 6 03:01:58.425036 systemd[1]: session-20.scope: Deactivated successfully. Mar 6 03:01:58.426672 systemd-logind[1867]: Session 20 logged out. Waiting for processes to exit. Mar 6 03:01:58.427993 systemd-logind[1867]: Removed session 20. Mar 6 03:02:03.511507 systemd[1]: Started sshd@18-10.200.20.33:22-10.200.16.10:52388.service - OpenSSH per-connection server daemon (10.200.16.10:52388). Mar 6 03:02:03.931134 sshd[6445]: Accepted publickey for core from 10.200.16.10 port 52388 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:03.931936 sshd-session[6445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:03.935587 systemd-logind[1867]: New session 21 of user core. Mar 6 03:02:03.941495 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 6 03:02:04.206306 sshd[6448]: Connection closed by 10.200.16.10 port 52388 Mar 6 03:02:04.206142 sshd-session[6445]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:04.209228 systemd-logind[1867]: Session 21 logged out. Waiting for processes to exit. Mar 6 03:02:04.209464 systemd[1]: sshd@18-10.200.20.33:22-10.200.16.10:52388.service: Deactivated successfully. 
Mar 6 03:02:04.213369 systemd[1]: session-21.scope: Deactivated successfully. Mar 6 03:02:04.215592 systemd-logind[1867]: Removed session 21. Mar 6 03:02:09.295074 systemd[1]: Started sshd@19-10.200.20.33:22-10.200.16.10:52400.service - OpenSSH per-connection server daemon (10.200.16.10:52400). Mar 6 03:02:09.713040 sshd[6462]: Accepted publickey for core from 10.200.16.10 port 52400 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:09.714157 sshd-session[6462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:09.718341 systemd-logind[1867]: New session 22 of user core. Mar 6 03:02:09.723291 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 6 03:02:09.987330 sshd[6465]: Connection closed by 10.200.16.10 port 52400 Mar 6 03:02:09.988373 sshd-session[6462]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:09.991375 systemd[1]: sshd@19-10.200.20.33:22-10.200.16.10:52400.service: Deactivated successfully. Mar 6 03:02:09.993410 systemd[1]: session-22.scope: Deactivated successfully. Mar 6 03:02:09.994354 systemd-logind[1867]: Session 22 logged out. Waiting for processes to exit. Mar 6 03:02:09.996121 systemd-logind[1867]: Removed session 22. Mar 6 03:02:15.075648 systemd[1]: Started sshd@20-10.200.20.33:22-10.200.16.10:42714.service - OpenSSH per-connection server daemon (10.200.16.10:42714). Mar 6 03:02:15.499192 sshd[6498]: Accepted publickey for core from 10.200.16.10 port 42714 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:15.500013 sshd-session[6498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:15.503411 systemd-logind[1867]: New session 23 of user core. Mar 6 03:02:15.513279 systemd[1]: Started session-23.scope - Session 23 of User core. 
Mar 6 03:02:15.770894 sshd[6501]: Connection closed by 10.200.16.10 port 42714 Mar 6 03:02:15.771314 sshd-session[6498]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:15.774576 systemd[1]: sshd@20-10.200.20.33:22-10.200.16.10:42714.service: Deactivated successfully. Mar 6 03:02:15.776061 systemd[1]: session-23.scope: Deactivated successfully. Mar 6 03:02:15.777956 systemd-logind[1867]: Session 23 logged out. Waiting for processes to exit. Mar 6 03:02:15.779367 systemd-logind[1867]: Removed session 23. Mar 6 03:02:20.844629 systemd[1]: Started sshd@21-10.200.20.33:22-10.200.16.10:47724.service - OpenSSH per-connection server daemon (10.200.16.10:47724). Mar 6 03:02:21.208468 sshd[6545]: Accepted publickey for core from 10.200.16.10 port 47724 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:21.232245 sshd-session[6545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:21.235979 systemd-logind[1867]: New session 24 of user core. Mar 6 03:02:21.244292 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 6 03:02:21.445306 sshd[6548]: Connection closed by 10.200.16.10 port 47724 Mar 6 03:02:21.445791 sshd-session[6545]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:21.448696 systemd[1]: sshd@21-10.200.20.33:22-10.200.16.10:47724.service: Deactivated successfully. Mar 6 03:02:21.450365 systemd[1]: session-24.scope: Deactivated successfully. Mar 6 03:02:21.450991 systemd-logind[1867]: Session 24 logged out. Waiting for processes to exit. Mar 6 03:02:21.452365 systemd-logind[1867]: Removed session 24. Mar 6 03:02:21.519796 systemd[1]: Started sshd@22-10.200.20.33:22-10.200.16.10:47730.service - OpenSSH per-connection server daemon (10.200.16.10:47730). 
Mar 6 03:02:21.881352 sshd[6560]: Accepted publickey for core from 10.200.16.10 port 47730 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:21.883152 sshd-session[6560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:21.887563 systemd-logind[1867]: New session 25 of user core. Mar 6 03:02:21.895297 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 6 03:02:22.258815 sshd[6563]: Connection closed by 10.200.16.10 port 47730 Mar 6 03:02:22.258215 sshd-session[6560]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:22.261140 systemd[1]: sshd@22-10.200.20.33:22-10.200.16.10:47730.service: Deactivated successfully. Mar 6 03:02:22.263352 systemd[1]: session-25.scope: Deactivated successfully. Mar 6 03:02:22.264090 systemd-logind[1867]: Session 25 logged out. Waiting for processes to exit. Mar 6 03:02:22.265251 systemd-logind[1867]: Removed session 25. Mar 6 03:02:22.330467 systemd[1]: Started sshd@23-10.200.20.33:22-10.200.16.10:47738.service - OpenSSH per-connection server daemon (10.200.16.10:47738). Mar 6 03:02:22.685490 sshd[6573]: Accepted publickey for core from 10.200.16.10 port 47738 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:22.686987 sshd-session[6573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:22.690268 systemd-logind[1867]: New session 26 of user core. Mar 6 03:02:22.693324 systemd[1]: Started session-26.scope - Session 26 of User core. Mar 6 03:02:23.399518 sshd[6576]: Connection closed by 10.200.16.10 port 47738 Mar 6 03:02:23.399854 sshd-session[6573]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:23.404400 systemd-logind[1867]: Session 26 logged out. Waiting for processes to exit. Mar 6 03:02:23.404836 systemd[1]: sshd@23-10.200.20.33:22-10.200.16.10:47738.service: Deactivated successfully. 
Mar 6 03:02:23.406564 systemd[1]: session-26.scope: Deactivated successfully. Mar 6 03:02:23.409132 systemd-logind[1867]: Removed session 26. Mar 6 03:02:23.474870 systemd[1]: Started sshd@24-10.200.20.33:22-10.200.16.10:47742.service - OpenSSH per-connection server daemon (10.200.16.10:47742). Mar 6 03:02:23.831184 sshd[6607]: Accepted publickey for core from 10.200.16.10 port 47742 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:23.832268 sshd-session[6607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:23.835875 systemd-logind[1867]: New session 27 of user core. Mar 6 03:02:23.844288 systemd[1]: Started session-27.scope - Session 27 of User core. Mar 6 03:02:24.151502 sshd[6611]: Connection closed by 10.200.16.10 port 47742 Mar 6 03:02:24.152472 sshd-session[6607]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:24.155801 systemd[1]: sshd@24-10.200.20.33:22-10.200.16.10:47742.service: Deactivated successfully. Mar 6 03:02:24.157671 systemd[1]: session-27.scope: Deactivated successfully. Mar 6 03:02:24.158508 systemd-logind[1867]: Session 27 logged out. Waiting for processes to exit. Mar 6 03:02:24.160121 systemd-logind[1867]: Removed session 27. Mar 6 03:02:24.235356 systemd[1]: Started sshd@25-10.200.20.33:22-10.200.16.10:47746.service - OpenSSH per-connection server daemon (10.200.16.10:47746). Mar 6 03:02:24.609698 sshd[6642]: Accepted publickey for core from 10.200.16.10 port 47746 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:24.610097 sshd-session[6642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:24.614231 systemd-logind[1867]: New session 28 of user core. Mar 6 03:02:24.617292 systemd[1]: Started session-28.scope - Session 28 of User core. 
Mar 6 03:02:24.865488 sshd[6653]: Connection closed by 10.200.16.10 port 47746 Mar 6 03:02:24.866386 sshd-session[6642]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:24.869383 systemd-logind[1867]: Session 28 logged out. Waiting for processes to exit. Mar 6 03:02:24.869493 systemd[1]: sshd@25-10.200.20.33:22-10.200.16.10:47746.service: Deactivated successfully. Mar 6 03:02:24.871027 systemd[1]: session-28.scope: Deactivated successfully. Mar 6 03:02:24.872872 systemd-logind[1867]: Removed session 28. Mar 6 03:02:29.956815 systemd[1]: Started sshd@26-10.200.20.33:22-10.200.16.10:34016.service - OpenSSH per-connection server daemon (10.200.16.10:34016). Mar 6 03:02:30.365247 sshd[6689]: Accepted publickey for core from 10.200.16.10 port 34016 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:30.366240 sshd-session[6689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:30.370261 systemd-logind[1867]: New session 29 of user core. Mar 6 03:02:30.375298 systemd[1]: Started session-29.scope - Session 29 of User core. Mar 6 03:02:30.635993 sshd[6692]: Connection closed by 10.200.16.10 port 34016 Mar 6 03:02:30.636633 sshd-session[6689]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:30.640646 systemd[1]: sshd@26-10.200.20.33:22-10.200.16.10:34016.service: Deactivated successfully. Mar 6 03:02:30.642501 systemd[1]: session-29.scope: Deactivated successfully. Mar 6 03:02:30.644576 systemd-logind[1867]: Session 29 logged out. Waiting for processes to exit. Mar 6 03:02:30.646212 systemd-logind[1867]: Removed session 29. Mar 6 03:02:35.707928 systemd[1]: Started sshd@27-10.200.20.33:22-10.200.16.10:34022.service - OpenSSH per-connection server daemon (10.200.16.10:34022). 
Mar 6 03:02:36.067779 sshd[6727]: Accepted publickey for core from 10.200.16.10 port 34022 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:36.069497 sshd-session[6727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:36.073836 systemd-logind[1867]: New session 30 of user core. Mar 6 03:02:36.080271 systemd[1]: Started session-30.scope - Session 30 of User core. Mar 6 03:02:36.309216 sshd[6730]: Connection closed by 10.200.16.10 port 34022 Mar 6 03:02:36.308869 sshd-session[6727]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:36.312185 systemd-logind[1867]: Session 30 logged out. Waiting for processes to exit. Mar 6 03:02:36.313098 systemd[1]: sshd@27-10.200.20.33:22-10.200.16.10:34022.service: Deactivated successfully. Mar 6 03:02:36.315416 systemd[1]: session-30.scope: Deactivated successfully. Mar 6 03:02:36.316748 systemd-logind[1867]: Removed session 30. Mar 6 03:02:41.403881 systemd[1]: Started sshd@28-10.200.20.33:22-10.200.16.10:47044.service - OpenSSH per-connection server daemon (10.200.16.10:47044). Mar 6 03:02:41.801162 sshd[6742]: Accepted publickey for core from 10.200.16.10 port 47044 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:41.801903 sshd-session[6742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:41.805686 systemd-logind[1867]: New session 31 of user core. Mar 6 03:02:41.811308 systemd[1]: Started session-31.scope - Session 31 of User core. Mar 6 03:02:42.066991 sshd[6745]: Connection closed by 10.200.16.10 port 47044 Mar 6 03:02:42.068346 sshd-session[6742]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:42.071480 systemd-logind[1867]: Session 31 logged out. Waiting for processes to exit. Mar 6 03:02:42.071880 systemd[1]: sshd@28-10.200.20.33:22-10.200.16.10:47044.service: Deactivated successfully. 
Mar 6 03:02:42.075458 systemd[1]: session-31.scope: Deactivated successfully. Mar 6 03:02:42.078517 systemd-logind[1867]: Removed session 31. Mar 6 03:02:47.164370 systemd[1]: Started sshd@29-10.200.20.33:22-10.200.16.10:47056.service - OpenSSH per-connection server daemon (10.200.16.10:47056). Mar 6 03:02:47.601989 sshd[6780]: Accepted publickey for core from 10.200.16.10 port 47056 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:47.603602 sshd-session[6780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:47.608452 systemd-logind[1867]: New session 32 of user core. Mar 6 03:02:47.613322 systemd[1]: Started session-32.scope - Session 32 of User core. Mar 6 03:02:47.894456 sshd[6783]: Connection closed by 10.200.16.10 port 47056 Mar 6 03:02:47.895005 sshd-session[6780]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:47.899467 systemd[1]: sshd@29-10.200.20.33:22-10.200.16.10:47056.service: Deactivated successfully. Mar 6 03:02:47.901502 systemd[1]: session-32.scope: Deactivated successfully. Mar 6 03:02:47.902828 systemd-logind[1867]: Session 32 logged out. Waiting for processes to exit. Mar 6 03:02:47.904849 systemd-logind[1867]: Removed session 32. Mar 6 03:02:52.965155 systemd[1]: Started sshd@30-10.200.20.33:22-10.200.16.10:35082.service - OpenSSH per-connection server daemon (10.200.16.10:35082). Mar 6 03:02:53.342574 sshd[6861]: Accepted publickey for core from 10.200.16.10 port 35082 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:53.343376 sshd-session[6861]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:53.347416 systemd-logind[1867]: New session 33 of user core. Mar 6 03:02:53.354289 systemd[1]: Started session-33.scope - Session 33 of User core. 
Mar 6 03:02:53.587334 sshd[6864]: Connection closed by 10.200.16.10 port 35082 Mar 6 03:02:53.587902 sshd-session[6861]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:53.591102 systemd-logind[1867]: Session 33 logged out. Waiting for processes to exit. Mar 6 03:02:53.591762 systemd[1]: sshd@30-10.200.20.33:22-10.200.16.10:35082.service: Deactivated successfully. Mar 6 03:02:53.593419 systemd[1]: session-33.scope: Deactivated successfully. Mar 6 03:02:53.595612 systemd-logind[1867]: Removed session 33. Mar 6 03:02:58.667787 systemd[1]: Started sshd@31-10.200.20.33:22-10.200.16.10:35090.service - OpenSSH per-connection server daemon (10.200.16.10:35090). Mar 6 03:02:59.025084 sshd[6900]: Accepted publickey for core from 10.200.16.10 port 35090 ssh2: RSA SHA256:FEy/krmA4A08ZzdMQEPdw8LvNt9bbJfX7o/obFKAbA4 Mar 6 03:02:59.026078 sshd-session[6900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 03:02:59.029545 systemd-logind[1867]: New session 34 of user core. Mar 6 03:02:59.041403 systemd[1]: Started session-34.scope - Session 34 of User core. Mar 6 03:02:59.264694 sshd[6903]: Connection closed by 10.200.16.10 port 35090 Mar 6 03:02:59.265272 sshd-session[6900]: pam_unix(sshd:session): session closed for user core Mar 6 03:02:59.269399 systemd[1]: sshd@31-10.200.20.33:22-10.200.16.10:35090.service: Deactivated successfully. Mar 6 03:02:59.271388 systemd[1]: session-34.scope: Deactivated successfully. Mar 6 03:02:59.272317 systemd-logind[1867]: Session 34 logged out. Waiting for processes to exit. Mar 6 03:02:59.273709 systemd-logind[1867]: Removed session 34.