Mar 4 08:51:57.778492 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 4 08:51:57.778513 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Mar 3 11:03:33 -00 2026
Mar 4 08:51:57.778533 kernel: KASLR enabled
Mar 4 08:51:57.778538 kernel: efi: EFI v2.7 by EDK II
Mar 4 08:51:57.778544 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Mar 4 08:51:57.778550 kernel: random: crng init done
Mar 4 08:51:57.778557 kernel: secureboot: Secure boot disabled
Mar 4 08:51:57.778562 kernel: ACPI: Early table checksum verification disabled
Mar 4 08:51:57.778568 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Mar 4 08:51:57.778574 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Mar 4 08:51:57.778580 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778586 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778592 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778598 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778605 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778611 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778619 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778625 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778631 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 08:51:57.778637 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001)
Mar 4 08:51:57.778643 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 4 08:51:57.778649 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 4 08:51:57.778655 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 4 08:51:57.778661 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Mar 4 08:51:57.778668 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Mar 4 08:51:57.778673 kernel: Zone ranges:
Mar 4 08:51:57.778681 kernel:   DMA      [mem 0x0000000040000000-0x00000000ffffffff]
Mar 4 08:51:57.778687 kernel:   DMA32    empty
Mar 4 08:51:57.778693 kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Mar 4 08:51:57.778699 kernel:   Device   empty
Mar 4 08:51:57.778705 kernel: Movable zone start for each node
Mar 4 08:51:57.778710 kernel: Early memory node ranges
Mar 4 08:51:57.778716 kernel:   node   0: [mem 0x0000000040000000-0x000000043843ffff]
Mar 4 08:51:57.778723 kernel:   node   0: [mem 0x0000000438440000-0x000000043872ffff]
Mar 4 08:51:57.778728 kernel:   node   0: [mem 0x0000000438730000-0x000000043bbfffff]
Mar 4 08:51:57.778734 kernel:   node   0: [mem 0x000000043bc00000-0x000000043bfdffff]
Mar 4 08:51:57.778740 kernel:   node   0: [mem 0x000000043bfe0000-0x000000043fffffff]
Mar 4 08:51:57.778746 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Mar 4 08:51:57.778754 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Mar 4 08:51:57.778760 kernel: psci: probing for conduit method from ACPI.
Mar 4 08:51:57.778768 kernel: psci: PSCIv1.3 detected in firmware.
Mar 4 08:51:57.778775 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 4 08:51:57.778781 kernel: psci: Trusted OS migration not required
Mar 4 08:51:57.778789 kernel: psci: SMC Calling Convention v1.1
Mar 4 08:51:57.778796 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 4 08:51:57.778802 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 4 08:51:57.778809 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 4 08:51:57.778816 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Mar 4 08:51:57.778822 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Mar 4 08:51:57.778830 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 4 08:51:57.778836 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 4 08:51:57.778843 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Mar 4 08:51:57.778849 kernel: Detected PIPT I-cache on CPU0
Mar 4 08:51:57.778856 kernel: CPU features: detected: GIC system register CPU interface
Mar 4 08:51:57.778862 kernel: CPU features: detected: Spectre-v4
Mar 4 08:51:57.778869 kernel: CPU features: detected: Spectre-BHB
Mar 4 08:51:57.778876 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 4 08:51:57.778883 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 4 08:51:57.778889 kernel: CPU features: detected: ARM erratum 1418040
Mar 4 08:51:57.778896 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 4 08:51:57.778902 kernel: alternatives: applying boot alternatives
Mar 4 08:51:57.778909 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9550c2083f3062ad7c57f28a015a3afab95dfddb073076612b771af8d5df9e06
Mar 4 08:51:57.778916 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Mar 4 08:51:57.778922 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 4 08:51:57.778929 kernel: Fallback order for Node 0: 0
Mar 4 08:51:57.778937 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Mar 4 08:51:57.778943 kernel: Policy zone: Normal
Mar 4 08:51:57.778949 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 4 08:51:57.778956 kernel: software IO TLB: area num 4.
Mar 4 08:51:57.778962 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Mar 4 08:51:57.778969 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 4 08:51:57.778975 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 4 08:51:57.778982 kernel: rcu: RCU event tracing is enabled.
Mar 4 08:51:57.778989 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 4 08:51:57.778995 kernel: Trampoline variant of Tasks RCU enabled.
Mar 4 08:51:57.779002 kernel: Tracing variant of Tasks RCU enabled.
Mar 4 08:51:57.779008 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 4 08:51:57.779016 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 4 08:51:57.779022 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 4 08:51:57.779029 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 4 08:51:57.779035 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 4 08:51:57.779042 kernel: GICv3: 256 SPIs implemented
Mar 4 08:51:57.779048 kernel: GICv3: 0 Extended SPIs implemented
Mar 4 08:51:57.779054 kernel: Root IRQ handler: gic_handle_irq
Mar 4 08:51:57.779061 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 4 08:51:57.779067 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Mar 4 08:51:57.779073 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 4 08:51:57.779080 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 4 08:51:57.779086 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Mar 4 08:51:57.779094 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Mar 4 08:51:57.779101 kernel: GICv3: using LPI property table @0x0000000100130000
Mar 4 08:51:57.779107 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Mar 4 08:51:57.779113 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 4 08:51:57.779120 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 4 08:51:57.779126 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 4 08:51:57.779133 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 4 08:51:57.779139 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 4 08:51:57.779146 kernel: arm-pv: using stolen time PV
Mar 4 08:51:57.779152 kernel: Console: colour dummy device 80x25
Mar 4 08:51:57.779160 kernel: ACPI: Core revision 20240827
Mar 4 08:51:57.779180 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 4 08:51:57.779187 kernel: pid_max: default: 32768 minimum: 301
Mar 4 08:51:57.779193 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 4 08:51:57.779200 kernel: landlock: Up and running.
Mar 4 08:51:57.779206 kernel: SELinux: Initializing.
Mar 4 08:51:57.779213 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 4 08:51:57.779220 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 4 08:51:57.779226 kernel: rcu: Hierarchical SRCU implementation.
Mar 4 08:51:57.779233 kernel: rcu: Max phase no-delay instances is 400.
Mar 4 08:51:57.779241 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 4 08:51:57.779248 kernel: Remapping and enabling EFI services.
Mar 4 08:51:57.779255 kernel: smp: Bringing up secondary CPUs ...
Mar 4 08:51:57.779261 kernel: Detected PIPT I-cache on CPU1
Mar 4 08:51:57.779268 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 4 08:51:57.779275 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Mar 4 08:51:57.779281 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 4 08:51:57.779288 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 4 08:51:57.779295 kernel: Detected PIPT I-cache on CPU2
Mar 4 08:51:57.779307 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Mar 4 08:51:57.779314 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Mar 4 08:51:57.779321 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 4 08:51:57.779329 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Mar 4 08:51:57.779336 kernel: Detected PIPT I-cache on CPU3
Mar 4 08:51:57.779343 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Mar 4 08:51:57.779349 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Mar 4 08:51:57.779356 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 4 08:51:57.779364 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Mar 4 08:51:57.779371 kernel: smp: Brought up 1 node, 4 CPUs
Mar 4 08:51:57.779378 kernel: SMP: Total of 4 processors activated.
Mar 4 08:51:57.779385 kernel: CPU: All CPU(s) started at EL1
Mar 4 08:51:57.779392 kernel: CPU features: detected: 32-bit EL0 Support
Mar 4 08:51:57.779399 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 4 08:51:57.779406 kernel: CPU features: detected: Common not Private translations
Mar 4 08:51:57.779413 kernel: CPU features: detected: CRC32 instructions
Mar 4 08:51:57.779420 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 4 08:51:57.779428 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 4 08:51:57.779435 kernel: CPU features: detected: LSE atomic instructions
Mar 4 08:51:57.779441 kernel: CPU features: detected: Privileged Access Never
Mar 4 08:51:57.779448 kernel: CPU features: detected: RAS Extension Support
Mar 4 08:51:57.779455 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 4 08:51:57.779466 kernel: alternatives: applying system-wide alternatives
Mar 4 08:51:57.779472 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Mar 4 08:51:57.779480 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Mar 4 08:51:57.779487 kernel: devtmpfs: initialized
Mar 4 08:51:57.779496 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 4 08:51:57.779503 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 4 08:51:57.779510 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 4 08:51:57.779516 kernel: 0 pages in range for non-PLT usage
Mar 4 08:51:57.779523 kernel: 508400 pages in range for PLT usage
Mar 4 08:51:57.779530 kernel: pinctrl core: initialized pinctrl subsystem
Mar 4 08:51:57.779537 kernel: SMBIOS 3.0.0 present.
Mar 4 08:51:57.779544 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Mar 4 08:51:57.779551 kernel: DMI: Memory slots populated: 1/1
Mar 4 08:51:57.779559 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 4 08:51:57.779566 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Mar 4 08:51:57.779573 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 4 08:51:57.779580 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 4 08:51:57.779587 kernel: audit: initializing netlink subsys (disabled)
Mar 4 08:51:57.779594 kernel: audit: type=2000 audit(0.043:1): state=initialized audit_enabled=0 res=1
Mar 4 08:51:57.779601 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 4 08:51:57.779608 kernel: cpuidle: using governor menu
Mar 4 08:51:57.779615 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 4 08:51:57.779623 kernel: ASID allocator initialised with 32768 entries
Mar 4 08:51:57.779630 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 4 08:51:57.779637 kernel: Serial: AMBA PL011 UART driver
Mar 4 08:51:57.779644 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 4 08:51:57.779651 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 4 08:51:57.779658 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 4 08:51:57.779665 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 4 08:51:57.779672 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 4 08:51:57.779679 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 4 08:51:57.779687 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 4 08:51:57.779694 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 4 08:51:57.779712 kernel: ACPI: Added _OSI(Module Device)
Mar 4 08:51:57.779719 kernel: ACPI: Added _OSI(Processor Device)
Mar 4 08:51:57.779726 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 4 08:51:57.779733 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 4 08:51:57.779740 kernel: ACPI: Interpreter enabled
Mar 4 08:51:57.779747 kernel: ACPI: Using GIC for interrupt routing
Mar 4 08:51:57.779754 kernel: ACPI: MCFG table detected, 1 entries
Mar 4 08:51:57.779763 kernel: ACPI: CPU0 has been hot-added
Mar 4 08:51:57.779771 kernel: ACPI: CPU1 has been hot-added
Mar 4 08:51:57.779778 kernel: ACPI: CPU2 has been hot-added
Mar 4 08:51:57.779785 kernel: ACPI: CPU3 has been hot-added
Mar 4 08:51:57.779792 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 4 08:51:57.779799 kernel: printk: legacy console [ttyAMA0] enabled
Mar 4 08:51:57.779806 kernel: ACPI: PCI: Interrupt link L000 configured for IRQ 35
Mar 4 08:51:57.779813 kernel: ACPI: PCI: Interrupt link L001 configured for IRQ 36
Mar 4 08:51:57.779820 kernel: ACPI: PCI: Interrupt link L002 configured for IRQ 37
Mar 4 08:51:57.779828 kernel: ACPI: PCI: Interrupt link L003 configured for IRQ 38
Mar 4 08:51:57.779835 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 4 08:51:57.779971 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 4 08:51:57.780044 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 4 08:51:57.780145 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 4 08:51:57.780230 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 4 08:51:57.780289 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 4 08:51:57.780302 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 4 08:51:57.780309 kernel: PCI host bridge to bus 0000:00
Mar 4 08:51:57.780375 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 4 08:51:57.780428 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 4 08:51:57.780481 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 4 08:51:57.780533 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 4 08:51:57.780612 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Mar 4 08:51:57.780689 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.780749 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Mar 4 08:51:57.780807 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 4 08:51:57.780866 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Mar 4 08:51:57.780928 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Mar 4 08:51:57.780999 kernel: pci 0000:00:01.0: enabling Extended Tags
Mar 4 08:51:57.781069 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.781132 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Mar 4 08:51:57.781261 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Mar 4 08:51:57.781339 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Mar 4 08:51:57.781398 kernel: pci 0000:00:01.1: enabling Extended Tags
Mar 4 08:51:57.781465 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.781527 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Mar 4 08:51:57.781588 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Mar 4 08:51:57.781646 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Mar 4 08:51:57.781702 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Mar 4 08:51:57.781759 kernel: pci 0000:00:01.2: enabling Extended Tags
Mar 4 08:51:57.781823 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.781880 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Mar 4 08:51:57.781936 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Mar 4 08:51:57.781995 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Mar 4 08:51:57.782051 kernel: pci 0000:00:01.3: enabling Extended Tags
Mar 4 08:51:57.782114 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.782181 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Mar 4 08:51:57.782254 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Mar 4 08:51:57.782312 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Mar 4 08:51:57.782368 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Mar 4 08:51:57.782426 kernel: pci 0000:00:01.4: enabling Extended Tags
Mar 4 08:51:57.782493 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.782551 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Mar 4 08:51:57.782607 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Mar 4 08:51:57.782663 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Mar 4 08:51:57.782719 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Mar 4 08:51:57.782775 kernel: pci 0000:00:01.5: enabling Extended Tags
Mar 4 08:51:57.782837 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.782898 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Mar 4 08:51:57.782954 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Mar 4 08:51:57.783010 kernel: pci 0000:00:01.6: enabling Extended Tags
Mar 4 08:51:57.783074 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.783133 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Mar 4 08:51:57.783210 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Mar 4 08:51:57.783275 kernel: pci 0000:00:01.7: enabling Extended Tags
Mar 4 08:51:57.783342 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.783402 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Mar 4 08:51:57.783461 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Mar 4 08:51:57.783517 kernel: pci 0000:00:02.0: enabling Extended Tags
Mar 4 08:51:57.783585 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.783648 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Mar 4 08:51:57.783720 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Mar 4 08:51:57.783782 kernel: pci 0000:00:02.1: enabling Extended Tags
Mar 4 08:51:57.783846 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.783903 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Mar 4 08:51:57.783972 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Mar 4 08:51:57.784029 kernel: pci 0000:00:02.2: enabling Extended Tags
Mar 4 08:51:57.784094 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.784154 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Mar 4 08:51:57.784246 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Mar 4 08:51:57.784307 kernel: pci 0000:00:02.3: enabling Extended Tags
Mar 4 08:51:57.784371 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.784431 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Mar 4 08:51:57.784487 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Mar 4 08:51:57.784543 kernel: pci 0000:00:02.4: enabling Extended Tags
Mar 4 08:51:57.784611 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.784669 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Mar 4 08:51:57.784727 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Mar 4 08:51:57.784784 kernel: pci 0000:00:02.5: enabling Extended Tags
Mar 4 08:51:57.784848 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.784905 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Mar 4 08:51:57.784962 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Mar 4 08:51:57.785020 kernel: pci 0000:00:02.6: enabling Extended Tags
Mar 4 08:51:57.785082 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.785140 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Mar 4 08:51:57.785212 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Mar 4 08:51:57.785271 kernel: pci 0000:00:02.7: enabling Extended Tags
Mar 4 08:51:57.785335 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.785394 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Mar 4 08:51:57.785454 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Mar 4 08:51:57.785511 kernel: pci 0000:00:03.0: enabling Extended Tags
Mar 4 08:51:57.785580 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.785637 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Mar 4 08:51:57.785693 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Mar 4 08:51:57.785749 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Mar 4 08:51:57.785805 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Mar 4 08:51:57.785865 kernel: pci 0000:00:03.1: enabling Extended Tags
Mar 4 08:51:57.785931 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.785988 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Mar 4 08:51:57.786044 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Mar 4 08:51:57.786100 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Mar 4 08:51:57.786156 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Mar 4 08:51:57.786234 kernel: pci 0000:00:03.2: enabling Extended Tags
Mar 4 08:51:57.786302 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.786359 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Mar 4 08:51:57.786416 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Mar 4 08:51:57.786471 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Mar 4 08:51:57.786527 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Mar 4 08:51:57.786583 kernel: pci 0000:00:03.3: enabling Extended Tags
Mar 4 08:51:57.786647 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.786706 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Mar 4 08:51:57.786761 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Mar 4 08:51:57.786817 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Mar 4 08:51:57.786873 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Mar 4 08:51:57.786929 kernel: pci 0000:00:03.4: enabling Extended Tags
Mar 4 08:51:57.786991 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.787048 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Mar 4 08:51:57.787106 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Mar 4 08:51:57.787162 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Mar 4 08:51:57.787232 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Mar 4 08:51:57.787289 kernel: pci 0000:00:03.5: enabling Extended Tags
Mar 4 08:51:57.787353 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.787410 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Mar 4 08:51:57.787466 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Mar 4 08:51:57.787526 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Mar 4 08:51:57.787582 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Mar 4 08:51:57.787639 kernel: pci 0000:00:03.6: enabling Extended Tags
Mar 4 08:51:57.787714 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.787777 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Mar 4 08:51:57.787835 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Mar 4 08:51:57.787891 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Mar 4 08:51:57.787949 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Mar 4 08:51:57.788006 kernel: pci 0000:00:03.7: enabling Extended Tags
Mar 4 08:51:57.788070 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.788127 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Mar 4 08:51:57.788197 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Mar 4 08:51:57.788257 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Mar 4 08:51:57.788313 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Mar 4 08:51:57.788370 kernel: pci 0000:00:04.0: enabling Extended Tags
Mar 4 08:51:57.788434 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.788491 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Mar 4 08:51:57.788547 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Mar 4 08:51:57.788606 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Mar 4 08:51:57.788662 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Mar 4 08:51:57.788718 kernel: pci 0000:00:04.1: enabling Extended Tags
Mar 4 08:51:57.788786 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.788843 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Mar 4 08:51:57.788901 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Mar 4 08:51:57.788959 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Mar 4 08:51:57.789019 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Mar 4 08:51:57.789082 kernel: pci 0000:00:04.2: enabling Extended Tags
Mar 4 08:51:57.789153 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.789241 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Mar 4 08:51:57.789312 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Mar 4 08:51:57.789375 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Mar 4 08:51:57.789431 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Mar 4 08:51:57.789493 kernel: pci 0000:00:04.3: enabling Extended Tags
Mar 4 08:51:57.789587 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.789646 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Mar 4 08:51:57.789702 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Mar 4 08:51:57.789758 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Mar 4 08:51:57.789814 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Mar 4 08:51:57.789880 kernel: pci 0000:00:04.4: enabling Extended Tags
Mar 4 08:51:57.789949 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.790006 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Mar 4 08:51:57.790063 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Mar 4 08:51:57.790119 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Mar 4 08:51:57.790190 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Mar 4 08:51:57.790251 kernel: pci 0000:00:04.5: enabling Extended Tags
Mar 4 08:51:57.790318 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.790376 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Mar 4 08:51:57.790433 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Mar 4 08:51:57.790491 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Mar 4 08:51:57.790548 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Mar 4 08:51:57.790606 kernel: pci 0000:00:04.6: enabling Extended Tags
Mar 4 08:51:57.790672 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.790942 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Mar 4 08:51:57.791004 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Mar 4 08:51:57.791060 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Mar 4 08:51:57.791116 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Mar 4 08:51:57.791179 kernel: pci 0000:00:04.7: enabling Extended Tags
Mar 4 08:51:57.791247 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 4 08:51:57.791309 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Mar 4 08:51:57.791365 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Mar 4 08:51:57.791422 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Mar 4 08:51:57.791478 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Mar 4 08:51:57.791534 kernel: pci 0000:00:05.0: enabling Extended Tags
Mar 4 08:51:57.791601 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 4 08:51:57.791664 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Mar 4 08:51:57.791739 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 4 08:51:57.791838 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 4 08:51:57.791902 kernel: pci 0000:01:00.0: enabling Extended Tags
Mar 4 08:51:57.791971 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 4 08:51:57.792032 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Mar 4 08:51:57.792120 kernel: pci 0000:02:00.0: enabling Extended Tags
Mar 4 08:51:57.792208 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Mar 4 08:51:57.792275 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Mar 4 08:51:57.792339 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 4 08:51:57.792400 kernel: pci 0000:03:00.0: enabling Extended Tags
Mar 4 08:51:57.792467 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 4 08:51:57.792527 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 4 08:51:57.792586 kernel: pci 0000:04:00.0: enabling Extended Tags
Mar 4 08:51:57.792655 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 4 08:51:57.792717 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Mar 4 08:51:57.792803 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 4 08:51:57.792868 kernel: pci 0000:05:00.0: enabling Extended Tags
Mar 4 08:51:57.792934 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Mar 4 08:51:57.792994 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Mar 4 08:51:57.793052 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 4 08:51:57.793113 kernel: pci 0000:06:00.0: enabling Extended Tags
Mar 4 08:51:57.793184 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Mar 4 08:51:57.793245 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Mar 4 08:51:57.793302 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Mar 4 08:51:57.793361 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Mar 4 08:51:57.793426 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Mar 4 08:51:57.793482 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Mar 4 08:51:57.793542 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 4 08:51:57.793603 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Mar 4 08:51:57.793660 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Mar 4 08:51:57.793721 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 4 08:51:57.793779 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Mar 4 08:51:57.793836 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Mar 4 08:51:57.793911 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 4 08:51:57.793974 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Mar 4 08:51:57.794035 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Mar 4 08:51:57.794095 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 4 08:51:57.794153 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Mar 4 08:51:57.794218 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Mar 4 08:51:57.794282 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 4 08:51:57.794341 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Mar 4 08:51:57.794400 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Mar 4 08:51:57.794464 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 4 08:51:57.794522 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Mar 4 08:51:57.794580 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Mar 4 08:51:57.794641 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 4 08:51:57.794699 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Mar 4 08:51:57.794757 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Mar 4 08:51:57.794817 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Mar 4 08:51:57.794877 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Mar 4 08:51:57.794934 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Mar 4 08:51:57.794996 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Mar 4 08:51:57.795053 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Mar 4 08:51:57.795110 kernel: pci 0000:00:02.2: bridge window 
[mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Mar 4 08:51:57.795177 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Mar 4 08:51:57.795238 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Mar 4 08:51:57.795298 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Mar 4 08:51:57.795360 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Mar 4 08:51:57.795419 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Mar 4 08:51:57.795478 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Mar 4 08:51:57.795541 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Mar 4 08:51:57.795601 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Mar 4 08:51:57.795676 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Mar 4 08:51:57.795759 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Mar 4 08:51:57.795822 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Mar 4 08:51:57.795880 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Mar 4 08:51:57.795939 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Mar 4 08:51:57.795996 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Mar 4 08:51:57.796053 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 
add_align 100000 Mar 4 08:51:57.796118 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Mar 4 08:51:57.796186 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Mar 4 08:51:57.796265 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Mar 4 08:51:57.796329 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Mar 4 08:51:57.796387 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Mar 4 08:51:57.796444 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Mar 4 08:51:57.796509 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Mar 4 08:51:57.796580 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Mar 4 08:51:57.796638 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Mar 4 08:51:57.796700 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Mar 4 08:51:57.796759 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Mar 4 08:51:57.796816 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Mar 4 08:51:57.796879 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Mar 4 08:51:57.796937 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Mar 4 08:51:57.796997 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Mar 4 08:51:57.797060 kernel: pci 
0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Mar 4 08:51:57.797119 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Mar 4 08:51:57.797184 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Mar 4 08:51:57.797248 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Mar 4 08:51:57.797307 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Mar 4 08:51:57.797364 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Mar 4 08:51:57.797426 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Mar 4 08:51:57.797486 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Mar 4 08:51:57.797544 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Mar 4 08:51:57.797605 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Mar 4 08:51:57.797665 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Mar 4 08:51:57.797727 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Mar 4 08:51:57.797792 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Mar 4 08:51:57.797852 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Mar 4 08:51:57.797929 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Mar 4 08:51:57.797995 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] 
add_size 1000 Mar 4 08:51:57.798054 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Mar 4 08:51:57.798113 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Mar 4 08:51:57.798184 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Mar 4 08:51:57.798249 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Mar 4 08:51:57.798307 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Mar 4 08:51:57.798369 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Mar 4 08:51:57.798429 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Mar 4 08:51:57.798486 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Mar 4 08:51:57.798546 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Mar 4 08:51:57.798605 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Mar 4 08:51:57.798662 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Mar 4 08:51:57.798722 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Mar 4 08:51:57.798785 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Mar 4 08:51:57.798847 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Mar 4 08:51:57.798911 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Mar 4 08:51:57.798971 kernel: pci 
0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Mar 4 08:51:57.799029 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Mar 4 08:51:57.799089 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Mar 4 08:51:57.799147 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Mar 4 08:51:57.799212 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Mar 4 08:51:57.799278 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Mar 4 08:51:57.799341 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Mar 4 08:51:57.799401 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Mar 4 08:51:57.799474 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Mar 4 08:51:57.799533 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Mar 4 08:51:57.799590 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Mar 4 08:51:57.799651 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Mar 4 08:51:57.799719 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Mar 4 08:51:57.799784 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Mar 4 08:51:57.799843 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Mar 4 08:51:57.799902 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Mar 4 08:51:57.799960 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Mar 4 08:51:57.800020 kernel: pci 0000:00:01.6: bridge window [mem 
0x10c00000-0x10dfffff]: assigned Mar 4 08:51:57.800077 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Mar 4 08:51:57.800136 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Mar 4 08:51:57.800209 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Mar 4 08:51:57.800273 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Mar 4 08:51:57.800331 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Mar 4 08:51:57.800390 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Mar 4 08:51:57.800449 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Mar 4 08:51:57.800509 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Mar 4 08:51:57.800591 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Mar 4 08:51:57.800652 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Mar 4 08:51:57.800712 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Mar 4 08:51:57.800791 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Mar 4 08:51:57.800850 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Mar 4 08:51:57.800911 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Mar 4 08:51:57.800971 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Mar 4 08:51:57.801029 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Mar 4 08:51:57.801086 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Mar 4 08:51:57.801145 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Mar 4 08:51:57.801215 kernel: 
pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Mar 4 08:51:57.801277 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Mar 4 08:51:57.801335 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Mar 4 08:51:57.801394 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Mar 4 08:51:57.801453 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Mar 4 08:51:57.801515 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Mar 4 08:51:57.801578 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Mar 4 08:51:57.801639 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Mar 4 08:51:57.801699 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Mar 4 08:51:57.801761 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Mar 4 08:51:57.801818 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Mar 4 08:51:57.801879 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Mar 4 08:51:57.801936 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Mar 4 08:51:57.801994 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Mar 4 08:51:57.802051 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Mar 4 08:51:57.802124 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Mar 4 08:51:57.802200 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Mar 4 08:51:57.802266 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Mar 4 08:51:57.802324 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 
64bit pref]: assigned Mar 4 08:51:57.802383 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Mar 4 08:51:57.802440 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Mar 4 08:51:57.802497 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Mar 4 08:51:57.802554 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Mar 4 08:51:57.802613 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Mar 4 08:51:57.802673 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Mar 4 08:51:57.802733 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Mar 4 08:51:57.802790 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Mar 4 08:51:57.802848 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Mar 4 08:51:57.802906 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Mar 4 08:51:57.802966 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Mar 4 08:51:57.803023 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Mar 4 08:51:57.803081 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Mar 4 08:51:57.803138 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Mar 4 08:51:57.803205 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Mar 4 08:51:57.803264 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Mar 4 08:51:57.803323 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Mar 4 08:51:57.803380 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Mar 4 08:51:57.803440 kernel: pci 0000:00:01.1: BAR 0 [mem 
0x14201000-0x14201fff]: assigned Mar 4 08:51:57.803497 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Mar 4 08:51:57.803555 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Mar 4 08:51:57.803612 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Mar 4 08:51:57.803669 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Mar 4 08:51:57.803743 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Mar 4 08:51:57.803805 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Mar 4 08:51:57.803862 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Mar 4 08:51:57.803922 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Mar 4 08:51:57.803979 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Mar 4 08:51:57.804038 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Mar 4 08:51:57.804095 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Mar 4 08:51:57.804152 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Mar 4 08:51:57.804227 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Mar 4 08:51:57.804287 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Mar 4 08:51:57.804344 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Mar 4 08:51:57.804406 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Mar 4 08:51:57.804480 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Mar 4 08:51:57.804544 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Mar 4 08:51:57.804602 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Mar 4 08:51:57.804660 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Mar 4 08:51:57.804718 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Mar 4 08:51:57.804776 kernel: pci 
0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Mar 4 08:51:57.804833 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Mar 4 08:51:57.804895 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Mar 4 08:51:57.804952 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Mar 4 08:51:57.805010 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Mar 4 08:51:57.805067 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Mar 4 08:51:57.805148 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Mar 4 08:51:57.805216 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.805274 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.805333 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Mar 4 08:51:57.805392 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.805449 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.805507 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Mar 4 08:51:57.805564 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.805620 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.805679 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Mar 4 08:51:57.805735 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.805794 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.805851 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Mar 4 08:51:57.805908 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.805964 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806034 
kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Mar 4 08:51:57.806093 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.806153 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806222 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Mar 4 08:51:57.806280 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.806337 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806396 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Mar 4 08:51:57.806476 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.806536 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806595 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Mar 4 08:51:57.806652 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.806709 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806770 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Mar 4 08:51:57.806827 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.806885 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.806944 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Mar 4 08:51:57.807014 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807073 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.807132 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Mar 4 08:51:57.807198 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807260 kernel: pci 0000:00:04.2: bridge window [io size 
0x1000]: failed to assign Mar 4 08:51:57.807318 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Mar 4 08:51:57.807377 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807434 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.807492 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Mar 4 08:51:57.807551 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807611 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.807674 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Mar 4 08:51:57.807742 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807803 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.807863 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Mar 4 08:51:57.807920 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.807980 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.808039 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Mar 4 08:51:57.808097 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.808154 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.808224 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Mar 4 08:51:57.808283 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.808342 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.808400 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Mar 4 08:51:57.808458 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Mar 4 08:51:57.808517 kernel: pci 
0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Mar 4 08:51:57.808577 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Mar 4 08:51:57.808636 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Mar 4 08:51:57.808697 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Mar 4 08:51:57.808760 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Mar 4 08:51:57.808836 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Mar 4 08:51:57.808898 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Mar 4 08:51:57.808960 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Mar 4 08:51:57.809036 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Mar 4 08:51:57.809094 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Mar 4 08:51:57.809154 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Mar 4 08:51:57.809235 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Mar 4 08:51:57.809299 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Mar 4 08:51:57.809360 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.809417 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.809475 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.809538 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.809596 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.809654 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.809715 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.809772 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.809838 kernel: pci 0000:00:02.5: 
bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.809898 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.809958 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810016 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810075 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810132 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810213 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810272 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810330 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810387 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810446 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810517 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810577 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810634 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810695 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810753 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810811 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810869 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.810926 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.810983 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.811043 kernel: pci 0000:00:01.3: 
bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.811101 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.811160 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.811232 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.811291 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.811349 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.811412 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Mar 4 08:51:57.811469 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Mar 4 08:51:57.811538 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Mar 4 08:51:57.811600 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Mar 4 08:51:57.811660 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Mar 4 08:51:57.811728 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 4 08:51:57.811835 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Mar 4 08:51:57.811899 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Mar 4 08:51:57.811967 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Mar 4 08:51:57.812026 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Mar 4 08:51:57.812114 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Mar 4 08:51:57.812186 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Mar 4 08:51:57.812252 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Mar 4 08:51:57.812312 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Mar 4 08:51:57.812374 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Mar 4 08:51:57.812431 kernel: pci 0000:00:01.2: 
bridge window [mem 0x10400000-0x105fffff] Mar 4 08:51:57.812488 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Mar 4 08:51:57.812553 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Mar 4 08:51:57.812611 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Mar 4 08:51:57.812668 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Mar 4 08:51:57.812726 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Mar 4 08:51:57.812790 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Mar 4 08:51:57.812853 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Mar 4 08:51:57.812911 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Mar 4 08:51:57.812969 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Mar 4 08:51:57.813026 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Mar 4 08:51:57.813092 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Mar 4 08:51:57.813153 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Mar 4 08:51:57.813257 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Mar 4 08:51:57.813321 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Mar 4 08:51:57.813383 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 4 08:51:57.813444 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Mar 4 08:51:57.813520 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Mar 4 08:51:57.813582 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 4 08:51:57.813643 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Mar 4 08:51:57.813703 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Mar 4 08:51:57.813762 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 4 
08:51:57.813820 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Mar 4 08:51:57.813879 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Mar 4 08:51:57.813937 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Mar 4 08:51:57.813996 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Mar 4 08:51:57.814054 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Mar 4 08:51:57.814114 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Mar 4 08:51:57.814184 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Mar 4 08:51:57.814247 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Mar 4 08:51:57.814316 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Mar 4 08:51:57.814376 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Mar 4 08:51:57.814433 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Mar 4 08:51:57.814489 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Mar 4 08:51:57.814550 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Mar 4 08:51:57.814608 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Mar 4 08:51:57.814665 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Mar 4 08:51:57.814722 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Mar 4 08:51:57.814780 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Mar 4 08:51:57.814843 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 4 08:51:57.814905 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Mar 4 08:51:57.814968 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Mar 4 08:51:57.815025 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 4 08:51:57.815084 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Mar 4 08:51:57.815141 kernel: pci 0000:00:02.7: bridge window [mem 
0x11e00000-0x11ffffff] Mar 4 08:51:57.815210 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 4 08:51:57.815270 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Mar 4 08:51:57.815330 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Mar 4 08:51:57.815390 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Mar 4 08:51:57.815449 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Mar 4 08:51:57.815506 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff] Mar 4 08:51:57.815563 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Mar 4 08:51:57.815621 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Mar 4 08:51:57.815678 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Mar 4 08:51:57.815746 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Mar 4 08:51:57.815805 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Mar 4 08:51:57.815865 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Mar 4 08:51:57.815930 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Mar 4 08:51:57.815989 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Mar 4 08:51:57.816047 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Mar 4 08:51:57.816108 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Mar 4 08:51:57.816175 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Mar 4 08:51:57.816253 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Mar 4 08:51:57.816311 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Mar 4 08:51:57.816373 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Mar 4 08:51:57.816431 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Mar 4 08:51:57.816489 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Mar 4 08:51:57.816547 kernel: pci 0000:00:03.5: bridge 
window [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 4 08:51:57.816606 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Mar 4 08:51:57.816665 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Mar 4 08:51:57.816736 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Mar 4 08:51:57.816795 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 4 08:51:57.816854 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Mar 4 08:51:57.816915 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Mar 4 08:51:57.816971 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Mar 4 08:51:57.817028 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 4 08:51:57.817086 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Mar 4 08:51:57.817144 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Mar 4 08:51:57.817224 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Mar 4 08:51:57.817283 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Mar 4 08:51:57.817341 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Mar 4 08:51:57.817400 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Mar 4 08:51:57.817456 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Mar 4 08:51:57.817513 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Mar 4 08:51:57.817570 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Mar 4 08:51:57.817628 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Mar 4 08:51:57.817684 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Mar 4 08:51:57.817741 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Mar 4 08:51:57.817799 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Mar 4 08:51:57.817861 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Mar 4 08:51:57.817918 kernel: pci 0000:00:04.3: bridge window 
[mem 0x13600000-0x137fffff] Mar 4 08:51:57.817975 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Mar 4 08:51:57.818035 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Mar 4 08:51:57.818093 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Mar 4 08:51:57.818150 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Mar 4 08:51:57.818218 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Mar 4 08:51:57.818278 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Mar 4 08:51:57.818336 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Mar 4 08:51:57.818395 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Mar 4 08:51:57.818452 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 4 08:51:57.818509 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Mar 4 08:51:57.818567 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Mar 4 08:51:57.818623 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Mar 4 08:51:57.818680 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 4 08:51:57.818738 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Mar 4 08:51:57.818795 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Mar 4 08:51:57.818854 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Mar 4 08:51:57.818912 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 4 08:51:57.818970 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Mar 4 08:51:57.819028 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Mar 4 08:51:57.819085 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Mar 4 08:51:57.819142 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Mar 4 08:51:57.819210 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 4 08:51:57.819265 kernel: pci_bus 
0000:00: resource 5 [io 0x0000-0xffff window] Mar 4 08:51:57.819319 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 4 08:51:57.819384 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Mar 4 08:51:57.819439 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Mar 4 08:51:57.819499 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Mar 4 08:51:57.819553 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Mar 4 08:51:57.819614 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Mar 4 08:51:57.819671 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Mar 4 08:51:57.819748 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Mar 4 08:51:57.819805 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Mar 4 08:51:57.819873 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Mar 4 08:51:57.819927 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Mar 4 08:51:57.819988 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Mar 4 08:51:57.820046 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 4 08:51:57.820107 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Mar 4 08:51:57.820161 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 4 08:51:57.820231 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Mar 4 08:51:57.820290 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 4 08:51:57.820350 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Mar 4 08:51:57.820407 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Mar 4 08:51:57.820469 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Mar 4 08:51:57.820522 kernel: pci_bus 0000:0a: resource 2 [mem 
0x8001200000-0x80013fffff 64bit pref] Mar 4 08:51:57.820583 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Mar 4 08:51:57.820638 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Mar 4 08:51:57.820700 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Mar 4 08:51:57.820755 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Mar 4 08:51:57.820817 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Mar 4 08:51:57.820872 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Mar 4 08:51:57.820937 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Mar 4 08:51:57.820992 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 4 08:51:57.821051 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Mar 4 08:51:57.821105 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 4 08:51:57.821181 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Mar 4 08:51:57.821239 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 4 08:51:57.821300 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Mar 4 08:51:57.821353 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Mar 4 08:51:57.821413 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Mar 4 08:51:57.821469 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Mar 4 08:51:57.821528 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Mar 4 08:51:57.821581 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Mar 4 08:51:57.821633 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Mar 4 08:51:57.821695 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Mar 4 08:51:57.821748 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Mar 4 08:51:57.821803 kernel: pci_bus 
0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Mar 4 08:51:57.821862 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Mar 4 08:51:57.821916 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Mar 4 08:51:57.821969 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Mar 4 08:51:57.822029 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Mar 4 08:51:57.822082 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Mar 4 08:51:57.822134 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 4 08:51:57.822233 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Mar 4 08:51:57.822290 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Mar 4 08:51:57.822343 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 4 08:51:57.822404 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Mar 4 08:51:57.822457 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Mar 4 08:51:57.822509 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 4 08:51:57.822568 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Mar 4 08:51:57.822625 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Mar 4 08:51:57.822678 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Mar 4 08:51:57.822736 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Mar 4 08:51:57.822790 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Mar 4 08:51:57.822843 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Mar 4 08:51:57.822916 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Mar 4 08:51:57.822972 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Mar 4 08:51:57.823028 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Mar 4 08:51:57.823087 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Mar 4 
08:51:57.823141 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Mar 4 08:51:57.823209 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Mar 4 08:51:57.823272 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Mar 4 08:51:57.823328 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Mar 4 08:51:57.823383 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Mar 4 08:51:57.823444 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Mar 4 08:51:57.823498 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Mar 4 08:51:57.823550 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 4 08:51:57.823613 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Mar 4 08:51:57.823670 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Mar 4 08:51:57.823736 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 4 08:51:57.823800 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Mar 4 08:51:57.823854 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Mar 4 08:51:57.823907 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 4 08:51:57.823968 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Mar 4 08:51:57.824021 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Mar 4 08:51:57.824074 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Mar 4 08:51:57.824083 kernel: iommu: Default domain type: Translated Mar 4 08:51:57.824093 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 4 08:51:57.824100 kernel: efivars: Registered efivars operations Mar 4 08:51:57.824108 kernel: vgaarb: loaded Mar 4 08:51:57.824116 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 4 08:51:57.824123 kernel: VFS: Disk quotas dquot_6.6.0 Mar 4 08:51:57.824131 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 
bytes) Mar 4 08:51:57.824138 kernel: pnp: PnP ACPI init Mar 4 08:51:57.824215 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 4 08:51:57.824227 kernel: pnp: PnP ACPI: found 1 devices Mar 4 08:51:57.824237 kernel: NET: Registered PF_INET protocol family Mar 4 08:51:57.824244 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 4 08:51:57.824252 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Mar 4 08:51:57.824260 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 4 08:51:57.824268 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 4 08:51:57.824275 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Mar 4 08:51:57.824283 kernel: TCP: Hash tables configured (established 131072 bind 65536) Mar 4 08:51:57.824291 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 4 08:51:57.824300 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 4 08:51:57.824307 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 4 08:51:57.824374 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Mar 4 08:51:57.824384 kernel: PCI: CLS 0 bytes, default 64 Mar 4 08:51:57.824392 kernel: kvm [1]: HYP mode not available Mar 4 08:51:57.824399 kernel: Initialise system trusted keyrings Mar 4 08:51:57.824407 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Mar 4 08:51:57.824414 kernel: Key type asymmetric registered Mar 4 08:51:57.824421 kernel: Asymmetric key parser 'x509' registered Mar 4 08:51:57.824430 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 4 08:51:57.824438 kernel: io scheduler mq-deadline registered Mar 4 08:51:57.824445 kernel: io scheduler kyber registered Mar 4 08:51:57.824452 kernel: io scheduler bfq registered Mar 4 08:51:57.824460 kernel: ACPI: \_SB_.L001: Enabled at 
IRQ 36 Mar 4 08:51:57.824522 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Mar 4 08:51:57.824583 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Mar 4 08:51:57.824641 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.824702 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Mar 4 08:51:57.824765 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Mar 4 08:51:57.824823 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.824884 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Mar 4 08:51:57.824945 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Mar 4 08:51:57.825003 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825065 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Mar 4 08:51:57.825123 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Mar 4 08:51:57.825190 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825257 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Mar 4 08:51:57.825316 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Mar 4 08:51:57.825374 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825434 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Mar 4 08:51:57.825493 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Mar 4 08:51:57.825552 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825612 kernel: 
pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Mar 4 08:51:57.825671 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Mar 4 08:51:57.825728 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825788 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Mar 4 08:51:57.825846 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Mar 4 08:51:57.825904 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.825914 kernel: ACPI: \_SB_.L002: Enabled at IRQ 37 Mar 4 08:51:57.825974 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Mar 4 08:51:57.826035 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Mar 4 08:51:57.826094 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.826156 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Mar 4 08:51:57.826222 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Mar 4 08:51:57.826279 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.826339 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Mar 4 08:51:57.826396 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Mar 4 08:51:57.826453 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.826513 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Mar 4 08:51:57.826573 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Mar 4 08:51:57.826629 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ 
Mar 4 08:51:57.826689 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Mar 4 08:51:57.826746 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Mar 4 08:51:57.826803 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.826864 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Mar 4 08:51:57.826921 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Mar 4 08:51:57.826978 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.827041 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Mar 4 08:51:57.827097 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Mar 4 08:51:57.827154 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.827227 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Mar 4 08:51:57.827315 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Mar 4 08:51:57.827373 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.827382 kernel: ACPI: \_SB_.L003: Enabled at IRQ 38 Mar 4 08:51:57.827441 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Mar 4 08:51:57.827502 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Mar 4 08:51:57.827559 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.827619 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Mar 4 08:51:57.827676 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Mar 4 08:51:57.827802 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.827871 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Mar 4 08:51:57.827929 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Mar 4 08:51:57.827991 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.828051 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Mar 4 08:51:57.828109 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Mar 4 08:51:57.828183 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.828252 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Mar 4 08:51:57.828312 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Mar 4 08:51:57.828369 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.828430 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Mar 4 08:51:57.828492 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Mar 4 08:51:57.828549 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.828610 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Mar 4 08:51:57.828670 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Mar 4 08:51:57.828728 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.828793 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Mar 4 08:51:57.828851 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Mar 4 08:51:57.828910 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 
08:51:57.828922 kernel: ACPI: \_SB_.L000: Enabled at IRQ 35 Mar 4 08:51:57.828981 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Mar 4 08:51:57.829038 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Mar 4 08:51:57.829096 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.829157 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Mar 4 08:51:57.829224 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Mar 4 08:51:57.829283 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.829344 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Mar 4 08:51:57.829404 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Mar 4 08:51:57.829461 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.829521 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Mar 4 08:51:57.829579 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Mar 4 08:51:57.829636 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.829696 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Mar 4 08:51:57.829753 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Mar 4 08:51:57.829813 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.829874 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Mar 4 08:51:57.829933 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Mar 4 08:51:57.830009 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.830074 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Mar 4 08:51:57.830134 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Mar 4 08:51:57.830214 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.830279 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Mar 4 08:51:57.830341 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Mar 4 08:51:57.830400 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.830460 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Mar 4 08:51:57.830520 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Mar 4 08:51:57.830578 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 08:51:57.830587 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 4 08:51:57.830595 kernel: ACPI: button: Power Button [PWRB] Mar 4 08:51:57.830658 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Mar 4 08:51:57.830725 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 4 08:51:57.830735 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 4 08:51:57.830743 kernel: thunder_xcv, ver 1.0 Mar 4 08:51:57.830750 kernel: thunder_bgx, ver 1.0 Mar 4 08:51:57.830758 kernel: nicpf, ver 1.0 Mar 4 08:51:57.830765 kernel: nicvf, ver 1.0 Mar 4 08:51:57.830833 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 4 08:51:57.830888 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-04T08:51:57 UTC (1772614317) Mar 4 08:51:57.830918 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 4 08:51:57.830927 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters 
available Mar 4 08:51:57.830934 kernel: watchdog: NMI not fully supported Mar 4 08:51:57.830942 kernel: watchdog: Hard watchdog permanently disabled Mar 4 08:51:57.830950 kernel: NET: Registered PF_INET6 protocol family Mar 4 08:51:57.830958 kernel: Segment Routing with IPv6 Mar 4 08:51:57.830966 kernel: In-situ OAM (IOAM) with IPv6 Mar 4 08:51:57.830973 kernel: NET: Registered PF_PACKET protocol family Mar 4 08:51:57.830981 kernel: Key type dns_resolver registered Mar 4 08:51:57.830989 kernel: registered taskstats version 1 Mar 4 08:51:57.830998 kernel: Loading compiled-in X.509 certificates Mar 4 08:51:57.831006 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 14a741e1e2b172e51b42fe87d143cf4cae2ad92c' Mar 4 08:51:57.831013 kernel: Demotion targets for Node 0: null Mar 4 08:51:57.831021 kernel: Key type .fscrypt registered Mar 4 08:51:57.831030 kernel: Key type fscrypt-provisioning registered Mar 4 08:51:57.831038 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 4 08:51:57.831046 kernel: ima: Allocated hash algorithm: sha1 Mar 4 08:51:57.831054 kernel: ima: No architecture policies found Mar 4 08:51:57.831063 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 4 08:51:57.831070 kernel: clk: Disabling unused clocks Mar 4 08:51:57.831078 kernel: PM: genpd: Disabling unused power domains Mar 4 08:51:57.831086 kernel: Warning: unable to open an initial console. Mar 4 08:51:57.831094 kernel: Freeing unused kernel memory: 39552K Mar 4 08:51:57.831101 kernel: Run /init as init process Mar 4 08:51:57.831109 kernel: with arguments: Mar 4 08:51:57.831117 kernel: /init Mar 4 08:51:57.831124 kernel: with environment: Mar 4 08:51:57.831133 kernel: HOME=/ Mar 4 08:51:57.831141 kernel: TERM=linux Mar 4 08:51:57.831149 systemd[1]: Successfully made /usr/ read-only. 
Mar 4 08:51:57.831160 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 4 08:51:57.831182 systemd[1]: Detected virtualization kvm.
Mar 4 08:51:57.831191 systemd[1]: Detected architecture arm64.
Mar 4 08:51:57.831199 systemd[1]: Running in initrd.
Mar 4 08:51:57.831208 systemd[1]: No hostname configured, using default hostname.
Mar 4 08:51:57.831217 systemd[1]: Hostname set to .
Mar 4 08:51:57.831230 systemd[1]: Initializing machine ID from VM UUID.
Mar 4 08:51:57.831238 systemd[1]: Queued start job for default target initrd.target.
Mar 4 08:51:57.831246 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 08:51:57.831255 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 08:51:57.831264 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 4 08:51:57.831272 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 4 08:51:57.831282 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 4 08:51:57.831291 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 4 08:51:57.831301 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 4 08:51:57.831309 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 4 08:51:57.831318 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 08:51:57.831326 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 4 08:51:57.831335 systemd[1]: Reached target paths.target - Path Units.
Mar 4 08:51:57.831345 systemd[1]: Reached target slices.target - Slice Units.
Mar 4 08:51:57.831354 systemd[1]: Reached target swap.target - Swaps.
Mar 4 08:51:57.831362 systemd[1]: Reached target timers.target - Timer Units.
Mar 4 08:51:57.831370 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 4 08:51:57.831379 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 4 08:51:57.831389 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 4 08:51:57.831397 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 4 08:51:57.831406 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 08:51:57.831414 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 4 08:51:57.831423 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 08:51:57.831432 systemd[1]: Reached target sockets.target - Socket Units.
Mar 4 08:51:57.831440 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 4 08:51:57.831448 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 4 08:51:57.831456 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 4 08:51:57.831464 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 4 08:51:57.831473 systemd[1]: Starting systemd-fsck-usr.service...
Mar 4 08:51:57.831481 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 4 08:51:57.831490 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 4 08:51:57.831498 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 08:51:57.831507 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 4 08:51:57.831515 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 08:51:57.831525 systemd[1]: Finished systemd-fsck-usr.service.
Mar 4 08:51:57.831556 systemd-journald[313]: Collecting audit messages is disabled.
Mar 4 08:51:57.831577 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 4 08:51:57.831585 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 4 08:51:57.831597 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 08:51:57.831605 kernel: Bridge firewalling registered
Mar 4 08:51:57.831613 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 4 08:51:57.831622 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 4 08:51:57.831630 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 4 08:51:57.831639 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 4 08:51:57.831647 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 4 08:51:57.831655 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 4 08:51:57.831665 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 08:51:57.831673 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 08:51:57.831681 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 4 08:51:57.831690 systemd-journald[313]: Journal started
Mar 4 08:51:57.831720 systemd-journald[313]: Runtime Journal (/run/log/journal/b76944c6b89e42cea806ca42d9fecc5f) is 8M, max 319.5M, 311.5M free.
Mar 4 08:51:57.778506 systemd-modules-load[314]: Inserted module 'overlay'
Mar 4 08:51:57.833277 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 4 08:51:57.793334 systemd-modules-load[314]: Inserted module 'br_netfilter'
Mar 4 08:51:57.838148 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 4 08:51:57.846691 systemd-tmpfiles[345]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 4 08:51:57.849524 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 08:51:57.851635 dracut-cmdline[344]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9550c2083f3062ad7c57f28a015a3afab95dfddb073076612b771af8d5df9e06
Mar 4 08:51:57.852137 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 4 08:51:57.896745 systemd-resolved[371]: Positive Trust Anchors:
Mar 4 08:51:57.896766 systemd-resolved[371]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 4 08:51:57.896796 systemd-resolved[371]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 4 08:51:57.902296 systemd-resolved[371]: Defaulting to hostname 'linux'.
Mar 4 08:51:57.903232 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 4 08:51:57.905784 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 4 08:51:57.926197 kernel: SCSI subsystem initialized
Mar 4 08:51:57.931183 kernel: Loading iSCSI transport class v2.0-870.
Mar 4 08:51:57.938192 kernel: iscsi: registered transport (tcp)
Mar 4 08:51:57.951407 kernel: iscsi: registered transport (qla4xxx)
Mar 4 08:51:57.951428 kernel: QLogic iSCSI HBA Driver
Mar 4 08:51:57.968160 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 4 08:51:57.986202 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 4 08:51:57.987553 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 4 08:51:58.035330 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 4 08:51:58.037104 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 4 08:51:58.094228 kernel: raid6: neonx8 gen() 15701 MB/s
Mar 4 08:51:58.111224 kernel: raid6: neonx4 gen() 15730 MB/s
Mar 4 08:51:58.128226 kernel: raid6: neonx2 gen() 13223 MB/s
Mar 4 08:51:58.145219 kernel: raid6: neonx1 gen() 10432 MB/s
Mar 4 08:51:58.162214 kernel: raid6: int64x8 gen() 6856 MB/s
Mar 4 08:51:58.179216 kernel: raid6: int64x4 gen() 7309 MB/s
Mar 4 08:51:58.196192 kernel: raid6: int64x2 gen() 6035 MB/s
Mar 4 08:51:58.213192 kernel: raid6: int64x1 gen() 5036 MB/s
Mar 4 08:51:58.213209 kernel: raid6: using algorithm neonx4 gen() 15730 MB/s
Mar 4 08:51:58.230219 kernel: raid6: .... xor() 12314 MB/s, rmw enabled
Mar 4 08:51:58.230268 kernel: raid6: using neon recovery algorithm
Mar 4 08:51:58.235579 kernel: xor: measuring software checksum speed
Mar 4 08:51:58.235632 kernel: 8regs : 21590 MB/sec
Mar 4 08:51:58.236185 kernel: 32regs : 20363 MB/sec
Mar 4 08:51:58.237214 kernel: arm64_neon : 28041 MB/sec
Mar 4 08:51:58.237228 kernel: xor: using function: arm64_neon (28041 MB/sec)
Mar 4 08:51:58.290223 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 4 08:51:58.296205 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 4 08:51:58.299841 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 08:51:58.326271 systemd-udevd[565]: Using default interface naming scheme 'v255'.
Mar 4 08:51:58.330299 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 08:51:58.332996 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 4 08:51:58.362230 dracut-pre-trigger[575]: rd.md=0: removing MD RAID activation
Mar 4 08:51:58.385355 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 4 08:51:58.387693 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 4 08:51:58.465203 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 08:51:58.467624 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 4 08:51:58.513189 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Mar 4 08:51:58.518863 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Mar 4 08:51:58.526184 kernel: ACPI: bus type USB registered
Mar 4 08:51:58.526220 kernel: usbcore: registered new interface driver usbfs
Mar 4 08:51:58.526237 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 4 08:51:58.528213 kernel: GPT:17805311 != 104857599
Mar 4 08:51:58.528243 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 4 08:51:58.528259 kernel: usbcore: registered new interface driver hub
Mar 4 08:51:58.530245 kernel: GPT:17805311 != 104857599
Mar 4 08:51:58.530283 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 4 08:51:58.531243 kernel: usbcore: registered new device driver usb
Mar 4 08:51:58.531267 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 08:51:58.550742 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 4 08:51:58.550951 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 4 08:51:58.553186 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 4 08:51:58.554912 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 4 08:51:58.555072 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 4 08:51:58.555998 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 4 08:51:58.558041 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 4 08:51:58.556117 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 08:51:58.560882 kernel: hub 1-0:1.0: USB hub found
Mar 4 08:51:58.561068 kernel: hub 1-0:1.0: 4 ports detected
Mar 4 08:51:58.560447 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 08:51:58.564039 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 4 08:51:58.564223 kernel: hub 2-0:1.0: USB hub found
Mar 4 08:51:58.564314 kernel: hub 2-0:1.0: 4 ports detected
Mar 4 08:51:58.563547 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 08:51:58.588890 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 08:51:58.618826 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 4 08:51:58.620367 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 4 08:51:58.628905 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 4 08:51:58.636377 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 4 08:51:58.637585 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 4 08:51:58.646522 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 4 08:51:58.647761 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 4 08:51:58.649774 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 08:51:58.651815 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 4 08:51:58.654526 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 4 08:51:58.656582 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 4 08:51:58.670416 disk-uuid[663]: Primary Header is updated.
Mar 4 08:51:58.670416 disk-uuid[663]: Secondary Entries is updated.
Mar 4 08:51:58.670416 disk-uuid[663]: Secondary Header is updated.
Mar 4 08:51:58.676748 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 4 08:51:58.680260 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 08:51:58.804201 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 4 08:51:58.934955 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Mar 4 08:51:58.935060 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 4 08:51:58.935504 kernel: usbcore: registered new interface driver usbhid
Mar 4 08:51:58.935543 kernel: usbhid: USB HID core driver
Mar 4 08:51:59.041213 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Mar 4 08:51:59.167213 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Mar 4 08:51:59.219229 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Mar 4 08:51:59.691126 disk-uuid[666]: The operation has completed successfully.
Mar 4 08:51:59.692582 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 08:51:59.733185 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 4 08:51:59.733284 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 4 08:51:59.759072 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 4 08:51:59.781811 sh[686]: Success
Mar 4 08:51:59.795796 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 4 08:51:59.795836 kernel: device-mapper: uevent: version 1.0.3
Mar 4 08:51:59.795847 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 4 08:51:59.803239 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Mar 4 08:51:59.859240 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 4 08:51:59.861864 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 4 08:51:59.878322 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 4 08:51:59.896200 kernel: BTRFS: device fsid 639fb782-fb4f-4fdd-a572-72667a093996 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (698)
Mar 4 08:51:59.898006 kernel: BTRFS info (device dm-0): first mount of filesystem 639fb782-fb4f-4fdd-a572-72667a093996
Mar 4 08:51:59.898026 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 4 08:51:59.909264 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 4 08:51:59.909286 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 4 08:51:59.912330 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 4 08:51:59.913499 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 4 08:51:59.914544 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 4 08:51:59.915300 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 4 08:51:59.917501 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 4 08:51:59.944883 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (729)
Mar 4 08:51:59.944937 kernel: BTRFS info (device vda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 4 08:51:59.945811 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 4 08:51:59.950568 kernel: BTRFS info (device vda6): turning on async discard
Mar 4 08:51:59.950610 kernel: BTRFS info (device vda6): enabling free space tree
Mar 4 08:51:59.955200 kernel: BTRFS info (device vda6): last unmount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 4 08:51:59.956318 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 4 08:51:59.958361 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 4 08:52:00.019674 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 4 08:52:00.022561 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 4 08:52:00.058578 systemd-networkd[869]: lo: Link UP
Mar 4 08:52:00.058595 systemd-networkd[869]: lo: Gained carrier
Mar 4 08:52:00.059524 systemd-networkd[869]: Enumeration completed
Mar 4 08:52:00.059974 systemd-networkd[869]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 08:52:00.059977 systemd-networkd[869]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 4 08:52:00.060365 systemd-networkd[869]: eth0: Link UP
Mar 4 08:52:00.060757 systemd-networkd[869]: eth0: Gained carrier
Mar 4 08:52:00.060767 systemd-networkd[869]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 08:52:00.061056 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 4 08:52:00.062186 systemd[1]: Reached target network.target - Network.
Mar 4 08:52:00.081261 systemd-networkd[869]: eth0: DHCPv4 address 10.0.9.143/25, gateway 10.0.9.129 acquired from 10.0.9.129
Mar 4 08:52:00.092350 ignition[787]: Ignition 2.22.0
Mar 4 08:52:00.092367 ignition[787]: Stage: fetch-offline
Mar 4 08:52:00.092407 ignition[787]: no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:00.092414 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:00.095370 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 4 08:52:00.092497 ignition[787]: parsed url from cmdline: ""
Mar 4 08:52:00.097265 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 4 08:52:00.092500 ignition[787]: no config URL provided
Mar 4 08:52:00.092504 ignition[787]: reading system config file "/usr/lib/ignition/user.ign"
Mar 4 08:52:00.092511 ignition[787]: no config at "/usr/lib/ignition/user.ign"
Mar 4 08:52:00.092516 ignition[787]: failed to fetch config: resource requires networking
Mar 4 08:52:00.092661 ignition[787]: Ignition finished successfully
Mar 4 08:52:00.131291 ignition[881]: Ignition 2.22.0
Mar 4 08:52:00.131309 ignition[881]: Stage: fetch
Mar 4 08:52:00.131442 ignition[881]: no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:00.131451 ignition[881]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:00.131530 ignition[881]: parsed url from cmdline: ""
Mar 4 08:52:00.131533 ignition[881]: no config URL provided
Mar 4 08:52:00.131537 ignition[881]: reading system config file "/usr/lib/ignition/user.ign"
Mar 4 08:52:00.131543 ignition[881]: no config at "/usr/lib/ignition/user.ign"
Mar 4 08:52:00.131802 ignition[881]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 4 08:52:00.131882 ignition[881]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 4 08:52:00.132109 ignition[881]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 4 08:52:01.132036 ignition[881]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 4 08:52:01.132322 ignition[881]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 4 08:52:01.292116 systemd-networkd[869]: eth0: Gained IPv6LL
Mar 4 08:52:01.617108 ignition[881]: GET result: OK
Mar 4 08:52:01.617228 ignition[881]: parsing config with SHA512: 031f9817b1997337bdb0ed1b63bd0675f94a8b9026e16eac82788f89968f6d4c354d38cdf7c6b6797025f90086f473d558c78b597c620b03e183329eb0bee294
Mar 4 08:52:01.621773 unknown[881]: fetched base config from "system"
Mar 4 08:52:01.621785 unknown[881]: fetched base config from "system"
Mar 4 08:52:01.622110 ignition[881]: fetch: fetch complete
Mar 4 08:52:01.621790 unknown[881]: fetched user config from "openstack"
Mar 4 08:52:01.622114 ignition[881]: fetch: fetch passed
Mar 4 08:52:01.624973 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 4 08:52:01.622152 ignition[881]: Ignition finished successfully
Mar 4 08:52:01.626987 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 4 08:52:01.665335 ignition[889]: Ignition 2.22.0
Mar 4 08:52:01.665353 ignition[889]: Stage: kargs
Mar 4 08:52:01.665487 ignition[889]: no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:01.665495 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:01.666229 ignition[889]: kargs: kargs passed
Mar 4 08:52:01.666273 ignition[889]: Ignition finished successfully
Mar 4 08:52:01.668823 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 4 08:52:01.670641 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 4 08:52:01.703883 ignition[897]: Ignition 2.22.0
Mar 4 08:52:01.703900 ignition[897]: Stage: disks
Mar 4 08:52:01.704024 ignition[897]: no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:01.704034 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:01.704740 ignition[897]: disks: disks passed
Mar 4 08:52:01.706659 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 4 08:52:01.704784 ignition[897]: Ignition finished successfully
Mar 4 08:52:01.708125 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 4 08:52:01.709885 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 4 08:52:01.711330 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 4 08:52:01.712889 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 4 08:52:01.714598 systemd[1]: Reached target basic.target - Basic System.
Mar 4 08:52:01.716886 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 4 08:52:01.749011 systemd-fsck[907]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 4 08:52:01.752183 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 4 08:52:01.754455 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 4 08:52:01.858220 kernel: EXT4-fs (vda9): mounted filesystem f44cfd4f-a1a9-472a-86a7-c3154f299e07 r/w with ordered data mode. Quota mode: none.
Mar 4 08:52:01.859258 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 4 08:52:01.860415 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 4 08:52:01.863459 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 4 08:52:01.875533 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 4 08:52:01.876657 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 4 08:52:01.877299 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 4 08:52:01.880953 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 4 08:52:01.881851 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 4 08:52:01.884821 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 4 08:52:01.886853 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 4 08:52:01.901205 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (915)
Mar 4 08:52:01.903798 kernel: BTRFS info (device vda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 4 08:52:01.903835 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 4 08:52:01.908193 kernel: BTRFS info (device vda6): turning on async discard
Mar 4 08:52:01.908215 kernel: BTRFS info (device vda6): enabling free space tree
Mar 4 08:52:01.910218 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 4 08:52:01.930197 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:01.933069 initrd-setup-root[943]: cut: /sysroot/etc/passwd: No such file or directory
Mar 4 08:52:01.939792 initrd-setup-root[950]: cut: /sysroot/etc/group: No such file or directory
Mar 4 08:52:01.944594 initrd-setup-root[957]: cut: /sysroot/etc/shadow: No such file or directory
Mar 4 08:52:01.948851 initrd-setup-root[964]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 4 08:52:02.034005 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 4 08:52:02.038273 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 4 08:52:02.040784 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 4 08:52:02.052968 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 4 08:52:02.055233 kernel: BTRFS info (device vda6): last unmount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 4 08:52:02.071409 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 4 08:52:02.087290 ignition[1034]: INFO : Ignition 2.22.0
Mar 4 08:52:02.087290 ignition[1034]: INFO : Stage: mount
Mar 4 08:52:02.088796 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:02.088796 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:02.088796 ignition[1034]: INFO : mount: mount passed
Mar 4 08:52:02.088796 ignition[1034]: INFO : Ignition finished successfully
Mar 4 08:52:02.089864 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 4 08:52:02.965218 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:04.975204 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:08.983211 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:08.990943 coreos-metadata[917]: Mar 04 08:52:08.990 WARN failed to locate config-drive, using the metadata service API instead
Mar 4 08:52:09.007356 coreos-metadata[917]: Mar 04 08:52:09.007 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 4 08:52:09.736192 coreos-metadata[917]: Mar 04 08:52:09.736 INFO Fetch successful
Mar 4 08:52:09.736192 coreos-metadata[917]: Mar 04 08:52:09.736 INFO wrote hostname ci-4459-2-4-2-039fb286b9 to /sysroot/etc/hostname
Mar 4 08:52:09.738844 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 4 08:52:09.740207 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 4 08:52:09.742135 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 4 08:52:09.766155 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 4 08:52:09.799191 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1051)
Mar 4 08:52:09.801206 kernel: BTRFS info (device vda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 4 08:52:09.801237 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 4 08:52:09.805309 kernel: BTRFS info (device vda6): turning on async discard
Mar 4 08:52:09.805328 kernel: BTRFS info (device vda6): enabling free space tree
Mar 4 08:52:09.806895 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 4 08:52:09.833757 ignition[1069]: INFO : Ignition 2.22.0
Mar 4 08:52:09.833757 ignition[1069]: INFO : Stage: files
Mar 4 08:52:09.835343 ignition[1069]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:09.835343 ignition[1069]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:09.835343 ignition[1069]: DEBUG : files: compiled without relabeling support, skipping
Mar 4 08:52:09.838401 ignition[1069]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 4 08:52:09.838401 ignition[1069]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 4 08:52:09.838401 ignition[1069]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 4 08:52:09.841973 ignition[1069]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 4 08:52:09.841973 ignition[1069]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 4 08:52:09.841973 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 4 08:52:09.841973 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 4 08:52:09.838935 unknown[1069]: wrote ssh authorized keys file for user: core
Mar 4 08:52:09.883807 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 4 08:52:10.002017 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 4 08:52:10.002017 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 4 08:52:10.005563 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 4 08:52:10.020409 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 4 08:52:10.020409 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 4 08:52:10.020409 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-arm64.raw: attempt #1
Mar 4 08:52:10.417307 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 4 08:52:11.979402 ignition[1069]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 4 08:52:11.979402 ignition[1069]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 4 08:52:11.983150 ignition[1069]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 4 08:52:11.987096 ignition[1069]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 4 08:52:11.987096 ignition[1069]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 4 08:52:11.987096 ignition[1069]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 4 08:52:11.993258 ignition[1069]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 4 08:52:11.993258 ignition[1069]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 4 08:52:11.993258 ignition[1069]: INFO : files: createResultFile: createFiles: op(e): [finished] writing
file "/sysroot/etc/.ignition-result.json" Mar 4 08:52:11.993258 ignition[1069]: INFO : files: files passed Mar 4 08:52:11.993258 ignition[1069]: INFO : Ignition finished successfully Mar 4 08:52:11.992459 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 4 08:52:11.995010 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 4 08:52:11.996713 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 4 08:52:12.008749 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 4 08:52:12.008843 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 4 08:52:12.014256 initrd-setup-root-after-ignition[1101]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 4 08:52:12.014256 initrd-setup-root-after-ignition[1101]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 4 08:52:12.017129 initrd-setup-root-after-ignition[1105]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 4 08:52:12.016593 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 4 08:52:12.018388 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 4 08:52:12.022257 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 4 08:52:12.071484 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 4 08:52:12.071601 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 4 08:52:12.073528 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 4 08:52:12.075066 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 4 08:52:12.076712 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Mar 4 08:52:12.077549 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 4 08:52:12.110904 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 4 08:52:12.113292 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 4 08:52:12.130664 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 4 08:52:12.131792 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 08:52:12.133532 systemd[1]: Stopped target timers.target - Timer Units.
Mar 4 08:52:12.134947 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 4 08:52:12.135070 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 4 08:52:12.137159 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 4 08:52:12.138984 systemd[1]: Stopped target basic.target - Basic System.
Mar 4 08:52:12.140381 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 4 08:52:12.141816 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 4 08:52:12.143421 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 4 08:52:12.145086 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 4 08:52:12.146753 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 4 08:52:12.148276 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 4 08:52:12.149999 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 4 08:52:12.151692 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 4 08:52:12.153092 systemd[1]: Stopped target swap.target - Swaps.
Mar 4 08:52:12.154396 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 4 08:52:12.154513 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 4 08:52:12.156503 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 4 08:52:12.158190 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 08:52:12.159939 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 4 08:52:12.160077 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 08:52:12.161752 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 4 08:52:12.161869 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 4 08:52:12.164238 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 4 08:52:12.164356 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 4 08:52:12.166052 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 4 08:52:12.166149 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 4 08:52:12.168435 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 4 08:52:12.169913 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 4 08:52:12.170040 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 08:52:12.173353 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 4 08:52:12.174585 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 4 08:52:12.174696 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 08:52:12.176501 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 4 08:52:12.176645 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 4 08:52:12.181767 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 4 08:52:12.182315 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 4 08:52:12.197846 ignition[1126]: INFO : Ignition 2.22.0
Mar 4 08:52:12.197846 ignition[1126]: INFO : Stage: umount
Mar 4 08:52:12.199431 ignition[1126]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 08:52:12.199431 ignition[1126]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 08:52:12.199431 ignition[1126]: INFO : umount: umount passed
Mar 4 08:52:12.199431 ignition[1126]: INFO : Ignition finished successfully
Mar 4 08:52:12.200580 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 4 08:52:12.200694 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 4 08:52:12.202144 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 4 08:52:12.202203 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 4 08:52:12.203594 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 4 08:52:12.203652 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 4 08:52:12.205055 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 4 08:52:12.205092 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 4 08:52:12.207136 systemd[1]: Stopped target network.target - Network.
Mar 4 08:52:12.209665 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 4 08:52:12.209734 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 4 08:52:12.211217 systemd[1]: Stopped target paths.target - Path Units.
Mar 4 08:52:12.212640 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 4 08:52:12.217228 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 08:52:12.218366 systemd[1]: Stopped target slices.target - Slice Units.
Mar 4 08:52:12.219662 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 4 08:52:12.221154 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 4 08:52:12.221208 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 4 08:52:12.223024 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 4 08:52:12.223054 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 4 08:52:12.224449 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 4 08:52:12.224505 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 4 08:52:12.225861 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 4 08:52:12.225899 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 4 08:52:12.227373 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 4 08:52:12.228850 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 4 08:52:12.230971 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 4 08:52:12.231526 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 4 08:52:12.231612 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 4 08:52:12.233194 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 4 08:52:12.233272 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 4 08:52:12.235755 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 4 08:52:12.235893 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 4 08:52:12.241300 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 4 08:52:12.241504 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 4 08:52:12.241589 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 4 08:52:12.244399 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 4 08:52:12.244915 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 4 08:52:12.246209 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 4 08:52:12.246256 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 08:52:12.248589 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 4 08:52:12.249338 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 4 08:52:12.249390 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 4 08:52:12.251157 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 4 08:52:12.251210 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 4 08:52:12.253943 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 4 08:52:12.253984 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 4 08:52:12.255710 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 4 08:52:12.255751 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 08:52:12.258396 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 08:52:12.260748 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 4 08:52:12.260802 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 4 08:52:12.276556 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 4 08:52:12.276675 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 4 08:52:12.278485 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 4 08:52:12.279253 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 08:52:12.281362 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 4 08:52:12.281427 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 4 08:52:12.282680 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 4 08:52:12.282713 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 08:52:12.284147 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 4 08:52:12.284245 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 4 08:52:12.286556 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 4 08:52:12.286608 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 4 08:52:12.288873 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 4 08:52:12.288943 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 08:52:12.292182 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 4 08:52:12.293846 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 4 08:52:12.293918 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 4 08:52:12.296867 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 4 08:52:12.296916 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 08:52:12.300014 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 4 08:52:12.300059 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 08:52:12.303834 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 4 08:52:12.303886 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 4 08:52:12.303921 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 4 08:52:12.305646 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 4 08:52:12.305769 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 4 08:52:12.307369 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 4 08:52:12.309990 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 4 08:52:12.328324 systemd[1]: Switching root.
Mar 4 08:52:12.364424 systemd-journald[313]: Journal stopped
Mar 4 08:52:13.892723 systemd-journald[313]: Received SIGTERM from PID 1 (systemd).
Mar 4 08:52:13.892797 kernel: SELinux: policy capability network_peer_controls=1
Mar 4 08:52:13.892809 kernel: SELinux: policy capability open_perms=1
Mar 4 08:52:13.892822 kernel: SELinux: policy capability extended_socket_class=1
Mar 4 08:52:13.892831 kernel: SELinux: policy capability always_check_network=0
Mar 4 08:52:13.892843 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 4 08:52:13.892852 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 4 08:52:13.892864 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 4 08:52:13.892879 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 4 08:52:13.892891 kernel: SELinux: policy capability userspace_initial_context=0
Mar 4 08:52:13.892900 kernel: audit: type=1403 audit(1772614333.167:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 4 08:52:13.892915 systemd[1]: Successfully loaded SELinux policy in 63.109ms.
Mar 4 08:52:13.892938 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.677ms.
Mar 4 08:52:13.892952 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 4 08:52:13.892965 systemd[1]: Detected virtualization kvm.
Mar 4 08:52:13.892979 systemd[1]: Detected architecture arm64.
Mar 4 08:52:13.892990 systemd[1]: Detected first boot.
Mar 4 08:52:13.893000 systemd[1]: Hostname set to .
Mar 4 08:52:13.893013 systemd[1]: Initializing machine ID from VM UUID.
Mar 4 08:52:13.893023 zram_generator::config[1174]: No configuration found.
Mar 4 08:52:13.893033 kernel: NET: Registered PF_VSOCK protocol family
Mar 4 08:52:13.893043 systemd[1]: Populated /etc with preset unit settings.
Mar 4 08:52:13.893053 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 4 08:52:13.893066 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 4 08:52:13.893077 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 4 08:52:13.893087 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 4 08:52:13.893097 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 4 08:52:13.893107 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 4 08:52:13.893117 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 4 08:52:13.893129 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 4 08:52:13.893139 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 4 08:52:13.893149 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 4 08:52:13.893159 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 4 08:52:13.893201 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 4 08:52:13.893222 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 08:52:13.893233 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 08:52:13.893244 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 4 08:52:13.893254 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 4 08:52:13.893265 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 4 08:52:13.893276 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 4 08:52:13.893290 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 4 08:52:13.893300 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 08:52:13.893310 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 4 08:52:13.893320 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 4 08:52:13.893330 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 4 08:52:13.893339 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 4 08:52:13.893349 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 4 08:52:13.893359 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 08:52:13.893370 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 4 08:52:13.893380 systemd[1]: Reached target slices.target - Slice Units.
Mar 4 08:52:13.893390 systemd[1]: Reached target swap.target - Swaps.
Mar 4 08:52:13.893400 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 4 08:52:13.893415 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 4 08:52:13.893425 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 4 08:52:13.893434 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 08:52:13.893444 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 4 08:52:13.893455 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 08:52:13.893465 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 4 08:52:13.893476 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 4 08:52:13.893486 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 4 08:52:13.893496 systemd[1]: Mounting media.mount - External Media Directory...
Mar 4 08:52:13.893507 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 4 08:52:13.893517 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 4 08:52:13.893527 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 4 08:52:13.893537 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 4 08:52:13.893547 systemd[1]: Reached target machines.target - Containers.
Mar 4 08:52:13.893558 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 4 08:52:13.893567 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 08:52:13.893577 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 4 08:52:13.893587 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 4 08:52:13.893598 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 4 08:52:13.893608 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 4 08:52:13.893618 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 4 08:52:13.893629 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 4 08:52:13.893643 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 4 08:52:13.893653 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 4 08:52:13.893663 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 4 08:52:13.893673 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 4 08:52:13.893683 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 4 08:52:13.893693 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 4 08:52:13.893704 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 4 08:52:13.893714 kernel: fuse: init (API version 7.41)
Mar 4 08:52:13.893725 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 4 08:52:13.893734 kernel: loop: module loaded
Mar 4 08:52:13.893744 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 4 08:52:13.893754 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 4 08:52:13.893764 kernel: ACPI: bus type drm_connector registered
Mar 4 08:52:13.893774 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 4 08:52:13.893785 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 4 08:52:13.893799 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 4 08:52:13.893810 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 4 08:52:13.893820 systemd[1]: Stopped verity-setup.service.
Mar 4 08:52:13.893830 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 4 08:52:13.893868 systemd-journald[1246]: Collecting audit messages is disabled.
Mar 4 08:52:13.893894 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 4 08:52:13.893904 systemd[1]: Mounted media.mount - External Media Directory.
Mar 4 08:52:13.893915 systemd-journald[1246]: Journal started
Mar 4 08:52:13.893936 systemd-journald[1246]: Runtime Journal (/run/log/journal/b76944c6b89e42cea806ca42d9fecc5f) is 8M, max 319.5M, 311.5M free.
Mar 4 08:52:13.673232 systemd[1]: Queued start job for default target multi-user.target.
Mar 4 08:52:13.699234 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 4 08:52:13.699685 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 4 08:52:13.896535 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 4 08:52:13.897204 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 4 08:52:13.898215 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 4 08:52:13.899215 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 4 08:52:13.902194 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 4 08:52:13.903413 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 08:52:13.904728 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 4 08:52:13.904899 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 4 08:52:13.906153 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 4 08:52:13.906326 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 4 08:52:13.907479 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 4 08:52:13.907664 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 4 08:52:13.908935 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 4 08:52:13.909087 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 4 08:52:13.910408 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 4 08:52:13.910564 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 4 08:52:13.913546 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 4 08:52:13.913727 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 4 08:52:13.914943 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 4 08:52:13.916402 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 4 08:52:13.917724 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 4 08:52:13.919049 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 4 08:52:13.931015 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 4 08:52:13.933216 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 4 08:52:13.935218 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 4 08:52:13.936163 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 4 08:52:13.936206 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 4 08:52:13.937860 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 4 08:52:13.948341 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 4 08:52:13.949319 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 08:52:13.950859 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 4 08:52:13.953338 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 4 08:52:13.954296 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 4 08:52:13.956499 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 4 08:52:13.957610 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 4 08:52:13.958812 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 4 08:52:13.962302 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 4 08:52:13.964974 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 4 08:52:13.969634 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 08:52:13.972716 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 4 08:52:13.974738 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 4 08:52:13.982978 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 4 08:52:13.984680 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 4 08:52:13.985141 systemd-journald[1246]: Time spent on flushing to /var/log/journal/b76944c6b89e42cea806ca42d9fecc5f is 21.558ms for 1730 entries.
Mar 4 08:52:13.985141 systemd-journald[1246]: System Journal (/var/log/journal/b76944c6b89e42cea806ca42d9fecc5f) is 8M, max 584.8M, 576.8M free.
Mar 4 08:52:14.016065 systemd-journald[1246]: Received client request to flush runtime journal.
Mar 4 08:52:14.016113 kernel: loop0: detected capacity change from 0 to 119840
Mar 4 08:52:14.016128 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 4 08:52:13.988509 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 4 08:52:14.001680 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 4 08:52:14.019244 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 4 08:52:14.030202 kernel: loop1: detected capacity change from 0 to 1632
Mar 4 08:52:14.036859 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 4 08:52:14.040314 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 4 08:52:14.057335 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 4 08:52:14.070336 kernel: loop2: detected capacity change from 0 to 100632
Mar 4 08:52:14.126619 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Mar 4 08:52:14.126637 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Mar 4 08:52:14.131198 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 08:52:14.206236 kernel: loop3: detected capacity change from 0 to 209336
Mar 4 08:52:14.323212 kernel: loop4: detected capacity change from 0 to 119840
Mar 4 08:52:14.397880 kernel: loop5: detected capacity change from 0 to 1632
Mar 4 08:52:14.443205 kernel: loop6: detected capacity change from 0 to 100632
Mar 4 08:52:14.478853 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 4 08:52:14.482975 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 08:52:14.508214 kernel: loop7: detected capacity change from 0 to 209336
Mar 4 08:52:14.514918 systemd-udevd[1322]: Using default interface naming scheme 'v255'.
Mar 4 08:52:14.620222 (sd-merge)[1320]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'.
Mar 4 08:52:14.620676 (sd-merge)[1320]: Merged extensions into '/usr'.
Mar 4 08:52:14.634581 systemd[1]: Reload requested from client PID 1294 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 4 08:52:14.634604 systemd[1]: Reloading...
Mar 4 08:52:14.684347 zram_generator::config[1357]: No configuration found.
Mar 4 08:52:14.792220 kernel: mousedev: PS/2 mouse device common for all mice
Mar 4 08:52:14.909350 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 4 08:52:14.909608 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 4 08:52:14.909996 systemd[1]: Reloading finished in 275 ms.
Mar 4 08:52:14.927897 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 08:52:14.931970 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Mar 4 08:52:14.932052 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 4 08:52:14.932068 kernel: [drm] features: -context_init
Mar 4 08:52:14.932082 kernel: [drm] number of scanouts: 1
Mar 4 08:52:14.932605 kernel: [drm] number of cap sets: 0
Mar 4 08:52:14.933628 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 4 08:52:14.938212 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Mar 4 08:52:14.943239 kernel: Console: switching to colour frame buffer device 160x50
Mar 4 08:52:14.955196 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 4 08:52:14.959703 systemd[1]: Starting ensure-sysext.service...
Mar 4 08:52:14.961978 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 4 08:52:14.964087 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 4 08:52:14.977581 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 4 08:52:14.981501 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 08:52:14.981629 systemd-tmpfiles[1442]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 4 08:52:14.981665 systemd-tmpfiles[1442]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 4 08:52:14.981886 systemd-tmpfiles[1442]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 4 08:52:14.982086 systemd-tmpfiles[1442]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 4 08:52:14.982891 systemd-tmpfiles[1442]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 4 08:52:14.983102 systemd-tmpfiles[1442]: ACLs are not supported, ignoring.
Mar 4 08:52:14.983159 systemd-tmpfiles[1442]: ACLs are not supported, ignoring.
Mar 4 08:52:14.986040 systemd[1]: Reload requested from client PID 1440 ('systemctl') (unit ensure-sysext.service)...
Mar 4 08:52:14.986053 systemd[1]: Reloading...
Mar 4 08:52:14.992020 systemd-tmpfiles[1442]: Detected autofs mount point /boot during canonicalization of boot.
Mar 4 08:52:14.992033 systemd-tmpfiles[1442]: Skipping /boot
Mar 4 08:52:14.999113 systemd-tmpfiles[1442]: Detected autofs mount point /boot during canonicalization of boot.
Mar 4 08:52:14.999270 systemd-tmpfiles[1442]: Skipping /boot
Mar 4 08:52:15.043239 zram_generator::config[1476]: No configuration found.
Mar 4 08:52:15.170027 systemd-networkd[1441]: lo: Link UP
Mar 4 08:52:15.170356 systemd-networkd[1441]: lo: Gained carrier
Mar 4 08:52:15.171457 systemd-networkd[1441]: Enumeration completed
Mar 4 08:52:15.172025 systemd-networkd[1441]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 08:52:15.172106 systemd-networkd[1441]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 4 08:52:15.172754 systemd-networkd[1441]: eth0: Link UP
Mar 4 08:52:15.172957 systemd-networkd[1441]: eth0: Gained carrier
Mar 4 08:52:15.173036 systemd-networkd[1441]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 08:52:15.194281 systemd-networkd[1441]: eth0: DHCPv4 address 10.0.9.143/25, gateway 10.0.9.129 acquired from 10.0.9.129
Mar 4 08:52:15.208913 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 4 08:52:15.210496 systemd[1]: Reloading finished in 224 ms.
Mar 4 08:52:15.241227 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 4 08:52:15.242763 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 4 08:52:15.257783 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 08:52:15.278842 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 4 08:52:15.282126 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 4 08:52:15.283294 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 08:52:15.298158 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 4 08:52:15.300245 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 4 08:52:15.302367 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 4 08:52:15.303354 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 08:52:15.304395 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 4 08:52:15.305478 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 4 08:52:15.308402 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 4 08:52:15.310752 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 4 08:52:15.313508 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 4 08:52:15.316323 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 4 08:52:15.320750 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 4 08:52:15.325620 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 08:52:15.327252 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 4 08:52:15.327413 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 4 08:52:15.328849 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 4 08:52:15.328992 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 4 08:52:15.330560 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 4 08:52:15.340897 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 4 08:52:15.342690 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 4 08:52:15.351887 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 08:52:15.356241 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 4 08:52:15.358302 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 4 08:52:15.360445 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 4 08:52:15.362939 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 4 08:52:15.365875 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Mar 4 08:52:15.367232 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 08:52:15.367396 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 4 08:52:15.367644 systemd[1]: Reached target time-set.target - System Time Set.
Mar 4 08:52:15.373627 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 4 08:52:15.375416 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 4 08:52:15.375570 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 4 08:52:15.377467 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 4 08:52:15.377611 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 4 08:52:15.379015 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 4 08:52:15.379200 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 4 08:52:15.381741 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 4 08:52:15.383475 systemd[1]: Finished ensure-sysext.service.
Mar 4 08:52:15.384469 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 4 08:52:15.384627 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 4 08:52:15.390598 augenrules[1571]: No rules
Mar 4 08:52:15.392563 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 4 08:52:15.392622 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 4 08:52:15.393431 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 4 08:52:15.393650 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 4 08:52:15.394100 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 4 08:52:15.394148 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 4 08:52:15.397398 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 4 08:52:15.408251 kernel: PTP clock support registered
Mar 4 08:52:15.411362 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Mar 4 08:52:15.411754 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Mar 4 08:52:15.426793 systemd-resolved[1532]: Positive Trust Anchors:
Mar 4 08:52:15.427128 systemd-resolved[1532]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 4 08:52:15.427190 systemd-resolved[1532]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 4 08:52:15.435298 systemd-resolved[1532]: Using system hostname 'ci-4459-2-4-2-039fb286b9'.
Mar 4 08:52:15.437334 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 4 08:52:15.438362 systemd[1]: Reached target network.target - Network.
Mar 4 08:52:15.439087 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 4 08:52:15.703806 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 4 08:52:15.705247 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 4 08:52:15.739979 ldconfig[1289]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 4 08:52:15.745264 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 4 08:52:15.747619 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 4 08:52:15.766264 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 4 08:52:15.767459 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 4 08:52:15.768478 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 4 08:52:15.769590 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 4 08:52:15.770910 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 4 08:52:15.771979 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 4 08:52:15.773111 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 4 08:52:15.774670 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 4 08:52:15.774709 systemd[1]: Reached target paths.target - Path Units.
Mar 4 08:52:15.775533 systemd[1]: Reached target timers.target - Timer Units.
Mar 4 08:52:15.777421 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 4 08:52:15.780890 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 4 08:52:15.783769 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 4 08:52:15.785032 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 4 08:52:15.786199 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 4 08:52:15.789235 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 4 08:52:15.790436 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 4 08:52:15.791998 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 4 08:52:15.793080 systemd[1]: Reached target sockets.target - Socket Units.
Mar 4 08:52:15.793978 systemd[1]: Reached target basic.target - Basic System.
Mar 4 08:52:15.794861 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 4 08:52:15.794893 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 4 08:52:15.797128 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 4 08:52:15.798741 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 4 08:52:15.800795 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 4 08:52:15.811319 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 4 08:52:15.812213 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:15.813276 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 4 08:52:15.816679 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 4 08:52:15.818652 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 4 08:52:15.819945 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 4 08:52:15.821060 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 4 08:52:15.827285 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 4 08:52:15.830479 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 4 08:52:15.831013 jq[1594]: false
Mar 4 08:52:15.833111 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 4 08:52:15.841358 extend-filesystems[1595]: Found /dev/vda6
Mar 4 08:52:15.841375 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 4 08:52:15.843115 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 4 08:52:15.843683 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 4 08:52:15.844303 systemd[1]: Starting update-engine.service - Update Engine...
Mar 4 08:52:15.846084 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 4 08:52:15.849787 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 4 08:52:15.852639 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 4 08:52:15.852829 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 4 08:52:15.853251 jq[1613]: true
Mar 4 08:52:15.853064 systemd[1]: motdgen.service: Deactivated successfully.
Mar 4 08:52:15.853242 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 4 08:52:15.856577 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 4 08:52:15.858209 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 4 08:52:15.865070 (ntainerd)[1618]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 4 08:52:15.868606 jq[1617]: true
Mar 4 08:52:15.873413 chronyd[1587]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Mar 4 08:52:15.874650 chronyd[1587]: Loaded seccomp filter (level 2)
Mar 4 08:52:15.874757 systemd[1]: Started chronyd.service - NTP client/server.
Mar 4 08:52:15.877641 extend-filesystems[1595]: Found /dev/vda9
Mar 4 08:52:15.880603 extend-filesystems[1595]: Checking size of /dev/vda9
Mar 4 08:52:15.900156 extend-filesystems[1595]: Resized partition /dev/vda9
Mar 4 08:52:15.902698 tar[1615]: linux-arm64/LICENSE
Mar 4 08:52:15.902698 tar[1615]: linux-arm64/helm
Mar 4 08:52:15.908958 extend-filesystems[1650]: resize2fs 1.47.3 (8-Jul-2025)
Mar 4 08:52:15.914772 update_engine[1610]: I20260304 08:52:15.913344 1610 main.cc:92] Flatcar Update Engine starting
Mar 4 08:52:15.916588 systemd-logind[1606]: New seat seat0.
Mar 4 08:52:15.918203 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks
Mar 4 08:52:15.930649 dbus-daemon[1590]: [system] SELinux support is enabled
Mar 4 08:52:15.964457 update_engine[1610]: I20260304 08:52:15.936472 1610 update_check_scheduler.cc:74] Next update check in 3m39s
Mar 4 08:52:15.931775 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 4 08:52:15.938102 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 4 08:52:15.938125 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 4 08:52:15.941270 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 4 08:52:15.941290 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 4 08:52:15.945998 systemd[1]: Started update-engine.service - Update Engine.
Mar 4 08:52:15.949953 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 4 08:52:15.964653 systemd-logind[1606]: Watching system buttons on /dev/input/event0 (Power Button)
Mar 4 08:52:15.964669 systemd-logind[1606]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Mar 4 08:52:15.964895 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 4 08:52:16.014570 locksmithd[1654]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 4 08:52:16.089142 containerd[1618]: time="2026-03-04T08:52:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 4 08:52:16.273462 containerd[1618]: time="2026-03-04T08:52:16.273079360Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 4 08:52:16.284336 containerd[1618]: time="2026-03-04T08:52:16.284264720Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.76µs"
Mar 4 08:52:16.284336 containerd[1618]: time="2026-03-04T08:52:16.284304760Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 4 08:52:16.284336 containerd[1618]: time="2026-03-04T08:52:16.284324520Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 4 08:52:16.298597 containerd[1618]: time="2026-03-04T08:52:16.298539960Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 4 08:52:16.298674 containerd[1618]: time="2026-03-04T08:52:16.298609720Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 4 08:52:16.298696 containerd[1618]: time="2026-03-04T08:52:16.298676320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299097 containerd[1618]: time="2026-03-04T08:52:16.299067200Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299097 containerd[1618]: time="2026-03-04T08:52:16.299089920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299448 containerd[1618]: time="2026-03-04T08:52:16.299372240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299448 containerd[1618]: time="2026-03-04T08:52:16.299393040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299448 containerd[1618]: time="2026-03-04T08:52:16.299406440Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299448 containerd[1618]: time="2026-03-04T08:52:16.299414680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 4 08:52:16.299664 containerd[1618]: time="2026-03-04T08:52:16.299494080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.299706400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.299745400Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.299756000Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.299790240Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.299998120Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 4 08:52:16.300230 containerd[1618]: time="2026-03-04T08:52:16.300054320Z" level=info msg="metadata content store policy set" policy=shared
Mar 4 08:52:16.300379 bash[1651]: Updated "/home/core/.ssh/authorized_keys"
Mar 4 08:52:16.305251 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 4 08:52:16.308385 systemd[1]: Starting sshkeys.service...
Mar 4 08:52:16.330296 systemd-networkd[1441]: eth0: Gained IPv6LL
Mar 4 08:52:16.331041 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 4 08:52:16.335451 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 4 08:52:16.337544 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 4 08:52:16.343227 systemd[1]: Reached target network-online.target - Network is Online.
Mar 4 08:52:16.349446 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:52:16.354245 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 4 08:52:16.354970 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 4 08:52:16.377035 containerd[1618]: time="2026-03-04T08:52:16.376975080Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 4 08:52:16.377120 containerd[1618]: time="2026-03-04T08:52:16.377060400Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 4 08:52:16.377120 containerd[1618]: time="2026-03-04T08:52:16.377075760Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 4 08:52:16.377120 containerd[1618]: time="2026-03-04T08:52:16.377088160Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 4 08:52:16.377120 containerd[1618]: time="2026-03-04T08:52:16.377100240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 4 08:52:16.377120 containerd[1618]: time="2026-03-04T08:52:16.377112760Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377132720Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377146560Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377160920Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377198120Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377208280Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 4 08:52:16.377254 containerd[1618]: time="2026-03-04T08:52:16.377221080Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 4 08:52:16.377374 containerd[1618]: time="2026-03-04T08:52:16.377353560Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 4 08:52:16.377402 containerd[1618]: time="2026-03-04T08:52:16.377393680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 4 08:52:16.377449 containerd[1618]: time="2026-03-04T08:52:16.377412520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 4 08:52:16.377449 containerd[1618]: time="2026-03-04T08:52:16.377423240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 4 08:52:16.377449 containerd[1618]: time="2026-03-04T08:52:16.377433440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 4 08:52:16.377449 containerd[1618]: time="2026-03-04T08:52:16.377448200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 4 08:52:16.380226 containerd[1618]: time="2026-03-04T08:52:16.380189440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 4 08:52:16.380226 containerd[1618]: time="2026-03-04T08:52:16.380224800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 4 08:52:16.380320 containerd[1618]: time="2026-03-04T08:52:16.380238760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 4 08:52:16.380320 containerd[1618]: time="2026-03-04T08:52:16.380249720Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 4 08:52:16.380320 containerd[1618]: time="2026-03-04T08:52:16.380260200Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 4 08:52:16.380533 containerd[1618]: time="2026-03-04T08:52:16.380515840Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 4 08:52:16.380736 containerd[1618]: time="2026-03-04T08:52:16.380662320Z" level=info msg="Start snapshots syncer"
Mar 4 08:52:16.380736 containerd[1618]: time="2026-03-04T08:52:16.380698040Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 4 08:52:16.382137 containerd[1618]: time="2026-03-04T08:52:16.381679680Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 4 08:52:16.382137 containerd[1618]: time="2026-03-04T08:52:16.381760800Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.381834800Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.381970000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.381998040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382012680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382142560Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382179640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382194160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382210320Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382244000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382260840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382285520Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 4 08:52:16.382323 containerd[1618]: time="2026-03-04T08:52:16.382325600Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382341360Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382354080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382367120Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382378080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382388640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382407200Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382511040Z" level=info msg="runtime interface created"
Mar 4 08:52:16.382517 containerd[1618]: time="2026-03-04T08:52:16.382516960Z" level=info msg="created NRI interface"
Mar 4 08:52:16.382646 containerd[1618]: time="2026-03-04T08:52:16.382526200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 4 08:52:16.382646 containerd[1618]: time="2026-03-04T08:52:16.382541720Z" level=info msg="Connect containerd service"
Mar 4 08:52:16.382646 containerd[1618]: time="2026-03-04T08:52:16.382572360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 4 08:52:16.385655 containerd[1618]: time="2026-03-04T08:52:16.385431440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 4 08:52:16.399284 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 4 08:52:16.510779 containerd[1618]: time="2026-03-04T08:52:16.510520320Z" level=info msg="Start subscribing containerd event"
Mar 4 08:52:16.510985 containerd[1618]: time="2026-03-04T08:52:16.510810440Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 4 08:52:16.510985 containerd[1618]: time="2026-03-04T08:52:16.510857440Z" level=info msg=serving...
address=/run/containerd/containerd.sock Mar 4 08:52:16.510985 containerd[1618]: time="2026-03-04T08:52:16.510878800Z" level=info msg="Start recovering state" Mar 4 08:52:16.511290 containerd[1618]: time="2026-03-04T08:52:16.511191120Z" level=info msg="Start event monitor" Mar 4 08:52:16.511490 containerd[1618]: time="2026-03-04T08:52:16.511432840Z" level=info msg="Start cni network conf syncer for default" Mar 4 08:52:16.511490 containerd[1618]: time="2026-03-04T08:52:16.511448400Z" level=info msg="Start streaming server" Mar 4 08:52:16.511490 containerd[1618]: time="2026-03-04T08:52:16.511457920Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 4 08:52:16.511490 containerd[1618]: time="2026-03-04T08:52:16.511464920Z" level=info msg="runtime interface starting up..." Mar 4 08:52:16.511490 containerd[1618]: time="2026-03-04T08:52:16.511470880Z" level=info msg="starting plugins..." Mar 4 08:52:16.513571 containerd[1618]: time="2026-03-04T08:52:16.513315480Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 4 08:52:16.513571 containerd[1618]: time="2026-03-04T08:52:16.513551480Z" level=info msg="containerd successfully booted in 0.424761s" Mar 4 08:52:16.513720 systemd[1]: Started containerd.service - containerd container runtime. Mar 4 08:52:16.616214 tar[1615]: linux-arm64/README.md Mar 4 08:52:16.643927 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 4 08:52:16.654073 sshd_keygen[1630]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 4 08:52:16.675268 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 4 08:52:16.678643 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 4 08:52:16.696462 systemd[1]: issuegen.service: Deactivated successfully. Mar 4 08:52:16.696655 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 4 08:52:16.701289 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 4 08:52:16.721741 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 4 08:52:16.724570 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 4 08:52:16.726791 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 4 08:52:16.728057 systemd[1]: Reached target getty.target - Login Prompts. Mar 4 08:52:16.820238 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:17.189226 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Mar 4 08:52:17.204447 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 4 08:52:17.206565 systemd[1]: Started sshd@0-10.0.9.143:22-20.161.92.111:36030.service - OpenSSH per-connection server daemon (20.161.92.111:36030). Mar 4 08:52:17.359184 extend-filesystems[1650]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 4 08:52:17.359184 extend-filesystems[1650]: old_desc_blocks = 1, new_desc_blocks = 6 Mar 4 08:52:17.359184 extend-filesystems[1650]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Mar 4 08:52:17.363014 extend-filesystems[1595]: Resized filesystem in /dev/vda9 Mar 4 08:52:17.363991 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:17.362148 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 4 08:52:17.363966 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 4 08:52:17.730433 sshd[1720]: Accepted publickey for core from 20.161.92.111 port 36030 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:17.732362 sshd-session[1720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:17.739468 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 4 08:52:17.741695 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 4 08:52:17.749587 systemd-logind[1606]: New session 1 of user core. 
Mar 4 08:52:17.766237 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 4 08:52:17.771282 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 4 08:52:17.787986 (systemd)[1728]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 4 08:52:17.791466 systemd-logind[1606]: New session c1 of user core. Mar 4 08:52:17.859696 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 08:52:17.869865 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 08:52:17.918658 systemd[1728]: Queued start job for default target default.target. Mar 4 08:52:17.936290 systemd[1728]: Created slice app.slice - User Application Slice. Mar 4 08:52:17.936447 systemd[1728]: Reached target paths.target - Paths. Mar 4 08:52:17.936612 systemd[1728]: Reached target timers.target - Timers. Mar 4 08:52:17.937886 systemd[1728]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 4 08:52:17.948503 systemd[1728]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 4 08:52:17.948598 systemd[1728]: Reached target sockets.target - Sockets. Mar 4 08:52:17.948632 systemd[1728]: Reached target basic.target - Basic System. Mar 4 08:52:17.948657 systemd[1728]: Reached target default.target - Main User Target. Mar 4 08:52:17.948681 systemd[1728]: Startup finished in 151ms. Mar 4 08:52:17.948836 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 4 08:52:17.951267 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 4 08:52:18.249585 systemd[1]: Started sshd@1-10.0.9.143:22-20.161.92.111:36046.service - OpenSSH per-connection server daemon (20.161.92.111:36046). 
Mar 4 08:52:18.396679 kubelet[1739]: E0304 08:52:18.396140 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 08:52:18.399259 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 08:52:18.399401 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 4 08:52:18.401462 systemd[1]: kubelet.service: Consumed 761ms CPU time, 257.6M memory peak. Mar 4 08:52:18.757148 sshd[1751]: Accepted publickey for core from 20.161.92.111 port 36046 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:18.757894 sshd-session[1751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:18.761848 systemd-logind[1606]: New session 2 of user core. Mar 4 08:52:18.772383 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 4 08:52:18.835245 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:19.046631 sshd[1756]: Connection closed by 20.161.92.111 port 36046 Mar 4 08:52:19.047374 sshd-session[1751]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:19.051082 systemd[1]: sshd@1-10.0.9.143:22-20.161.92.111:36046.service: Deactivated successfully. Mar 4 08:52:19.052763 systemd[1]: session-2.scope: Deactivated successfully. Mar 4 08:52:19.054086 systemd-logind[1606]: Session 2 logged out. Waiting for processes to exit. Mar 4 08:52:19.056402 systemd-logind[1606]: Removed session 2. Mar 4 08:52:19.155322 systemd[1]: Started sshd@2-10.0.9.143:22-20.161.92.111:36060.service - OpenSSH per-connection server daemon (20.161.92.111:36060). 
Mar 4 08:52:19.373240 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:19.671320 sshd[1763]: Accepted publickey for core from 20.161.92.111 port 36060 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:19.672586 sshd-session[1763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:19.677516 systemd-logind[1606]: New session 3 of user core. Mar 4 08:52:19.688574 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 4 08:52:19.957716 sshd[1767]: Connection closed by 20.161.92.111 port 36060 Mar 4 08:52:19.958530 sshd-session[1763]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:19.962608 systemd[1]: sshd@2-10.0.9.143:22-20.161.92.111:36060.service: Deactivated successfully. Mar 4 08:52:19.964162 systemd[1]: session-3.scope: Deactivated successfully. Mar 4 08:52:19.965482 systemd-logind[1606]: Session 3 logged out. Waiting for processes to exit. Mar 4 08:52:19.966917 systemd-logind[1606]: Removed session 3. 
Mar 4 08:52:22.842210 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:22.854557 coreos-metadata[1589]: Mar 04 08:52:22.854 WARN failed to locate config-drive, using the metadata service API instead Mar 4 08:52:22.870619 coreos-metadata[1589]: Mar 04 08:52:22.870 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 4 08:52:23.381197 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 4 08:52:23.385952 coreos-metadata[1670]: Mar 04 08:52:23.385 WARN failed to locate config-drive, using the metadata service API instead Mar 4 08:52:23.398366 coreos-metadata[1670]: Mar 04 08:52:23.398 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 4 08:52:27.860810 coreos-metadata[1589]: Mar 04 08:52:27.860 INFO Fetch successful Mar 4 08:52:27.861234 coreos-metadata[1589]: Mar 04 08:52:27.861 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 4 08:52:28.433939 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 4 08:52:28.435650 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 08:52:29.217856 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 08:52:29.221607 (kubelet)[1788]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 08:52:29.570063 kubelet[1788]: E0304 08:52:29.569925 1788 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 08:52:29.573513 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 08:52:29.573653 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 4 08:52:29.575256 systemd[1]: kubelet.service: Consumed 458ms CPU time, 105.7M memory peak. Mar 4 08:52:29.956041 coreos-metadata[1670]: Mar 04 08:52:29.955 INFO Fetch successful Mar 4 08:52:29.956041 coreos-metadata[1670]: Mar 04 08:52:29.955 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 4 08:52:30.061844 systemd[1]: Started sshd@3-10.0.9.143:22-20.161.92.111:45676.service - OpenSSH per-connection server daemon (20.161.92.111:45676). Mar 4 08:52:30.565820 sshd[1798]: Accepted publickey for core from 20.161.92.111 port 45676 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:30.567027 sshd-session[1798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:30.570804 systemd-logind[1606]: New session 4 of user core. Mar 4 08:52:30.586761 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 4 08:52:30.652044 coreos-metadata[1589]: Mar 04 08:52:30.651 INFO Fetch successful Mar 4 08:52:30.652044 coreos-metadata[1589]: Mar 04 08:52:30.651 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 4 08:52:30.851311 sshd[1801]: Connection closed by 20.161.92.111 port 45676 Mar 4 08:52:30.852306 sshd-session[1798]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:30.855758 systemd[1]: sshd@3-10.0.9.143:22-20.161.92.111:45676.service: Deactivated successfully. Mar 4 08:52:30.857387 systemd[1]: session-4.scope: Deactivated successfully. Mar 4 08:52:30.858234 systemd-logind[1606]: Session 4 logged out. Waiting for processes to exit. Mar 4 08:52:30.859453 systemd-logind[1606]: Removed session 4. 
Mar 4 08:52:30.901753 coreos-metadata[1670]: Mar 04 08:52:30.901 INFO Fetch successful Mar 4 08:52:30.903868 unknown[1670]: wrote ssh authorized keys file for user: core Mar 4 08:52:30.930441 update-ssh-keys[1807]: Updated "/home/core/.ssh/authorized_keys" Mar 4 08:52:30.931398 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 4 08:52:30.932758 systemd[1]: Finished sshkeys.service. Mar 4 08:52:30.956575 systemd[1]: Started sshd@4-10.0.9.143:22-20.161.92.111:41808.service - OpenSSH per-connection server daemon (20.161.92.111:41808). Mar 4 08:52:31.460746 sshd[1811]: Accepted publickey for core from 20.161.92.111 port 41808 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:31.461924 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:31.466860 systemd-logind[1606]: New session 5 of user core. Mar 4 08:52:31.477422 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 4 08:52:31.746417 sshd[1814]: Connection closed by 20.161.92.111 port 41808 Mar 4 08:52:31.747128 sshd-session[1811]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:31.750357 systemd[1]: sshd@4-10.0.9.143:22-20.161.92.111:41808.service: Deactivated successfully. Mar 4 08:52:31.752030 systemd[1]: session-5.scope: Deactivated successfully. Mar 4 08:52:31.752830 systemd-logind[1606]: Session 5 logged out. Waiting for processes to exit. Mar 4 08:52:31.753957 systemd-logind[1606]: Removed session 5. 
Mar 4 08:52:32.359049 coreos-metadata[1589]: Mar 04 08:52:32.358 INFO Fetch successful Mar 4 08:52:32.359049 coreos-metadata[1589]: Mar 04 08:52:32.359 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 4 08:52:33.804136 coreos-metadata[1589]: Mar 04 08:52:33.804 INFO Fetch successful Mar 4 08:52:33.804136 coreos-metadata[1589]: Mar 04 08:52:33.804 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 4 08:52:35.977542 coreos-metadata[1589]: Mar 04 08:52:35.977 INFO Fetch successful Mar 4 08:52:35.977542 coreos-metadata[1589]: Mar 04 08:52:35.977 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 4 08:52:36.772151 coreos-metadata[1589]: Mar 04 08:52:36.772 INFO Fetch successful Mar 4 08:52:36.805258 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 4 08:52:36.805873 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 4 08:52:36.806001 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 4 08:52:36.812278 systemd[1]: Startup finished in 3.331s (kernel) + 15.552s (initrd) + 23.708s (userspace) = 42.593s. Mar 4 08:52:39.662008 chronyd[1587]: Selected source PHC0 Mar 4 08:52:39.683400 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 4 08:52:39.684783 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 08:52:39.824438 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 4 08:52:39.828235 (kubelet)[1832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 08:52:40.591300 kubelet[1832]: E0304 08:52:40.591243 1832 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 08:52:40.593722 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 08:52:40.593846 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 4 08:52:40.594142 systemd[1]: kubelet.service: Consumed 144ms CPU time, 110.1M memory peak. Mar 4 08:52:41.800321 systemd[1]: Started sshd@5-10.0.9.143:22-20.161.92.111:60684.service - OpenSSH per-connection server daemon (20.161.92.111:60684). Mar 4 08:52:42.304136 sshd[1841]: Accepted publickey for core from 20.161.92.111 port 60684 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:42.305411 sshd-session[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:42.309729 systemd-logind[1606]: New session 6 of user core. Mar 4 08:52:42.328434 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 4 08:52:42.589055 sshd[1844]: Connection closed by 20.161.92.111 port 60684 Mar 4 08:52:42.589438 sshd-session[1841]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:42.592444 systemd[1]: sshd@5-10.0.9.143:22-20.161.92.111:60684.service: Deactivated successfully. Mar 4 08:52:42.594358 systemd[1]: session-6.scope: Deactivated successfully. Mar 4 08:52:42.595578 systemd-logind[1606]: Session 6 logged out. Waiting for processes to exit. Mar 4 08:52:42.596922 systemd-logind[1606]: Removed session 6. 
Mar 4 08:52:42.692230 systemd[1]: Started sshd@6-10.0.9.143:22-20.161.92.111:60688.service - OpenSSH per-connection server daemon (20.161.92.111:60688). Mar 4 08:52:43.195963 sshd[1850]: Accepted publickey for core from 20.161.92.111 port 60688 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:43.197322 sshd-session[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:43.200941 systemd-logind[1606]: New session 7 of user core. Mar 4 08:52:43.207787 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 4 08:52:43.477467 sshd[1853]: Connection closed by 20.161.92.111 port 60688 Mar 4 08:52:43.477873 sshd-session[1850]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:43.481288 systemd[1]: sshd@6-10.0.9.143:22-20.161.92.111:60688.service: Deactivated successfully. Mar 4 08:52:43.483935 systemd[1]: session-7.scope: Deactivated successfully. Mar 4 08:52:43.486263 systemd-logind[1606]: Session 7 logged out. Waiting for processes to exit. Mar 4 08:52:43.487262 systemd-logind[1606]: Removed session 7. Mar 4 08:52:43.584458 systemd[1]: Started sshd@7-10.0.9.143:22-20.161.92.111:60704.service - OpenSSH per-connection server daemon (20.161.92.111:60704). Mar 4 08:52:44.107069 sshd[1859]: Accepted publickey for core from 20.161.92.111 port 60704 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:44.108429 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:44.111936 systemd-logind[1606]: New session 8 of user core. Mar 4 08:52:44.119500 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 4 08:52:44.392283 sshd[1862]: Connection closed by 20.161.92.111 port 60704 Mar 4 08:52:44.392518 sshd-session[1859]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:44.395900 systemd[1]: sshd@7-10.0.9.143:22-20.161.92.111:60704.service: Deactivated successfully. 
Mar 4 08:52:44.397505 systemd[1]: session-8.scope: Deactivated successfully. Mar 4 08:52:44.400070 systemd-logind[1606]: Session 8 logged out. Waiting for processes to exit. Mar 4 08:52:44.401018 systemd-logind[1606]: Removed session 8. Mar 4 08:52:44.499332 systemd[1]: Started sshd@8-10.0.9.143:22-20.161.92.111:60710.service - OpenSSH per-connection server daemon (20.161.92.111:60710). Mar 4 08:52:45.017915 sshd[1868]: Accepted publickey for core from 20.161.92.111 port 60710 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:45.019187 sshd-session[1868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:45.023015 systemd-logind[1606]: New session 9 of user core. Mar 4 08:52:45.039405 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 4 08:52:45.228887 sudo[1872]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 4 08:52:45.229194 sudo[1872]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 08:52:45.257365 sudo[1872]: pam_unix(sudo:session): session closed for user root Mar 4 08:52:45.353018 sshd[1871]: Connection closed by 20.161.92.111 port 60710 Mar 4 08:52:45.352049 sshd-session[1868]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:45.356240 systemd[1]: sshd@8-10.0.9.143:22-20.161.92.111:60710.service: Deactivated successfully. Mar 4 08:52:45.359592 systemd[1]: session-9.scope: Deactivated successfully. Mar 4 08:52:45.360300 systemd-logind[1606]: Session 9 logged out. Waiting for processes to exit. Mar 4 08:52:45.361490 systemd-logind[1606]: Removed session 9. Mar 4 08:52:45.458363 systemd[1]: Started sshd@9-10.0.9.143:22-20.161.92.111:60726.service - OpenSSH per-connection server daemon (20.161.92.111:60726). 
Mar 4 08:52:45.979224 sshd[1878]: Accepted publickey for core from 20.161.92.111 port 60726 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:45.980434 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:45.984122 systemd-logind[1606]: New session 10 of user core. Mar 4 08:52:45.993488 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 4 08:52:46.171139 sudo[1883]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 4 08:52:46.171417 sudo[1883]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 08:52:46.186931 sudo[1883]: pam_unix(sudo:session): session closed for user root Mar 4 08:52:46.191721 sudo[1882]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 4 08:52:46.191974 sudo[1882]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 08:52:46.200892 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 4 08:52:46.236985 augenrules[1905]: No rules Mar 4 08:52:46.238233 systemd[1]: audit-rules.service: Deactivated successfully. Mar 4 08:52:46.238438 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 4 08:52:46.240370 sudo[1882]: pam_unix(sudo:session): session closed for user root Mar 4 08:52:46.334048 sshd[1881]: Connection closed by 20.161.92.111 port 60726 Mar 4 08:52:46.334533 sshd-session[1878]: pam_unix(sshd:session): session closed for user core Mar 4 08:52:46.338321 systemd[1]: sshd@9-10.0.9.143:22-20.161.92.111:60726.service: Deactivated successfully. Mar 4 08:52:46.339777 systemd[1]: session-10.scope: Deactivated successfully. Mar 4 08:52:46.340416 systemd-logind[1606]: Session 10 logged out. Waiting for processes to exit. Mar 4 08:52:46.341735 systemd-logind[1606]: Removed session 10. 
Mar 4 08:52:46.442429 systemd[1]: Started sshd@10-10.0.9.143:22-20.161.92.111:60734.service - OpenSSH per-connection server daemon (20.161.92.111:60734). Mar 4 08:52:46.961670 sshd[1914]: Accepted publickey for core from 20.161.92.111 port 60734 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE Mar 4 08:52:46.962769 sshd-session[1914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 08:52:46.966795 systemd-logind[1606]: New session 11 of user core. Mar 4 08:52:46.979387 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 4 08:52:47.154111 sudo[1918]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 4 08:52:47.154391 sudo[1918]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 08:52:47.848603 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 4 08:52:47.868800 (dockerd)[1938]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 4 08:52:48.103715 dockerd[1938]: time="2026-03-04T08:52:48.103587351Z" level=info msg="Starting up" Mar 4 08:52:48.105797 dockerd[1938]: time="2026-03-04T08:52:48.105768603Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 4 08:52:48.116251 dockerd[1938]: time="2026-03-04T08:52:48.116217500Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 4 08:52:48.137966 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport4215125764-merged.mount: Deactivated successfully. Mar 4 08:52:48.151839 systemd[1]: var-lib-docker-metacopy\x2dcheck863214377-merged.mount: Deactivated successfully. Mar 4 08:52:48.163145 dockerd[1938]: time="2026-03-04T08:52:48.163094158Z" level=info msg="Loading containers: start." 
Mar 4 08:52:48.171198 kernel: Initializing XFRM netlink socket Mar 4 08:52:48.381460 systemd-networkd[1441]: docker0: Link UP Mar 4 08:52:48.386043 dockerd[1938]: time="2026-03-04T08:52:48.385959145Z" level=info msg="Loading containers: done." Mar 4 08:52:48.400902 dockerd[1938]: time="2026-03-04T08:52:48.400846347Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 4 08:52:48.401039 dockerd[1938]: time="2026-03-04T08:52:48.400939427Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 4 08:52:48.401039 dockerd[1938]: time="2026-03-04T08:52:48.401015788Z" level=info msg="Initializing buildkit" Mar 4 08:52:48.425437 dockerd[1938]: time="2026-03-04T08:52:48.425391722Z" level=info msg="Completed buildkit initialization" Mar 4 08:52:48.432235 dockerd[1938]: time="2026-03-04T08:52:48.432194639Z" level=info msg="Daemon has completed initialization" Mar 4 08:52:48.432405 dockerd[1938]: time="2026-03-04T08:52:48.432274640Z" level=info msg="API listen on /run/docker.sock" Mar 4 08:52:48.432406 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 4 08:52:49.054235 containerd[1618]: time="2026-03-04T08:52:49.053608974Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\"" Mar 4 08:52:49.729333 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1267344573.mount: Deactivated successfully. Mar 4 08:52:50.683403 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 4 08:52:50.684870 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 4 08:52:50.713152 containerd[1618]: time="2026-03-04T08:52:50.713074464Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:52:50.714095 containerd[1618]: time="2026-03-04T08:52:50.713907148Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=27390272" Mar 4 08:52:50.716951 containerd[1618]: time="2026-03-04T08:52:50.716881963Z" level=info msg="ImageCreate event name:\"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:52:50.720958 containerd[1618]: time="2026-03-04T08:52:50.720887783Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:52:50.722259 containerd[1618]: time="2026-03-04T08:52:50.722214310Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"27386773\" in 1.668565736s" Mar 4 08:52:50.722259 containerd[1618]: time="2026-03-04T08:52:50.722257550Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\"" Mar 4 08:52:50.722971 containerd[1618]: time="2026-03-04T08:52:50.722688592Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\"" Mar 4 08:52:50.810271 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 4 08:52:50.813865 (kubelet)[2222]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 4 08:52:50.844022 kubelet[2222]: E0304 08:52:50.843944 2222 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 4 08:52:50.846464 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 4 08:52:50.846600 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 4 08:52:50.848247 systemd[1]: kubelet.service: Consumed 135ms CPU time, 106.4M memory peak.
Mar 4 08:52:52.246268 containerd[1618]: time="2026-03-04T08:52:52.246180793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:52.247336 containerd[1618]: time="2026-03-04T08:52:52.247296558Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=23552126"
Mar 4 08:52:52.248437 containerd[1618]: time="2026-03-04T08:52:52.248407364Z" level=info msg="ImageCreate event name:\"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:52.251667 containerd[1618]: time="2026-03-04T08:52:52.251626740Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:52.253220 containerd[1618]: time="2026-03-04T08:52:52.253155028Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"25136510\" in 1.530431075s"
Mar 4 08:52:52.253270 containerd[1618]: time="2026-03-04T08:52:52.253221308Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\""
Mar 4 08:52:52.254687 containerd[1618]: time="2026-03-04T08:52:52.254537235Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 4 08:52:53.439117 containerd[1618]: time="2026-03-04T08:52:53.439039958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:53.440111 containerd[1618]: time="2026-03-04T08:52:53.440026563Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=18301325"
Mar 4 08:52:53.440969 containerd[1618]: time="2026-03-04T08:52:53.440921047Z" level=info msg="ImageCreate event name:\"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:53.443646 containerd[1618]: time="2026-03-04T08:52:53.443598501Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:53.445071 containerd[1618]: time="2026-03-04T08:52:53.445033748Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"19885727\" in 1.190455193s"
Mar 4 08:52:53.445113 containerd[1618]: time="2026-03-04T08:52:53.445069708Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\""
Mar 4 08:52:53.445603 containerd[1618]: time="2026-03-04T08:52:53.445476990Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 4 08:52:54.435888 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1364237204.mount: Deactivated successfully.
Mar 4 08:52:54.660410 containerd[1618]: time="2026-03-04T08:52:54.659937545Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:54.660724 containerd[1618]: time="2026-03-04T08:52:54.660675428Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=28148896"
Mar 4 08:52:54.663587 containerd[1618]: time="2026-03-04T08:52:54.663495283Z" level=info msg="ImageCreate event name:\"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:54.667034 containerd[1618]: time="2026-03-04T08:52:54.666984580Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:54.667995 containerd[1618]: time="2026-03-04T08:52:54.667947065Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"28147889\" in 1.222435315s"
Mar 4 08:52:54.667995 containerd[1618]: time="2026-03-04T08:52:54.667985465Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\""
Mar 4 08:52:54.668745 containerd[1618]: time="2026-03-04T08:52:54.668722629Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 4 08:52:55.168348 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1531012951.mount: Deactivated successfully.
Mar 4 08:52:55.939145 containerd[1618]: time="2026-03-04T08:52:55.939073657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:55.939980 containerd[1618]: time="2026-03-04T08:52:55.939939621Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209"
Mar 4 08:52:55.943028 containerd[1618]: time="2026-03-04T08:52:55.942993796Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:55.946211 containerd[1618]: time="2026-03-04T08:52:55.946108492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:55.947125 containerd[1618]: time="2026-03-04T08:52:55.947078217Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.278329228s"
Mar 4 08:52:55.947125 containerd[1618]: time="2026-03-04T08:52:55.947108497Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Mar 4 08:52:55.947589 containerd[1618]: time="2026-03-04T08:52:55.947559899Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 4 08:52:56.408205 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3432158838.mount: Deactivated successfully.
Mar 4 08:52:56.415041 containerd[1618]: time="2026-03-04T08:52:56.414993650Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 08:52:56.415941 containerd[1618]: time="2026-03-04T08:52:56.415906775Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Mar 4 08:52:56.417131 containerd[1618]: time="2026-03-04T08:52:56.417090381Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 08:52:56.419130 containerd[1618]: time="2026-03-04T08:52:56.419084791Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 08:52:56.419769 containerd[1618]: time="2026-03-04T08:52:56.419688434Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 472.098614ms"
Mar 4 08:52:56.419769 containerd[1618]: time="2026-03-04T08:52:56.419724874Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 4 08:52:56.420223 containerd[1618]: time="2026-03-04T08:52:56.420195556Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 4 08:52:56.935564 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1876467076.mount: Deactivated successfully.
Mar 4 08:52:57.609275 containerd[1618]: time="2026-03-04T08:52:57.609212262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:57.612227 containerd[1618]: time="2026-03-04T08:52:57.612182997Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=21885878"
Mar 4 08:52:57.613180 containerd[1618]: time="2026-03-04T08:52:57.613127002Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:57.616523 containerd[1618]: time="2026-03-04T08:52:57.616454979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:52:57.617531 containerd[1618]: time="2026-03-04T08:52:57.617485184Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"21882972\" in 1.197262108s"
Mar 4 08:52:57.617573 containerd[1618]: time="2026-03-04T08:52:57.617528024Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\""
Mar 4 08:53:00.933902 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 4 08:53:00.935283 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:01.220967 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:01.230589 (kubelet)[2397]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 4 08:53:01.233274 update_engine[1610]: I20260304 08:53:01.233206 1610 update_attempter.cc:509] Updating boot flags...
Mar 4 08:53:01.296276 kubelet[2397]: E0304 08:53:01.296211 2397 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 4 08:53:01.298946 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 4 08:53:01.299082 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 4 08:53:01.299437 systemd[1]: kubelet.service: Consumed 136ms CPU time, 105.9M memory peak.
Mar 4 08:53:01.802840 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:01.802972 systemd[1]: kubelet.service: Consumed 136ms CPU time, 105.9M memory peak.
Mar 4 08:53:01.804769 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:01.827920 systemd[1]: Reload requested from client PID 2428 ('systemctl') (unit session-11.scope)...
Mar 4 08:53:01.827937 systemd[1]: Reloading...
Mar 4 08:53:01.903207 zram_generator::config[2474]: No configuration found.
Mar 4 08:53:02.063865 systemd[1]: Reloading finished in 235 ms.
Mar 4 08:53:02.105817 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:02.107731 systemd[1]: kubelet.service: Deactivated successfully.
Mar 4 08:53:02.107976 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:02.108029 systemd[1]: kubelet.service: Consumed 93ms CPU time, 95.1M memory peak.
Mar 4 08:53:02.110397 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:02.818017 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:02.822762 (kubelet)[2521]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 4 08:53:02.851422 kubelet[2521]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 4 08:53:02.851422 kubelet[2521]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 4 08:53:02.851422 kubelet[2521]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 4 08:53:02.851753 kubelet[2521]: I0304 08:53:02.851454 2521 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 4 08:53:03.392960 kubelet[2521]: I0304 08:53:03.392893 2521 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 4 08:53:03.392960 kubelet[2521]: I0304 08:53:03.392939 2521 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 4 08:53:03.393204 kubelet[2521]: I0304 08:53:03.393158 2521 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 4 08:53:03.426720 kubelet[2521]: E0304 08:53:03.426675 2521 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.9.143:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.9.143:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 4 08:53:03.430602 kubelet[2521]: I0304 08:53:03.430537 2521 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 4 08:53:03.441102 kubelet[2521]: I0304 08:53:03.440216 2521 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 4 08:53:03.443931 kubelet[2521]: I0304 08:53:03.443891 2521 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 4 08:53:03.445416 kubelet[2521]: I0304 08:53:03.445366 2521 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 4 08:53:03.445553 kubelet[2521]: I0304 08:53:03.445408 2521 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-2-039fb286b9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 4 08:53:03.445646 kubelet[2521]: I0304 08:53:03.445558 2521 topology_manager.go:138] "Creating topology manager with none policy"
Mar 4 08:53:03.445646 kubelet[2521]: I0304 08:53:03.445567 2521 container_manager_linux.go:303] "Creating device plugin manager"
Mar 4 08:53:03.445785 kubelet[2521]: I0304 08:53:03.445769 2521 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 08:53:03.450942 kubelet[2521]: I0304 08:53:03.450898 2521 kubelet.go:480] "Attempting to sync node with API server"
Mar 4 08:53:03.450942 kubelet[2521]: I0304 08:53:03.450924 2521 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 4 08:53:03.450942 kubelet[2521]: I0304 08:53:03.450948 2521 kubelet.go:386] "Adding apiserver pod source"
Mar 4 08:53:03.451049 kubelet[2521]: I0304 08:53:03.450963 2521 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 4 08:53:03.453677 kubelet[2521]: I0304 08:53:03.453654 2521 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 4 08:53:03.454376 kubelet[2521]: I0304 08:53:03.454290 2521 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 4 08:53:03.454449 kubelet[2521]: W0304 08:53:03.454436 2521 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 4 08:53:03.455194 kubelet[2521]: E0304 08:53:03.454984 2521 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.9.143:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-2-039fb286b9&limit=500&resourceVersion=0\": dial tcp 10.0.9.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 4 08:53:03.455565 kubelet[2521]: E0304 08:53:03.455477 2521 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.9.143:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.9.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 4 08:53:03.456833 kubelet[2521]: I0304 08:53:03.456793 2521 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 4 08:53:03.456833 kubelet[2521]: I0304 08:53:03.456838 2521 server.go:1289] "Started kubelet"
Mar 4 08:53:03.457023 kubelet[2521]: I0304 08:53:03.456990 2521 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 4 08:53:03.457921 kubelet[2521]: I0304 08:53:03.457901 2521 server.go:317] "Adding debug handlers to kubelet server"
Mar 4 08:53:03.459592 kubelet[2521]: I0304 08:53:03.459543 2521 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 4 08:53:03.459888 kubelet[2521]: I0304 08:53:03.459871 2521 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 4 08:53:03.460214 kubelet[2521]: I0304 08:53:03.460194 2521 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 4 08:53:03.460500 kubelet[2521]: I0304 08:53:03.460484 2521 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 4 08:53:03.463728 kubelet[2521]: E0304 08:53:03.463697 2521 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-2-039fb286b9\" not found"
Mar 4 08:53:03.463832 kubelet[2521]: I0304 08:53:03.463821 2521 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 4 08:53:03.464040 kubelet[2521]: I0304 08:53:03.464022 2521 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 4 08:53:03.464192 kubelet[2521]: I0304 08:53:03.464157 2521 reconciler.go:26] "Reconciler: start to sync state"
Mar 4 08:53:03.464323 kubelet[2521]: E0304 08:53:03.464286 2521 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-2-039fb286b9?timeout=10s\": dial tcp 10.0.9.143:6443: connect: connection refused" interval="200ms"
Mar 4 08:53:03.465373 kubelet[2521]: I0304 08:53:03.465322 2521 factory.go:223] Registration of the systemd container factory successfully
Mar 4 08:53:03.465443 kubelet[2521]: I0304 08:53:03.465414 2521 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 4 08:53:03.465551 kubelet[2521]: E0304 08:53:03.465527 2521 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.9.143:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.9.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 4 08:53:03.471538 kubelet[2521]: E0304 08:53:03.464326 2521 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.9.143:6443/api/v1/namespaces/default/events\": dial tcp 10.0.9.143:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-2-039fb286b9.18999763d9b0f537 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-2-039fb286b9,UID:ci-4459-2-4-2-039fb286b9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-2-039fb286b9,},FirstTimestamp:2026-03-04 08:53:03.456810295 +0000 UTC m=+0.630745557,LastTimestamp:2026-03-04 08:53:03.456810295 +0000 UTC m=+0.630745557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-2-039fb286b9,}"
Mar 4 08:53:03.472801 kubelet[2521]: E0304 08:53:03.472759 2521 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 4 08:53:03.472922 kubelet[2521]: I0304 08:53:03.472891 2521 factory.go:223] Registration of the containerd container factory successfully
Mar 4 08:53:03.482831 kubelet[2521]: I0304 08:53:03.482810 2521 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 4 08:53:03.482831 kubelet[2521]: I0304 08:53:03.482827 2521 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 4 08:53:03.482931 kubelet[2521]: I0304 08:53:03.482845 2521 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 08:53:03.486527 kubelet[2521]: I0304 08:53:03.485627 2521 policy_none.go:49] "None policy: Start"
Mar 4 08:53:03.486527 kubelet[2521]: I0304 08:53:03.485672 2521 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 4 08:53:03.486527 kubelet[2521]: I0304 08:53:03.485684 2521 state_mem.go:35] "Initializing new in-memory state store"
Mar 4 08:53:03.488932 kubelet[2521]: I0304 08:53:03.488886 2521 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 4 08:53:03.490109 kubelet[2521]: I0304 08:53:03.490052 2521 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 4 08:53:03.490109 kubelet[2521]: I0304 08:53:03.490075 2521 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 4 08:53:03.490109 kubelet[2521]: I0304 08:53:03.490095 2521 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 4 08:53:03.490109 kubelet[2521]: I0304 08:53:03.490102 2521 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 4 08:53:03.490452 kubelet[2521]: E0304 08:53:03.490139 2521 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 4 08:53:03.491673 kubelet[2521]: E0304 08:53:03.491649 2521 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.9.143:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.9.143:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 4 08:53:03.492591 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 4 08:53:03.507194 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 4 08:53:03.509892 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 4 08:53:03.532163 kubelet[2521]: E0304 08:53:03.531632 2521 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 4 08:53:03.532163 kubelet[2521]: I0304 08:53:03.531824 2521 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 4 08:53:03.532163 kubelet[2521]: I0304 08:53:03.531836 2521 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 4 08:53:03.532163 kubelet[2521]: I0304 08:53:03.532098 2521 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 4 08:53:03.533576 kubelet[2521]: E0304 08:53:03.533489 2521 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 4 08:53:03.533710 kubelet[2521]: E0304 08:53:03.533696 2521 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-2-039fb286b9\" not found"
Mar 4 08:53:03.599994 systemd[1]: Created slice kubepods-burstable-pod3bbc586856608510e5160b2ae3647978.slice - libcontainer container kubepods-burstable-pod3bbc586856608510e5160b2ae3647978.slice.
Mar 4 08:53:03.634430 kubelet[2521]: I0304 08:53:03.634391 2521 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.634738 kubelet[2521]: E0304 08:53:03.634714 2521 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.143:6443/api/v1/nodes\": dial tcp 10.0.9.143:6443: connect: connection refused" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.634738 kubelet[2521]: E0304 08:53:03.634731 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.637331 systemd[1]: Created slice kubepods-burstable-podedad63d7700610d0acdd1669487da661.slice - libcontainer container kubepods-burstable-podedad63d7700610d0acdd1669487da661.slice.
Mar 4 08:53:03.638792 kubelet[2521]: E0304 08:53:03.638765 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.640993 systemd[1]: Created slice kubepods-burstable-podf6d20466e2868a1aa79fbcbfb7111232.slice - libcontainer container kubepods-burstable-podf6d20466e2868a1aa79fbcbfb7111232.slice.
Mar 4 08:53:03.642451 kubelet[2521]: E0304 08:53:03.642419 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.665152 kubelet[2521]: E0304 08:53:03.665058 2521 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-2-039fb286b9?timeout=10s\": dial tcp 10.0.9.143:6443: connect: connection refused" interval="400ms"
Mar 4 08:53:03.666169 kubelet[2521]: I0304 08:53:03.666123 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666219 kubelet[2521]: I0304 08:53:03.666161 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666219 kubelet[2521]: I0304 08:53:03.666190 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666552 kubelet[2521]: I0304 08:53:03.666475 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666589 kubelet[2521]: I0304 08:53:03.666564 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666589 kubelet[2521]: I0304 08:53:03.666582 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666630 kubelet[2521]: I0304 08:53:03.666600 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edad63d7700610d0acdd1669487da661-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-2-039fb286b9\" (UID: \"edad63d7700610d0acdd1669487da661\") " pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666630 kubelet[2521]: I0304 08:53:03.666622 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.666665 kubelet[2521]: I0304 08:53:03.666639 2521 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.837040 kubelet[2521]: I0304 08:53:03.837004 2521 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.837367 kubelet[2521]: E0304 08:53:03.837340 2521 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.143:6443/api/v1/nodes\": dial tcp 10.0.9.143:6443: connect: connection refused" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:03.936224 containerd[1618]: time="2026-03-04T08:53:03.936151684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-2-039fb286b9,Uid:3bbc586856608510e5160b2ae3647978,Namespace:kube-system,Attempt:0,}"
Mar 4 08:53:03.939739 containerd[1618]: time="2026-03-04T08:53:03.939509061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-2-039fb286b9,Uid:edad63d7700610d0acdd1669487da661,Namespace:kube-system,Attempt:0,}"
Mar 4 08:53:03.943944 containerd[1618]: time="2026-03-04T08:53:03.943913124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-2-039fb286b9,Uid:f6d20466e2868a1aa79fbcbfb7111232,Namespace:kube-system,Attempt:0,}"
Mar 4 08:53:03.962375 containerd[1618]: time="2026-03-04T08:53:03.962276337Z" level=info msg="connecting to shim ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973" address="unix:///run/containerd/s/5dae65718c556d5dffec4ae52bcab057335f21bf2d3300fe635fa5bd4a8a2b56" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:03.970752 containerd[1618]: time="2026-03-04T08:53:03.970700860Z" level=info msg="connecting to shim 50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54" address="unix:///run/containerd/s/ef941cc08e187e203b363fe26719b3e9f97803a10475f978f424fdeba5bdcfa3" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:03.975429 containerd[1618]: time="2026-03-04T08:53:03.975314523Z" level=info msg="connecting to shim bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd" address="unix:///run/containerd/s/aad77d73ad9fdfab2a93bbb373b1fc73e2b61ff49d81750433e77dad9dc6227e" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:03.993362 systemd[1]: Started cri-containerd-ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973.scope - libcontainer container ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973.
Mar 4 08:53:03.996859 systemd[1]: Started cri-containerd-50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54.scope - libcontainer container 50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54.
Mar 4 08:53:03.998038 systemd[1]: Started cri-containerd-bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd.scope - libcontainer container bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd.
Mar 4 08:53:04.039543 containerd[1618]: time="2026-03-04T08:53:04.039484288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-2-039fb286b9,Uid:edad63d7700610d0acdd1669487da661,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973\""
Mar 4 08:53:04.042862 containerd[1618]: time="2026-03-04T08:53:04.042727545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-2-039fb286b9,Uid:f6d20466e2868a1aa79fbcbfb7111232,Namespace:kube-system,Attempt:0,} returns sandbox id \"bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd\""
Mar 4 08:53:04.044794 containerd[1618]: time="2026-03-04T08:53:04.044760755Z" level=info msg="CreateContainer within sandbox \"ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 4 08:53:04.045403 containerd[1618]: time="2026-03-04T08:53:04.045373678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-2-039fb286b9,Uid:3bbc586856608510e5160b2ae3647978,Namespace:kube-system,Attempt:0,} returns sandbox id \"50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54\""
Mar 4 08:53:04.047585 containerd[1618]: time="2026-03-04T08:53:04.047333168Z" level=info msg="CreateContainer within sandbox \"bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 4 08:53:04.052944 containerd[1618]: time="2026-03-04T08:53:04.052883956Z" level=info msg="Container 1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:53:04.059245 containerd[1618]: time="2026-03-04T08:53:04.058710986Z" level=info msg="CreateContainer within sandbox \"50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 4 08:53:04.066465 containerd[1618]: time="2026-03-04T08:53:04.066393704Z" level=info msg="CreateContainer within sandbox \"bc990f2717a985b15212b0ca37077a03161c098ea35d054c669fbe6e30d39dbd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef\""
Mar 4 08:53:04.066946 kubelet[2521]: E0304 08:53:04.066916 2521 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.9.143:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-2-039fb286b9?timeout=10s\": dial tcp 10.0.9.143:6443: connect: connection refused" interval="800ms"
Mar 4 08:53:04.067514 containerd[1618]: time="2026-03-04T08:53:04.067155108Z" level=info msg="StartContainer for \"1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef\""
Mar 4 08:53:04.067665 containerd[1618]: time="2026-03-04T08:53:04.067402350Z" level=info msg="Container 6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:53:04.069870 containerd[1618]: time="2026-03-04T08:53:04.069841282Z" level=info msg="connecting to shim 1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef" address="unix:///run/containerd/s/aad77d73ad9fdfab2a93bbb373b1fc73e2b61ff49d81750433e77dad9dc6227e" protocol=ttrpc version=3
Mar 4 08:53:04.071744 containerd[1618]: time="2026-03-04T08:53:04.071711171Z" level=info msg="Container 881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:53:04.075373 containerd[1618]: time="2026-03-04T08:53:04.075333550Z" level=info msg="CreateContainer within sandbox \"ed3dc60f26ec781aed47dfc7aa19e59d9029012b049aa5701228f373f69a5973\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0\""
Mar 4 08:53:04.075854 containerd[1618]: time="2026-03-04T08:53:04.075817472Z" level=info msg="StartContainer for \"6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0\""
Mar 4 08:53:04.077154 containerd[1618]: time="2026-03-04T08:53:04.077053518Z" level=info msg="connecting to shim 6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0" address="unix:///run/containerd/s/5dae65718c556d5dffec4ae52bcab057335f21bf2d3300fe635fa5bd4a8a2b56" protocol=ttrpc version=3
Mar 4 08:53:04.080844 containerd[1618]: time="2026-03-04T08:53:04.080803537Z" level=info msg="CreateContainer within sandbox \"50c3427d0c71fc9022bd76d83620385c85526aa7c9de617e26aa4c194877fe54\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d\""
Mar 4 08:53:04.081498 containerd[1618]: time="2026-03-04T08:53:04.081467381Z" level=info msg="StartContainer for \"881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d\""
Mar 4 08:53:04.083313 containerd[1618]: time="2026-03-04T08:53:04.083285510Z" level=info msg="connecting to shim 881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d" address="unix:///run/containerd/s/ef941cc08e187e203b363fe26719b3e9f97803a10475f978f424fdeba5bdcfa3" protocol=ttrpc version=3
Mar 4 08:53:04.090365 systemd[1]: Started cri-containerd-1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef.scope - libcontainer container 1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef.
Mar 4 08:53:04.093223 systemd[1]: Started cri-containerd-6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0.scope - libcontainer container 6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0.
Mar 4 08:53:04.112306 systemd[1]: Started cri-containerd-881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d.scope - libcontainer container 881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d.
Mar 4 08:53:04.150469 containerd[1618]: time="2026-03-04T08:53:04.150393490Z" level=info msg="StartContainer for \"1c9c661c9461e8ed191dabcdfd9deb3aea3cec5d3afd2efe891122e8e41763ef\" returns successfully"
Mar 4 08:53:04.150717 containerd[1618]: time="2026-03-04T08:53:04.150655891Z" level=info msg="StartContainer for \"6966fb3a7b1498afe4f61f903948348a871faab655e2bab2232767f8942b70f0\" returns successfully"
Mar 4 08:53:04.162420 containerd[1618]: time="2026-03-04T08:53:04.162333031Z" level=info msg="StartContainer for \"881eadac760ef1ce372c9bd767db2722c84fc3bb6e5822fb921f0a4e646bff5d\" returns successfully"
Mar 4 08:53:04.240282 kubelet[2521]: I0304 08:53:04.240040 2521 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:04.240487 kubelet[2521]: E0304 08:53:04.240389 2521 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.9.143:6443/api/v1/nodes\": dial tcp 10.0.9.143:6443: connect: connection refused" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:04.498024 kubelet[2521]: E0304 08:53:04.497911 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:04.501792 kubelet[2521]: E0304 08:53:04.501744 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:04.504450 kubelet[2521]: E0304 08:53:04.504428 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:05.042433 kubelet[2521]: I0304 08:53:05.042399 2521 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:05.506386 kubelet[2521]: E0304 08:53:05.506353 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:05.507183 kubelet[2521]: E0304 08:53:05.506827 2521 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.185264 kubelet[2521]: E0304 08:53:06.185223 2521 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-2-039fb286b9\" not found" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.268753 kubelet[2521]: I0304 08:53:06.268508 2521 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.364792 kubelet[2521]: I0304 08:53:06.364739 2521 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.369679 kubelet[2521]: E0304 08:53:06.369539 2521 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-2-039fb286b9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.369679 kubelet[2521]: I0304 08:53:06.369570 2521 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.371581 kubelet[2521]: E0304 08:53:06.371542 2521 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.371581 kubelet[2521]: I0304 08:53:06.371573 2521 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.373600 kubelet[2521]: E0304 08:53:06.373562 2521 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:06.454279 kubelet[2521]: I0304 08:53:06.454057 2521 apiserver.go:52] "Watching apiserver"
Mar 4 08:53:06.465127 kubelet[2521]: I0304 08:53:06.465059 2521 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 4 08:53:07.153681 kubelet[2521]: I0304 08:53:07.153443 2521 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:07.204699 kubelet[2521]: I0304 08:53:07.204665 2521 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:08.170012 systemd[1]: Reload requested from client PID 2806 ('systemctl') (unit session-11.scope)...
Mar 4 08:53:08.170026 systemd[1]: Reloading...
Mar 4 08:53:08.250203 zram_generator::config[2852]: No configuration found.
Mar 4 08:53:08.423250 systemd[1]: Reloading finished in 252 ms.
Mar 4 08:53:08.442672 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:08.462204 systemd[1]: kubelet.service: Deactivated successfully.
Mar 4 08:53:08.462449 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:08.462515 systemd[1]: kubelet.service: Consumed 994ms CPU time, 128.7M memory peak.
Mar 4 08:53:08.464228 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 08:53:08.611727 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 08:53:08.624920 (kubelet)[2894]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 4 08:53:08.841115 kubelet[2894]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 4 08:53:08.841115 kubelet[2894]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 4 08:53:08.841115 kubelet[2894]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 4 08:53:08.841486 kubelet[2894]: I0304 08:53:08.841146 2894 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 4 08:53:08.848341 kubelet[2894]: I0304 08:53:08.848304 2894 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 4 08:53:08.848341 kubelet[2894]: I0304 08:53:08.848335 2894 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 4 08:53:08.848596 kubelet[2894]: I0304 08:53:08.848579 2894 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 4 08:53:08.849890 kubelet[2894]: I0304 08:53:08.849874 2894 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 4 08:53:08.852064 kubelet[2894]: I0304 08:53:08.852042 2894 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 4 08:53:08.855562 kubelet[2894]: I0304 08:53:08.855517 2894 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 4 08:53:08.858248 kubelet[2894]: I0304 08:53:08.858228 2894 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 4 08:53:08.858434 kubelet[2894]: I0304 08:53:08.858410 2894 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 4 08:53:08.858588 kubelet[2894]: I0304 08:53:08.858432 2894 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-2-039fb286b9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 4 08:53:08.858662 kubelet[2894]: I0304 08:53:08.858595 2894 topology_manager.go:138] "Creating topology manager with none policy"
Mar 4 08:53:08.858662 kubelet[2894]: I0304 08:53:08.858605 2894 container_manager_linux.go:303] "Creating device plugin manager"
Mar 4 08:53:08.858706 kubelet[2894]: I0304 08:53:08.858669 2894 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 08:53:08.858834 kubelet[2894]: I0304 08:53:08.858822 2894 kubelet.go:480] "Attempting to sync node with API server"
Mar 4 08:53:08.858866 kubelet[2894]: I0304 08:53:08.858836 2894 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 4 08:53:08.858866 kubelet[2894]: I0304 08:53:08.858864 2894 kubelet.go:386] "Adding apiserver pod source"
Mar 4 08:53:08.858918 kubelet[2894]: I0304 08:53:08.858879 2894 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 4 08:53:08.862193 kubelet[2894]: I0304 08:53:08.860956 2894 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 4 08:53:08.862193 kubelet[2894]: I0304 08:53:08.861714 2894 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 4 08:53:08.863834 kubelet[2894]: I0304 08:53:08.863802 2894 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 4 08:53:08.863909 kubelet[2894]: I0304 08:53:08.863857 2894 server.go:1289] "Started kubelet"
Mar 4 08:53:08.864916 kubelet[2894]: I0304 08:53:08.864722 2894 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 4 08:53:08.868417 kubelet[2894]: I0304 08:53:08.866369 2894 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 4 08:53:08.868417 kubelet[2894]: I0304 08:53:08.866440 2894 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 4 08:53:08.872048 kubelet[2894]: I0304 08:53:08.872015 2894 server.go:317] "Adding debug handlers to kubelet server"
Mar 4 08:53:08.875241 kubelet[2894]: I0304 08:53:08.875201 2894 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 4 08:53:08.878084 kubelet[2894]: I0304 08:53:08.877849 2894 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 4 08:53:08.884769 kubelet[2894]: E0304 08:53:08.881855 2894 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-2-039fb286b9\" not found"
Mar 4 08:53:08.884769 kubelet[2894]: I0304 08:53:08.881890 2894 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 4 08:53:08.884769 kubelet[2894]: I0304 08:53:08.882186 2894 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 4 08:53:08.884769 kubelet[2894]: I0304 08:53:08.882404 2894 reconciler.go:26] "Reconciler: start to sync state"
Mar 4 08:53:08.885626 kubelet[2894]: E0304 08:53:08.885003 2894 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 4 08:53:08.885626 kubelet[2894]: I0304 08:53:08.885444 2894 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 4 08:53:08.890351 kubelet[2894]: I0304 08:53:08.890311 2894 factory.go:223] Registration of the containerd container factory successfully
Mar 4 08:53:08.890351 kubelet[2894]: I0304 08:53:08.890335 2894 factory.go:223] Registration of the systemd container factory successfully
Mar 4 08:53:08.897193 kubelet[2894]: I0304 08:53:08.897139 2894 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 4 08:53:08.898598 kubelet[2894]: I0304 08:53:08.898576 2894 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 4 08:53:08.898767 kubelet[2894]: I0304 08:53:08.898753 2894 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 4 08:53:08.898828 kubelet[2894]: I0304 08:53:08.898819 2894 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 4 08:53:08.898874 kubelet[2894]: I0304 08:53:08.898867 2894 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 4 08:53:08.898958 kubelet[2894]: E0304 08:53:08.898941 2894 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 4 08:53:08.929507 kubelet[2894]: I0304 08:53:08.929476 2894 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 4 08:53:08.929507 kubelet[2894]: I0304 08:53:08.929498 2894 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 4 08:53:08.929650 kubelet[2894]: I0304 08:53:08.929521 2894 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 08:53:08.929732 kubelet[2894]: I0304 08:53:08.929676 2894 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 4 08:53:08.929765 kubelet[2894]: I0304 08:53:08.929700 2894 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 4 08:53:08.929787 kubelet[2894]: I0304 08:53:08.929757 2894 policy_none.go:49] "None policy: Start"
Mar 4 08:53:08.929787 kubelet[2894]: I0304 08:53:08.929780 2894 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 4 08:53:08.929835 kubelet[2894]: I0304 08:53:08.929806 2894 state_mem.go:35] "Initializing new in-memory state store"
Mar 4 08:53:08.929956 kubelet[2894]: I0304 08:53:08.929941 2894 state_mem.go:75] "Updated machine memory state"
Mar 4 08:53:08.934067 kubelet[2894]: E0304 08:53:08.934042 2894 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 4 08:53:08.934347 kubelet[2894]: I0304 08:53:08.934216 2894 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 4 08:53:08.934347 kubelet[2894]: I0304 08:53:08.934232 2894 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 4 08:53:08.934480 kubelet[2894]: I0304 08:53:08.934454 2894 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 4 08:53:08.935768 kubelet[2894]: E0304 08:53:08.935397 2894 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 4 08:53:09.000324 kubelet[2894]: I0304 08:53:09.000271 2894 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.001050 kubelet[2894]: I0304 08:53:09.000995 2894 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.001136 kubelet[2894]: I0304 08:53:09.001071 2894 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.009029 kubelet[2894]: E0304 08:53:09.008993 2894 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-2-039fb286b9\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.009202 kubelet[2894]: E0304 08:53:09.008998 2894 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.037378 kubelet[2894]: I0304 08:53:09.037347 2894 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.048338 kubelet[2894]: I0304 08:53:09.048302 2894 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.048461 kubelet[2894]: I0304 08:53:09.048391 2894 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.084551 kubelet[2894]: I0304 08:53:09.084502 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.084759 kubelet[2894]: I0304 08:53:09.084561 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edad63d7700610d0acdd1669487da661-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-2-039fb286b9\" (UID: \"edad63d7700610d0acdd1669487da661\") " pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.084759 kubelet[2894]: I0304 08:53:09.084622 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.084759 kubelet[2894]: I0304 08:53:09.084708 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.084759 kubelet[2894]: I0304 08:53:09.084755 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.085089 kubelet[2894]: I0304 08:53:09.084777 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.085089 kubelet[2894]: I0304 08:53:09.084833 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.085089 kubelet[2894]: I0304 08:53:09.084861 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f6d20466e2868a1aa79fbcbfb7111232-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" (UID: \"f6d20466e2868a1aa79fbcbfb7111232\") " pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.085089 kubelet[2894]: I0304 08:53:09.084881 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3bbc586856608510e5160b2ae3647978-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-2-039fb286b9\" (UID: \"3bbc586856608510e5160b2ae3647978\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.859215 kubelet[2894]: I0304 08:53:09.859139 2894 apiserver.go:52] "Watching apiserver"
Mar 4 08:53:09.883127 kubelet[2894]: I0304 08:53:09.883086 2894 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 4 08:53:09.913713 kubelet[2894]: I0304 08:53:09.913475 2894 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.921242 kubelet[2894]: E0304 08:53:09.921139 2894 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-2-039fb286b9\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:09.967606 kubelet[2894]: I0304 08:53:09.967527 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-2-039fb286b9" podStartSLOduration=2.967497969 podStartE2EDuration="2.967497969s" podCreationTimestamp="2026-03-04 08:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:53:09.95392402 +0000 UTC m=+1.324252872" watchObservedRunningTime="2026-03-04 08:53:09.967497969 +0000 UTC m=+1.337826741"
Mar 4 08:53:09.983728 kubelet[2894]: I0304 08:53:09.983608 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-2-039fb286b9" podStartSLOduration=2.98359121 podStartE2EDuration="2.98359121s" podCreationTimestamp="2026-03-04 08:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:53:09.968340853 +0000 UTC m=+1.338669665" watchObservedRunningTime="2026-03-04 08:53:09.98359121 +0000 UTC m=+1.353920022"
Mar 4 08:53:09.983728 kubelet[2894]: I0304 08:53:09.983717 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-2-039fb286b9" podStartSLOduration=0.983712371 podStartE2EDuration="983.712371ms" podCreationTimestamp="2026-03-04 08:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:53:09.98353421 +0000 UTC m=+1.353862982" watchObservedRunningTime="2026-03-04 08:53:09.983712371 +0000 UTC m=+1.354041183"
Mar 4 08:53:13.372119 kubelet[2894]: I0304 08:53:13.372075 2894 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 4 08:53:13.372489 containerd[1618]: time="2026-03-04T08:53:13.372422783Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 4 08:53:13.372651 kubelet[2894]: I0304 08:53:13.372594 2894 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 4 08:53:14.112320 systemd[1]: Created slice kubepods-besteffort-podfef24052_f850_4251_8106_74563e8ba328.slice - libcontainer container kubepods-besteffort-podfef24052_f850_4251_8106_74563e8ba328.slice.
Mar 4 08:53:14.116644 kubelet[2894]: I0304 08:53:14.116581 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/fef24052-f850-4251-8106-74563e8ba328-kube-proxy\") pod \"kube-proxy-7xn57\" (UID: \"fef24052-f850-4251-8106-74563e8ba328\") " pod="kube-system/kube-proxy-7xn57" Mar 4 08:53:14.116644 kubelet[2894]: I0304 08:53:14.116622 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fef24052-f850-4251-8106-74563e8ba328-xtables-lock\") pod \"kube-proxy-7xn57\" (UID: \"fef24052-f850-4251-8106-74563e8ba328\") " pod="kube-system/kube-proxy-7xn57" Mar 4 08:53:14.116644 kubelet[2894]: I0304 08:53:14.116650 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fef24052-f850-4251-8106-74563e8ba328-lib-modules\") pod \"kube-proxy-7xn57\" (UID: \"fef24052-f850-4251-8106-74563e8ba328\") " pod="kube-system/kube-proxy-7xn57" Mar 4 08:53:14.116804 kubelet[2894]: I0304 08:53:14.116734 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdrnk\" (UniqueName: \"kubernetes.io/projected/fef24052-f850-4251-8106-74563e8ba328-kube-api-access-fdrnk\") pod \"kube-proxy-7xn57\" (UID: \"fef24052-f850-4251-8106-74563e8ba328\") " pod="kube-system/kube-proxy-7xn57" Mar 4 08:53:14.225099 kubelet[2894]: E0304 08:53:14.225059 2894 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 4 08:53:14.225099 kubelet[2894]: E0304 08:53:14.225094 2894 projected.go:194] Error preparing data for projected volume kube-api-access-fdrnk for pod kube-system/kube-proxy-7xn57: configmap "kube-root-ca.crt" not found Mar 4 08:53:14.225261 kubelet[2894]: E0304 08:53:14.225197 2894 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fef24052-f850-4251-8106-74563e8ba328-kube-api-access-fdrnk podName:fef24052-f850-4251-8106-74563e8ba328 nodeName:}" failed. No retries permitted until 2026-03-04 08:53:14.725141744 +0000 UTC m=+6.095470556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fdrnk" (UniqueName: "kubernetes.io/projected/fef24052-f850-4251-8106-74563e8ba328-kube-api-access-fdrnk") pod "kube-proxy-7xn57" (UID: "fef24052-f850-4251-8106-74563e8ba328") : configmap "kube-root-ca.crt" not found Mar 4 08:53:14.626133 systemd[1]: Created slice kubepods-besteffort-podc273e9f0_43f3_4100_a011_b4840683c333.slice - libcontainer container kubepods-besteffort-podc273e9f0_43f3_4100_a011_b4840683c333.slice. Mar 4 08:53:14.719512 kubelet[2894]: I0304 08:53:14.719467 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnkg\" (UniqueName: \"kubernetes.io/projected/c273e9f0-43f3-4100-a011-b4840683c333-kube-api-access-8fnkg\") pod \"tigera-operator-6bf85f8dd-2rbj8\" (UID: \"c273e9f0-43f3-4100-a011-b4840683c333\") " pod="tigera-operator/tigera-operator-6bf85f8dd-2rbj8" Mar 4 08:53:14.719512 kubelet[2894]: I0304 08:53:14.719511 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c273e9f0-43f3-4100-a011-b4840683c333-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-2rbj8\" (UID: \"c273e9f0-43f3-4100-a011-b4840683c333\") " pod="tigera-operator/tigera-operator-6bf85f8dd-2rbj8" Mar 4 08:53:14.930650 containerd[1618]: time="2026-03-04T08:53:14.930394198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-2rbj8,Uid:c273e9f0-43f3-4100-a011-b4840683c333,Namespace:tigera-operator,Attempt:0,}" Mar 4 08:53:14.948726 containerd[1618]: time="2026-03-04T08:53:14.948676731Z" level=info msg="connecting 
to shim c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348" address="unix:///run/containerd/s/3834657e1c3171cc9388d82c57b0af6125863bed51b9728dfb5b8acf50830a72" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:14.973584 systemd[1]: Started cri-containerd-c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348.scope - libcontainer container c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348. Mar 4 08:53:15.001981 containerd[1618]: time="2026-03-04T08:53:15.001936641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-2rbj8,Uid:c273e9f0-43f3-4100-a011-b4840683c333,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348\"" Mar 4 08:53:15.004100 containerd[1618]: time="2026-03-04T08:53:15.004071411Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 4 08:53:15.031984 containerd[1618]: time="2026-03-04T08:53:15.031939393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7xn57,Uid:fef24052-f850-4251-8106-74563e8ba328,Namespace:kube-system,Attempt:0,}" Mar 4 08:53:15.047929 containerd[1618]: time="2026-03-04T08:53:15.047871393Z" level=info msg="connecting to shim a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885" address="unix:///run/containerd/s/c4ae23931f97724538d7c80ba734eba9028b1b0aaad4b0b037689331690f90e3" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:15.073404 systemd[1]: Started cri-containerd-a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885.scope - libcontainer container a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885. 
Mar 4 08:53:15.095203 containerd[1618]: time="2026-03-04T08:53:15.095151393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7xn57,Uid:fef24052-f850-4251-8106-74563e8ba328,Namespace:kube-system,Attempt:0,} returns sandbox id \"a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885\"" Mar 4 08:53:15.099933 containerd[1618]: time="2026-03-04T08:53:15.099892817Z" level=info msg="CreateContainer within sandbox \"a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 4 08:53:15.109930 containerd[1618]: time="2026-03-04T08:53:15.109883188Z" level=info msg="Container ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:15.117181 containerd[1618]: time="2026-03-04T08:53:15.117113664Z" level=info msg="CreateContainer within sandbox \"a8d99149b1ac1cebece8ca8836380895a9da68bc0c0c0b7cde29b948caf15885\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f\"" Mar 4 08:53:15.117763 containerd[1618]: time="2026-03-04T08:53:15.117700787Z" level=info msg="StartContainer for \"ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f\"" Mar 4 08:53:15.119251 containerd[1618]: time="2026-03-04T08:53:15.119212675Z" level=info msg="connecting to shim ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f" address="unix:///run/containerd/s/c4ae23931f97724538d7c80ba734eba9028b1b0aaad4b0b037689331690f90e3" protocol=ttrpc version=3 Mar 4 08:53:15.146373 systemd[1]: Started cri-containerd-ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f.scope - libcontainer container ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f. 
Mar 4 08:53:15.226406 containerd[1618]: time="2026-03-04T08:53:15.226310777Z" level=info msg="StartContainer for \"ba9cbdab54a10c975944b4bae3ad4665c8fcdd6175c40a9b18ce116d7618121f\" returns successfully" Mar 4 08:53:16.282721 kubelet[2894]: I0304 08:53:16.282648 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7xn57" podStartSLOduration=2.28263173 podStartE2EDuration="2.28263173s" podCreationTimestamp="2026-03-04 08:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:53:15.938057064 +0000 UTC m=+7.308385836" watchObservedRunningTime="2026-03-04 08:53:16.28263173 +0000 UTC m=+7.652960542" Mar 4 08:53:17.272824 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3804982242.mount: Deactivated successfully. Mar 4 08:53:18.283971 containerd[1618]: time="2026-03-04T08:53:18.283879911Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:18.285162 containerd[1618]: time="2026-03-04T08:53:18.285116798Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 4 08:53:18.286215 containerd[1618]: time="2026-03-04T08:53:18.286158763Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:18.291604 containerd[1618]: time="2026-03-04T08:53:18.291554350Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:18.292826 containerd[1618]: time="2026-03-04T08:53:18.292774836Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id 
\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 3.288666385s" Mar 4 08:53:18.292826 containerd[1618]: time="2026-03-04T08:53:18.292817597Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 4 08:53:18.297228 containerd[1618]: time="2026-03-04T08:53:18.297193379Z" level=info msg="CreateContainer within sandbox \"c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 4 08:53:18.303204 containerd[1618]: time="2026-03-04T08:53:18.302527566Z" level=info msg="Container 9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:18.308674 containerd[1618]: time="2026-03-04T08:53:18.308629837Z" level=info msg="CreateContainer within sandbox \"c30d5da52add232a70c3cce3520c5626836abfa94b7dee28177686949e866348\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b\"" Mar 4 08:53:18.309352 containerd[1618]: time="2026-03-04T08:53:18.309324840Z" level=info msg="StartContainer for \"9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b\"" Mar 4 08:53:18.310141 containerd[1618]: time="2026-03-04T08:53:18.310098484Z" level=info msg="connecting to shim 9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b" address="unix:///run/containerd/s/3834657e1c3171cc9388d82c57b0af6125863bed51b9728dfb5b8acf50830a72" protocol=ttrpc version=3 Mar 4 08:53:18.329480 systemd[1]: Started cri-containerd-9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b.scope - libcontainer container 
9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b. Mar 4 08:53:18.353501 containerd[1618]: time="2026-03-04T08:53:18.353460744Z" level=info msg="StartContainer for \"9dcae0aeeea8ee842a581a248a5c36f34ae3a4cba5926346a300336d9ec6824b\" returns successfully" Mar 4 08:53:18.942650 kubelet[2894]: I0304 08:53:18.942130 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-2rbj8" podStartSLOduration=1.652161056 podStartE2EDuration="4.942114647s" podCreationTimestamp="2026-03-04 08:53:14 +0000 UTC" firstStartedPulling="2026-03-04 08:53:15.003566249 +0000 UTC m=+6.373895061" lastFinishedPulling="2026-03-04 08:53:18.29351988 +0000 UTC m=+9.663848652" observedRunningTime="2026-03-04 08:53:18.942067047 +0000 UTC m=+10.312395859" watchObservedRunningTime="2026-03-04 08:53:18.942114647 +0000 UTC m=+10.312443459" Mar 4 08:53:23.615019 sudo[1918]: pam_unix(sudo:session): session closed for user root Mar 4 08:53:23.709281 sshd[1917]: Connection closed by 20.161.92.111 port 60734 Mar 4 08:53:23.709823 sshd-session[1914]: pam_unix(sshd:session): session closed for user core Mar 4 08:53:23.713304 systemd[1]: sshd@10-10.0.9.143:22-20.161.92.111:60734.service: Deactivated successfully. Mar 4 08:53:23.716121 systemd[1]: session-11.scope: Deactivated successfully. Mar 4 08:53:23.717070 systemd[1]: session-11.scope: Consumed 6.258s CPU time, 223.9M memory peak. Mar 4 08:53:23.718396 systemd-logind[1606]: Session 11 logged out. Waiting for processes to exit. Mar 4 08:53:23.719737 systemd-logind[1606]: Removed session 11. Mar 4 08:53:27.131073 systemd[1]: Created slice kubepods-besteffort-podd4695f47_24c2_4291_8f45_8f3673fee5e2.slice - libcontainer container kubepods-besteffort-podd4695f47_24c2_4291_8f45_8f3673fee5e2.slice. 
Mar 4 08:53:27.185427 systemd[1]: Created slice kubepods-besteffort-pod30004df9_7ca1_4f2e_8cab_91693fc9e877.slice - libcontainer container kubepods-besteffort-pod30004df9_7ca1_4f2e_8cab_91693fc9e877.slice. Mar 4 08:53:27.195828 kubelet[2894]: I0304 08:53:27.195710 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-cni-net-dir\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.195828 kubelet[2894]: I0304 08:53:27.195758 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4695f47-24c2-4291-8f45-8f3673fee5e2-tigera-ca-bundle\") pod \"calico-typha-f86b7d4d8-w9fpj\" (UID: \"d4695f47-24c2-4291-8f45-8f3673fee5e2\") " pod="calico-system/calico-typha-f86b7d4d8-w9fpj" Mar 4 08:53:27.195828 kubelet[2894]: I0304 08:53:27.195785 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-sys-fs\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.195828 kubelet[2894]: I0304 08:53:27.195825 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30004df9-7ca1-4f2e-8cab-91693fc9e877-tigera-ca-bundle\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.195828 kubelet[2894]: I0304 08:53:27.195840 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-var-lib-calico\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196329 kubelet[2894]: I0304 08:53:27.195856 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-xtables-lock\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196329 kubelet[2894]: I0304 08:53:27.195875 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5ts\" (UniqueName: \"kubernetes.io/projected/d4695f47-24c2-4291-8f45-8f3673fee5e2-kube-api-access-xp5ts\") pod \"calico-typha-f86b7d4d8-w9fpj\" (UID: \"d4695f47-24c2-4291-8f45-8f3673fee5e2\") " pod="calico-system/calico-typha-f86b7d4d8-w9fpj" Mar 4 08:53:27.196329 kubelet[2894]: I0304 08:53:27.195890 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-policysync\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196329 kubelet[2894]: I0304 08:53:27.195931 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d4695f47-24c2-4291-8f45-8f3673fee5e2-typha-certs\") pod \"calico-typha-f86b7d4d8-w9fpj\" (UID: \"d4695f47-24c2-4291-8f45-8f3673fee5e2\") " pod="calico-system/calico-typha-f86b7d4d8-w9fpj" Mar 4 08:53:27.196329 kubelet[2894]: I0304 08:53:27.195980 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-cni-bin-dir\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196436 kubelet[2894]: I0304 08:53:27.195998 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-flexvol-driver-host\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196436 kubelet[2894]: I0304 08:53:27.196032 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-lib-modules\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196436 kubelet[2894]: I0304 08:53:27.196061 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-nodeproc\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196436 kubelet[2894]: I0304 08:53:27.196083 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-var-run-calico\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196436 kubelet[2894]: I0304 08:53:27.196101 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp6lk\" (UniqueName: 
\"kubernetes.io/projected/30004df9-7ca1-4f2e-8cab-91693fc9e877-kube-api-access-jp6lk\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196536 kubelet[2894]: I0304 08:53:27.196131 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-bpffs\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196536 kubelet[2894]: I0304 08:53:27.196160 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/30004df9-7ca1-4f2e-8cab-91693fc9e877-node-certs\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.196536 kubelet[2894]: I0304 08:53:27.196195 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/30004df9-7ca1-4f2e-8cab-91693fc9e877-cni-log-dir\") pod \"calico-node-7svcw\" (UID: \"30004df9-7ca1-4f2e-8cab-91693fc9e877\") " pod="calico-system/calico-node-7svcw" Mar 4 08:53:27.285766 kubelet[2894]: E0304 08:53:27.285569 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:27.297028 kubelet[2894]: I0304 08:53:27.296972 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac86e3f5-5d3b-4978-8a8a-3c851693c8e7-registration-dir\") pod 
\"csi-node-driver-hcf7z\" (UID: \"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7\") " pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:27.297028 kubelet[2894]: I0304 08:53:27.297016 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ac86e3f5-5d3b-4978-8a8a-3c851693c8e7-varrun\") pod \"csi-node-driver-hcf7z\" (UID: \"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7\") " pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:27.297028 kubelet[2894]: I0304 08:53:27.297033 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcjb\" (UniqueName: \"kubernetes.io/projected/ac86e3f5-5d3b-4978-8a8a-3c851693c8e7-kube-api-access-pzcjb\") pod \"csi-node-driver-hcf7z\" (UID: \"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7\") " pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:27.297326 kubelet[2894]: I0304 08:53:27.297062 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac86e3f5-5d3b-4978-8a8a-3c851693c8e7-socket-dir\") pod \"csi-node-driver-hcf7z\" (UID: \"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7\") " pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:27.299821 kubelet[2894]: I0304 08:53:27.297162 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac86e3f5-5d3b-4978-8a8a-3c851693c8e7-kubelet-dir\") pod \"csi-node-driver-hcf7z\" (UID: \"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7\") " pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:27.299933 kubelet[2894]: E0304 08:53:27.299906 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.299933 kubelet[2894]: W0304 08:53:27.299925 2894 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.299987 kubelet[2894]: E0304 08:53:27.299953 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:27.300189 kubelet[2894]: E0304 08:53:27.300150 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.300189 kubelet[2894]: W0304 08:53:27.300179 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.300189 kubelet[2894]: E0304 08:53:27.300189 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:27.300458 kubelet[2894]: E0304 08:53:27.300356 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.300458 kubelet[2894]: W0304 08:53:27.300371 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.300458 kubelet[2894]: E0304 08:53:27.300379 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:27.300555 kubelet[2894]: E0304 08:53:27.300514 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.300555 kubelet[2894]: W0304 08:53:27.300522 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.300555 kubelet[2894]: E0304 08:53:27.300530 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:27.301008 kubelet[2894]: E0304 08:53:27.300670 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.301008 kubelet[2894]: W0304 08:53:27.300683 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.301008 kubelet[2894]: E0304 08:53:27.300692 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:27.301008 kubelet[2894]: E0304 08:53:27.300869 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.301008 kubelet[2894]: W0304 08:53:27.300876 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.301008 kubelet[2894]: E0304 08:53:27.300885 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:27.301496 kubelet[2894]: E0304 08:53:27.301461 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.301496 kubelet[2894]: W0304 08:53:27.301475 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.301496 kubelet[2894]: E0304 08:53:27.301489 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" [the driver-call.go:262 / driver-call.go:149 / plugins.go:703 FlexVolume error triplet repeats with advancing timestamps from 08:53:27.301 through 08:53:27.419, identical except for the timestamps; the intermediate repeats are elided] Mar 4 08:53:27.419007 kubelet[2894]: E0304 08:53:27.418974 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:27.419007 kubelet[2894]: W0304 08:53:27.418999 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:27.419149 kubelet[2894]: E0304 08:53:27.419018 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:27.434653 containerd[1618]: time="2026-03-04T08:53:27.434604361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f86b7d4d8-w9fpj,Uid:d4695f47-24c2-4291-8f45-8f3673fee5e2,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:27.451765 containerd[1618]: time="2026-03-04T08:53:27.451689408Z" level=info msg="connecting to shim fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342" address="unix:///run/containerd/s/370b17fa6611b00b8b478a24e26180e39788646dd9bbb41cad64d4751bbe7bda" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:27.472347 systemd[1]: Started cri-containerd-fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342.scope - libcontainer container fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342. Mar 4 08:53:27.489993 containerd[1618]: time="2026-03-04T08:53:27.489260278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7svcw,Uid:30004df9-7ca1-4f2e-8cab-91693fc9e877,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:27.509107 containerd[1618]: time="2026-03-04T08:53:27.509052259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f86b7d4d8-w9fpj,Uid:d4695f47-24c2-4291-8f45-8f3673fee5e2,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342\"" Mar 4 08:53:27.510521 containerd[1618]: time="2026-03-04T08:53:27.510498546Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 4 08:53:27.515327 containerd[1618]: time="2026-03-04T08:53:27.515284570Z" level=info msg="connecting to shim b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe" address="unix:///run/containerd/s/c49f814ee75629357f171d4754921aabd946d2b59a0161221a58773d79a9c12a" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:27.538351 systemd[1]: Started cri-containerd-b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe.scope - libcontainer 
container b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe. Mar 4 08:53:27.560741 containerd[1618]: time="2026-03-04T08:53:27.560702000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7svcw,Uid:30004df9-7ca1-4f2e-8cab-91693fc9e877,Namespace:calico-system,Attempt:0,} returns sandbox id \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\"" Mar 4 08:53:28.901074 kubelet[2894]: E0304 08:53:28.900104 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:29.108298 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1095805488.mount: Deactivated successfully. Mar 4 08:53:30.187234 containerd[1618]: time="2026-03-04T08:53:30.187181547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:30.188603 containerd[1618]: time="2026-03-04T08:53:30.188322353Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 4 08:53:30.189439 containerd[1618]: time="2026-03-04T08:53:30.189410718Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:30.192570 containerd[1618]: time="2026-03-04T08:53:30.192524214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:30.193331 containerd[1618]: time="2026-03-04T08:53:30.193301018Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" 
with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.682771872s" Mar 4 08:53:30.193403 containerd[1618]: time="2026-03-04T08:53:30.193332898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 4 08:53:30.194194 containerd[1618]: time="2026-03-04T08:53:30.194150022Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 4 08:53:30.203845 containerd[1618]: time="2026-03-04T08:53:30.203805111Z" level=info msg="CreateContainer within sandbox \"fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 4 08:53:30.213128 containerd[1618]: time="2026-03-04T08:53:30.212242674Z" level=info msg="Container 619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:30.219670 containerd[1618]: time="2026-03-04T08:53:30.219631672Z" level=info msg="CreateContainer within sandbox \"fb38f1cb88f32112bd997bac7f4d3a8ac12e7142e3a9f74fabec7c35fe15d342\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8\"" Mar 4 08:53:30.220490 containerd[1618]: time="2026-03-04T08:53:30.220456076Z" level=info msg="StartContainer for \"619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8\"" Mar 4 08:53:30.222290 containerd[1618]: time="2026-03-04T08:53:30.222236845Z" level=info msg="connecting to shim 619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8" address="unix:///run/containerd/s/370b17fa6611b00b8b478a24e26180e39788646dd9bbb41cad64d4751bbe7bda" protocol=ttrpc version=3 
Mar 4 08:53:30.244346 systemd[1]: Started cri-containerd-619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8.scope - libcontainer container 619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8. Mar 4 08:53:30.279577 containerd[1618]: time="2026-03-04T08:53:30.279539335Z" level=info msg="StartContainer for \"619322240f69c512266c7d5a5703b7e64b9dea8d184176b88324a457aeae13e8\" returns successfully" Mar 4 08:53:30.900189 kubelet[2894]: E0304 08:53:30.900035 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:30.968003 kubelet[2894]: I0304 08:53:30.967881 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f86b7d4d8-w9fpj" podStartSLOduration=1.284110386 podStartE2EDuration="3.967861783s" podCreationTimestamp="2026-03-04 08:53:27 +0000 UTC" firstStartedPulling="2026-03-04 08:53:27.510284225 +0000 UTC m=+18.880613037" lastFinishedPulling="2026-03-04 08:53:30.194035662 +0000 UTC m=+21.564364434" observedRunningTime="2026-03-04 08:53:30.96718682 +0000 UTC m=+22.337515672" watchObservedRunningTime="2026-03-04 08:53:30.967861783 +0000 UTC m=+22.338190595" Mar 4 08:53:31.012608 kubelet[2894]: E0304 08:53:31.012479 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.012608 kubelet[2894]: W0304 08:53:31.012503 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.012608 kubelet[2894]: E0304 08:53:31.012520 2894 plugins.go:703] "Error dynamically probing plugins" err="error 
creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.012856 kubelet[2894]: E0304 08:53:31.012843 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.012957 kubelet[2894]: W0304 08:53:31.012916 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.013022 kubelet[2894]: E0304 08:53:31.013001 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.013266 kubelet[2894]: E0304 08:53:31.013251 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.013348 kubelet[2894]: W0304 08:53:31.013335 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.013403 kubelet[2894]: E0304 08:53:31.013393 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.013992 kubelet[2894]: E0304 08:53:31.013601 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.013992 kubelet[2894]: W0304 08:53:31.013617 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.013992 kubelet[2894]: E0304 08:53:31.013627 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.017097 kubelet[2894]: E0304 08:53:31.014161 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.017097 kubelet[2894]: W0304 08:53:31.017093 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.017228 kubelet[2894]: E0304 08:53:31.017110 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.017435 kubelet[2894]: E0304 08:53:31.017407 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.017435 kubelet[2894]: W0304 08:53:31.017425 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.017496 kubelet[2894]: E0304 08:53:31.017437 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.017689 kubelet[2894]: E0304 08:53:31.017673 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.017689 kubelet[2894]: W0304 08:53:31.017688 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.017752 kubelet[2894]: E0304 08:53:31.017698 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.017873 kubelet[2894]: E0304 08:53:31.017855 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.017873 kubelet[2894]: W0304 08:53:31.017867 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.017917 kubelet[2894]: E0304 08:53:31.017876 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.018051 kubelet[2894]: E0304 08:53:31.018032 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.018051 kubelet[2894]: W0304 08:53:31.018050 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.018113 kubelet[2894]: E0304 08:53:31.018058 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.018220 kubelet[2894]: E0304 08:53:31.018200 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.018220 kubelet[2894]: W0304 08:53:31.018216 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.018282 kubelet[2894]: E0304 08:53:31.018225 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.018365 kubelet[2894]: E0304 08:53:31.018341 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.018365 kubelet[2894]: W0304 08:53:31.018357 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.018365 kubelet[2894]: E0304 08:53:31.018365 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.018524 kubelet[2894]: E0304 08:53:31.018509 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.018524 kubelet[2894]: W0304 08:53:31.018520 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.018580 kubelet[2894]: E0304 08:53:31.018528 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.018672 kubelet[2894]: E0304 08:53:31.018659 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.018672 kubelet[2894]: W0304 08:53:31.018670 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.018732 kubelet[2894]: E0304 08:53:31.018680 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.019085 kubelet[2894]: E0304 08:53:31.019065 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.019085 kubelet[2894]: W0304 08:53:31.019076 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.019085 kubelet[2894]: E0304 08:53:31.019085 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.019232 kubelet[2894]: E0304 08:53:31.019219 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.019232 kubelet[2894]: W0304 08:53:31.019231 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.019292 kubelet[2894]: E0304 08:53:31.019238 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.027656 kubelet[2894]: E0304 08:53:31.027630 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.027656 kubelet[2894]: W0304 08:53:31.027653 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.027729 kubelet[2894]: E0304 08:53:31.027667 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.027862 kubelet[2894]: E0304 08:53:31.027848 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.027862 kubelet[2894]: W0304 08:53:31.027860 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.027920 kubelet[2894]: E0304 08:53:31.027869 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.028105 kubelet[2894]: E0304 08:53:31.028089 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.028105 kubelet[2894]: W0304 08:53:31.028103 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.028159 kubelet[2894]: E0304 08:53:31.028112 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.028363 kubelet[2894]: E0304 08:53:31.028331 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.028363 kubelet[2894]: W0304 08:53:31.028341 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.028363 kubelet[2894]: E0304 08:53:31.028350 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.029370 kubelet[2894]: E0304 08:53:31.029262 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.029370 kubelet[2894]: W0304 08:53:31.029279 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.029370 kubelet[2894]: E0304 08:53:31.029301 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.029533 kubelet[2894]: E0304 08:53:31.029517 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.029533 kubelet[2894]: W0304 08:53:31.029531 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.029637 kubelet[2894]: E0304 08:53:31.029543 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.029719 kubelet[2894]: E0304 08:53:31.029705 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.029719 kubelet[2894]: W0304 08:53:31.029716 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.029769 kubelet[2894]: E0304 08:53:31.029725 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.029969 kubelet[2894]: E0304 08:53:31.029898 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.029969 kubelet[2894]: W0304 08:53:31.029912 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.029969 kubelet[2894]: E0304 08:53:31.029922 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.030531 kubelet[2894]: E0304 08:53:31.030509 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.030531 kubelet[2894]: W0304 08:53:31.030531 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.030758 kubelet[2894]: E0304 08:53:31.030545 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.030885 kubelet[2894]: E0304 08:53:31.030870 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.031055 kubelet[2894]: W0304 08:53:31.030942 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.031055 kubelet[2894]: E0304 08:53:31.030959 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.031202 kubelet[2894]: E0304 08:53:31.031188 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.031256 kubelet[2894]: W0304 08:53:31.031244 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.031319 kubelet[2894]: E0304 08:53:31.031307 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.031631 kubelet[2894]: E0304 08:53:31.031527 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.031631 kubelet[2894]: W0304 08:53:31.031540 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.031631 kubelet[2894]: E0304 08:53:31.031551 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.031789 kubelet[2894]: E0304 08:53:31.031776 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.031851 kubelet[2894]: W0304 08:53:31.031838 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.031911 kubelet[2894]: E0304 08:53:31.031899 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.032160 kubelet[2894]: E0304 08:53:31.032146 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.032266 kubelet[2894]: W0304 08:53:31.032252 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.032318 kubelet[2894]: E0304 08:53:31.032307 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.032650 kubelet[2894]: E0304 08:53:31.032617 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.032650 kubelet[2894]: W0304 08:53:31.032632 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.032732 kubelet[2894]: E0304 08:53:31.032644 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.032916 kubelet[2894]: E0304 08:53:31.032877 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.032916 kubelet[2894]: W0304 08:53:31.032890 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.032916 kubelet[2894]: E0304 08:53:31.032900 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.033267 kubelet[2894]: E0304 08:53:31.033249 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.033337 kubelet[2894]: W0304 08:53:31.033325 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.033388 kubelet[2894]: E0304 08:53:31.033377 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 08:53:31.033655 kubelet[2894]: E0304 08:53:31.033608 2894 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 08:53:31.033655 kubelet[2894]: W0304 08:53:31.033621 2894 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 08:53:31.033655 kubelet[2894]: E0304 08:53:31.033632 2894 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 08:53:31.710610 containerd[1618]: time="2026-03-04T08:53:31.710565707Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:31.711631 containerd[1618]: time="2026-03-04T08:53:31.711602112Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 4 08:53:31.712599 containerd[1618]: time="2026-03-04T08:53:31.712571757Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:31.715080 containerd[1618]: time="2026-03-04T08:53:31.714806808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:31.715837 containerd[1618]: time="2026-03-04T08:53:31.715714693Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.52150363s" Mar 4 08:53:31.715837 containerd[1618]: time="2026-03-04T08:53:31.715750893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 4 08:53:31.720199 containerd[1618]: time="2026-03-04T08:53:31.719456992Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 4 08:53:31.726210 containerd[1618]: time="2026-03-04T08:53:31.726161586Z" level=info msg="Container f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:31.735068 containerd[1618]: time="2026-03-04T08:53:31.735029591Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440\"" Mar 4 08:53:31.736528 containerd[1618]: time="2026-03-04T08:53:31.736493998Z" level=info msg="StartContainer for \"f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440\"" Mar 4 08:53:31.738063 containerd[1618]: time="2026-03-04T08:53:31.738032566Z" level=info msg="connecting to shim f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440" address="unix:///run/containerd/s/c49f814ee75629357f171d4754921aabd946d2b59a0161221a58773d79a9c12a" protocol=ttrpc version=3 Mar 4 08:53:31.756341 systemd[1]: Started cri-containerd-f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440.scope - libcontainer container f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440. Mar 4 08:53:31.836682 containerd[1618]: time="2026-03-04T08:53:31.836603786Z" level=info msg="StartContainer for \"f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440\" returns successfully" Mar 4 08:53:31.848742 systemd[1]: cri-containerd-f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440.scope: Deactivated successfully. 
Mar 4 08:53:31.852178 containerd[1618]: time="2026-03-04T08:53:31.851543501Z" level=info msg="received container exit event container_id:\"f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440\" id:\"f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440\" pid:3556 exited_at:{seconds:1772614411 nanos:851197300}" Mar 4 08:53:31.870773 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f4cb138e3407425ce1c93ab6e9686953c3be488494aaf85035c6711646014440-rootfs.mount: Deactivated successfully. Mar 4 08:53:31.962571 kubelet[2894]: I0304 08:53:31.962454 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 08:53:32.901199 kubelet[2894]: E0304 08:53:32.900677 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:34.899899 kubelet[2894]: E0304 08:53:34.899842 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:35.971824 containerd[1618]: time="2026-03-04T08:53:35.971779220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 4 08:53:36.900020 kubelet[2894]: E0304 08:53:36.899939 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:38.901204 kubelet[2894]: E0304 
08:53:38.900343 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:40.900102 kubelet[2894]: E0304 08:53:40.900036 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:42.134049 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2207864455.mount: Deactivated successfully. Mar 4 08:53:42.159261 containerd[1618]: time="2026-03-04T08:53:42.159199415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:42.160091 containerd[1618]: time="2026-03-04T08:53:42.160046859Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 4 08:53:42.161186 containerd[1618]: time="2026-03-04T08:53:42.161145065Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:42.163380 containerd[1618]: time="2026-03-04T08:53:42.163344636Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:42.163902 containerd[1618]: time="2026-03-04T08:53:42.163864959Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id 
\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.192044098s" Mar 4 08:53:42.163902 containerd[1618]: time="2026-03-04T08:53:42.163898279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 4 08:53:42.168442 containerd[1618]: time="2026-03-04T08:53:42.168410622Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 4 08:53:42.177766 containerd[1618]: time="2026-03-04T08:53:42.177724509Z" level=info msg="Container 45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:42.191691 containerd[1618]: time="2026-03-04T08:53:42.191624899Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af\"" Mar 4 08:53:42.192414 containerd[1618]: time="2026-03-04T08:53:42.192389983Z" level=info msg="StartContainer for \"45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af\"" Mar 4 08:53:42.194788 containerd[1618]: time="2026-03-04T08:53:42.194758235Z" level=info msg="connecting to shim 45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af" address="unix:///run/containerd/s/c49f814ee75629357f171d4754921aabd946d2b59a0161221a58773d79a9c12a" protocol=ttrpc version=3 Mar 4 08:53:42.220397 systemd[1]: Started cri-containerd-45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af.scope - libcontainer container 
45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af. Mar 4 08:53:42.299906 containerd[1618]: time="2026-03-04T08:53:42.299866688Z" level=info msg="StartContainer for \"45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af\" returns successfully" Mar 4 08:53:42.403320 systemd[1]: cri-containerd-45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af.scope: Deactivated successfully. Mar 4 08:53:42.404818 containerd[1618]: time="2026-03-04T08:53:42.404695659Z" level=info msg="received container exit event container_id:\"45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af\" id:\"45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af\" pid:3614 exited_at:{seconds:1772614422 nanos:404410858}" Mar 4 08:53:42.900029 kubelet[2894]: E0304 08:53:42.899972 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:43.132569 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-45f740f362f199830c04b06b62e4366ace3d46b314c1820912479fde46a239af-rootfs.mount: Deactivated successfully. 
Mar 4 08:53:43.136384 kubelet[2894]: I0304 08:53:43.136089 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 08:53:44.900227 kubelet[2894]: E0304 08:53:44.900111 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:45.993770 containerd[1618]: time="2026-03-04T08:53:45.993731127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 4 08:53:46.900683 kubelet[2894]: E0304 08:53:46.900280 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:48.924945 kubelet[2894]: E0304 08:53:48.924541 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:50.074200 containerd[1618]: time="2026-03-04T08:53:50.073860962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:50.074673 containerd[1618]: time="2026-03-04T08:53:50.074641846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 4 08:53:50.075491 containerd[1618]: time="2026-03-04T08:53:50.075463131Z" level=info msg="ImageCreate event 
name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:50.077483 containerd[1618]: time="2026-03-04T08:53:50.077426741Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:50.078242 containerd[1618]: time="2026-03-04T08:53:50.078209385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 4.084057535s" Mar 4 08:53:50.078289 containerd[1618]: time="2026-03-04T08:53:50.078246665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 4 08:53:50.082001 containerd[1618]: time="2026-03-04T08:53:50.081958284Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 4 08:53:50.090733 containerd[1618]: time="2026-03-04T08:53:50.089591282Z" level=info msg="Container 09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:50.100265 containerd[1618]: time="2026-03-04T08:53:50.100223816Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94\"" Mar 4 08:53:50.101005 containerd[1618]: time="2026-03-04T08:53:50.100963260Z" level=info 
msg="StartContainer for \"09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94\"" Mar 4 08:53:50.102433 containerd[1618]: time="2026-03-04T08:53:50.102408387Z" level=info msg="connecting to shim 09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94" address="unix:///run/containerd/s/c49f814ee75629357f171d4754921aabd946d2b59a0161221a58773d79a9c12a" protocol=ttrpc version=3 Mar 4 08:53:50.122323 systemd[1]: Started cri-containerd-09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94.scope - libcontainer container 09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94. Mar 4 08:53:50.197438 containerd[1618]: time="2026-03-04T08:53:50.197400469Z" level=info msg="StartContainer for \"09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94\" returns successfully" Mar 4 08:53:50.902309 kubelet[2894]: E0304 08:53:50.902239 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:51.512251 containerd[1618]: time="2026-03-04T08:53:51.512147211Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 4 08:53:51.514236 systemd[1]: cri-containerd-09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94.scope: Deactivated successfully. Mar 4 08:53:51.514579 systemd[1]: cri-containerd-09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94.scope: Consumed 486ms CPU time, 192.5M memory peak, 171.3M written to disk. 
Mar 4 08:53:51.516000 containerd[1618]: time="2026-03-04T08:53:51.515970790Z" level=info msg="received container exit event container_id:\"09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94\" id:\"09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94\" pid:3676 exited_at:{seconds:1772614431 nanos:515699549}" Mar 4 08:53:51.534634 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-09252e45b39068ead87eec8b7f2d875c1e747f367ff61ce3c82d7dbf9f2d6d94-rootfs.mount: Deactivated successfully. Mar 4 08:53:51.571588 kubelet[2894]: I0304 08:53:51.560806 2894 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 4 08:53:52.835624 systemd[1]: Created slice kubepods-burstable-pode9b093ad_c025_4929_90c7_213ec5cd3786.slice - libcontainer container kubepods-burstable-pode9b093ad_c025_4929_90c7_213ec5cd3786.slice. Mar 4 08:53:52.898414 kubelet[2894]: I0304 08:53:52.898358 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9b093ad-c025-4929-90c7-213ec5cd3786-config-volume\") pod \"coredns-674b8bbfcf-qsgm2\" (UID: \"e9b093ad-c025-4929-90c7-213ec5cd3786\") " pod="kube-system/coredns-674b8bbfcf-qsgm2" Mar 4 08:53:52.898414 kubelet[2894]: I0304 08:53:52.898405 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8lm\" (UniqueName: \"kubernetes.io/projected/e9b093ad-c025-4929-90c7-213ec5cd3786-kube-api-access-4k8lm\") pod \"coredns-674b8bbfcf-qsgm2\" (UID: \"e9b093ad-c025-4929-90c7-213ec5cd3786\") " pod="kube-system/coredns-674b8bbfcf-qsgm2" Mar 4 08:53:52.994385 systemd[1]: Created slice kubepods-burstable-pod819d499d_6035_4023_9c4c_4eddc1d8fdc0.slice - libcontainer container kubepods-burstable-pod819d499d_6035_4023_9c4c_4eddc1d8fdc0.slice. 
Mar 4 08:53:53.009595 systemd[1]: Created slice kubepods-besteffort-pod30192d89_02dd_40c8_8733_2e00d914bf2b.slice - libcontainer container kubepods-besteffort-pod30192d89_02dd_40c8_8733_2e00d914bf2b.slice. Mar 4 08:53:53.031601 containerd[1618]: time="2026-03-04T08:53:53.031200549Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 4 08:53:53.031638 systemd[1]: Created slice kubepods-besteffort-podf128d3e7_c371_4b02_b4e6_418d24da20f2.slice - libcontainer container kubepods-besteffort-podf128d3e7_c371_4b02_b4e6_418d24da20f2.slice. Mar 4 08:53:53.038276 systemd[1]: Created slice kubepods-besteffort-pod53822f89_b17a_4d51_b036_a94ebb7fdda1.slice - libcontainer container kubepods-besteffort-pod53822f89_b17a_4d51_b036_a94ebb7fdda1.slice. Mar 4 08:53:53.046399 systemd[1]: Created slice kubepods-besteffort-podb7abfa40_2bcb_4a6d_9983_687f32ff2ccc.slice - libcontainer container kubepods-besteffort-podb7abfa40_2bcb_4a6d_9983_687f32ff2ccc.slice. Mar 4 08:53:53.047778 containerd[1618]: time="2026-03-04T08:53:53.047104869Z" level=info msg="Container 11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:53.058141 systemd[1]: Created slice kubepods-besteffort-pod92dfd492_72b0_45e2_bd43_99c3f711766e.slice - libcontainer container kubepods-besteffort-pod92dfd492_72b0_45e2_bd43_99c3f711766e.slice. 
Mar 4 08:53:53.062813 containerd[1618]: time="2026-03-04T08:53:53.062770709Z" level=info msg="CreateContainer within sandbox \"b27bcb7ccfa60bd9c8d6a3c4554609b6ba1b2d209035cb65c8473f29f2294ffe\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24\"" Mar 4 08:53:53.063449 containerd[1618]: time="2026-03-04T08:53:53.063395152Z" level=info msg="StartContainer for \"11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24\"" Mar 4 08:53:53.065544 systemd[1]: Created slice kubepods-besteffort-podac86e3f5_5d3b_4978_8a8a_3c851693c8e7.slice - libcontainer container kubepods-besteffort-podac86e3f5_5d3b_4978_8a8a_3c851693c8e7.slice. Mar 4 08:53:53.068811 containerd[1618]: time="2026-03-04T08:53:53.068758299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hcf7z,Uid:ac86e3f5-5d3b-4978-8a8a-3c851693c8e7,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.069582 containerd[1618]: time="2026-03-04T08:53:53.069541343Z" level=info msg="connecting to shim 11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24" address="unix:///run/containerd/s/c49f814ee75629357f171d4754921aabd946d2b59a0161221a58773d79a9c12a" protocol=ttrpc version=3 Mar 4 08:53:53.100240 kubelet[2894]: I0304 08:53:53.099561 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/92dfd492-72b0-45e2-bd43-99c3f711766e-calico-apiserver-certs\") pod \"calico-apiserver-b6d5c8957-vwh62\" (UID: \"92dfd492-72b0-45e2-bd43-99c3f711766e\") " pod="calico-system/calico-apiserver-b6d5c8957-vwh62" Mar 4 08:53:53.100240 kubelet[2894]: I0304 08:53:53.099606 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lph2l\" (UniqueName: \"kubernetes.io/projected/92dfd492-72b0-45e2-bd43-99c3f711766e-kube-api-access-lph2l\") pod 
\"calico-apiserver-b6d5c8957-vwh62\" (UID: \"92dfd492-72b0-45e2-bd43-99c3f711766e\") " pod="calico-system/calico-apiserver-b6d5c8957-vwh62" Mar 4 08:53:53.100240 kubelet[2894]: I0304 08:53:53.099631 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5lw\" (UniqueName: \"kubernetes.io/projected/53822f89-b17a-4d51-b036-a94ebb7fdda1-kube-api-access-gg5lw\") pod \"goldmane-5b85766d88-qsxfw\" (UID: \"53822f89-b17a-4d51-b036-a94ebb7fdda1\") " pod="calico-system/goldmane-5b85766d88-qsxfw" Mar 4 08:53:53.100240 kubelet[2894]: I0304 08:53:53.099670 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/819d499d-6035-4023-9c4c-4eddc1d8fdc0-config-volume\") pod \"coredns-674b8bbfcf-6qmwn\" (UID: \"819d499d-6035-4023-9c4c-4eddc1d8fdc0\") " pod="kube-system/coredns-674b8bbfcf-6qmwn" Mar 4 08:53:53.100240 kubelet[2894]: I0304 08:53:53.099691 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f128d3e7-c371-4b02-b4e6-418d24da20f2-calico-apiserver-certs\") pod \"calico-apiserver-b6d5c8957-hzhws\" (UID: \"f128d3e7-c371-4b02-b4e6-418d24da20f2\") " pod="calico-system/calico-apiserver-b6d5c8957-hzhws" Mar 4 08:53:53.100439 kubelet[2894]: I0304 08:53:53.099749 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-ca-bundle\") pod \"whisker-74d466c56f-v7njq\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " pod="calico-system/whisker-74d466c56f-v7njq" Mar 4 08:53:53.100439 kubelet[2894]: I0304 08:53:53.099767 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6vn\" (UniqueName: 
\"kubernetes.io/projected/30192d89-02dd-40c8-8733-2e00d914bf2b-kube-api-access-rg6vn\") pod \"calico-kube-controllers-7c9fcf4458-gm722\" (UID: \"30192d89-02dd-40c8-8733-2e00d914bf2b\") " pod="calico-system/calico-kube-controllers-7c9fcf4458-gm722" Mar 4 08:53:53.100439 kubelet[2894]: I0304 08:53:53.099883 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8sf7\" (UniqueName: \"kubernetes.io/projected/819d499d-6035-4023-9c4c-4eddc1d8fdc0-kube-api-access-h8sf7\") pod \"coredns-674b8bbfcf-6qmwn\" (UID: \"819d499d-6035-4023-9c4c-4eddc1d8fdc0\") " pod="kube-system/coredns-674b8bbfcf-6qmwn" Mar 4 08:53:53.100439 kubelet[2894]: I0304 08:53:53.099901 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6krdp\" (UniqueName: \"kubernetes.io/projected/f128d3e7-c371-4b02-b4e6-418d24da20f2-kube-api-access-6krdp\") pod \"calico-apiserver-b6d5c8957-hzhws\" (UID: \"f128d3e7-c371-4b02-b4e6-418d24da20f2\") " pod="calico-system/calico-apiserver-b6d5c8957-hzhws" Mar 4 08:53:53.100439 kubelet[2894]: I0304 08:53:53.099919 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-backend-key-pair\") pod \"whisker-74d466c56f-v7njq\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " pod="calico-system/whisker-74d466c56f-v7njq" Mar 4 08:53:53.100540 kubelet[2894]: I0304 08:53:53.099943 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkt9m\" (UniqueName: \"kubernetes.io/projected/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-kube-api-access-bkt9m\") pod \"whisker-74d466c56f-v7njq\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " pod="calico-system/whisker-74d466c56f-v7njq" Mar 4 08:53:53.100540 kubelet[2894]: I0304 08:53:53.099959 2894 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30192d89-02dd-40c8-8733-2e00d914bf2b-tigera-ca-bundle\") pod \"calico-kube-controllers-7c9fcf4458-gm722\" (UID: \"30192d89-02dd-40c8-8733-2e00d914bf2b\") " pod="calico-system/calico-kube-controllers-7c9fcf4458-gm722" Mar 4 08:53:53.100540 kubelet[2894]: I0304 08:53:53.099986 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-nginx-config\") pod \"whisker-74d466c56f-v7njq\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " pod="calico-system/whisker-74d466c56f-v7njq" Mar 4 08:53:53.100540 kubelet[2894]: I0304 08:53:53.100005 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53822f89-b17a-4d51-b036-a94ebb7fdda1-config\") pod \"goldmane-5b85766d88-qsxfw\" (UID: \"53822f89-b17a-4d51-b036-a94ebb7fdda1\") " pod="calico-system/goldmane-5b85766d88-qsxfw" Mar 4 08:53:53.100540 kubelet[2894]: I0304 08:53:53.100024 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53822f89-b17a-4d51-b036-a94ebb7fdda1-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-qsxfw\" (UID: \"53822f89-b17a-4d51-b036-a94ebb7fdda1\") " pod="calico-system/goldmane-5b85766d88-qsxfw" Mar 4 08:53:53.100647 kubelet[2894]: I0304 08:53:53.100041 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/53822f89-b17a-4d51-b036-a94ebb7fdda1-goldmane-key-pair\") pod \"goldmane-5b85766d88-qsxfw\" (UID: \"53822f89-b17a-4d51-b036-a94ebb7fdda1\") " pod="calico-system/goldmane-5b85766d88-qsxfw" Mar 4 08:53:53.101375 
systemd[1]: Started cri-containerd-11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24.scope - libcontainer container 11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24. Mar 4 08:53:53.139409 containerd[1618]: time="2026-03-04T08:53:53.139366017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsgm2,Uid:e9b093ad-c025-4929-90c7-213ec5cd3786,Namespace:kube-system,Attempt:0,}" Mar 4 08:53:53.145777 containerd[1618]: time="2026-03-04T08:53:53.145699889Z" level=error msg="Failed to destroy network for sandbox \"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.147416 containerd[1618]: time="2026-03-04T08:53:53.147350017Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hcf7z,Uid:ac86e3f5-5d3b-4978-8a8a-3c851693c8e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.147937 kubelet[2894]: E0304 08:53:53.147797 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.147937 kubelet[2894]: E0304 08:53:53.147885 2894 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:53.147937 kubelet[2894]: E0304 08:53:53.147906 2894 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hcf7z" Mar 4 08:53:53.148561 kubelet[2894]: E0304 08:53:53.148497 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hcf7z_calico-system(ac86e3f5-5d3b-4978-8a8a-3c851693c8e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hcf7z_calico-system(ac86e3f5-5d3b-4978-8a8a-3c851693c8e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3414f2f8442a65a9996e783f2712638b03f68352086cd6c9aaeb49d56edb23b7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hcf7z" podUID="ac86e3f5-5d3b-4978-8a8a-3c851693c8e7" Mar 4 08:53:53.170954 containerd[1618]: time="2026-03-04T08:53:53.170875617Z" level=info msg="StartContainer for \"11a22ba4d215fd670179ee6578a6cd53e9ba7f0e79f33b0c0feb3ad83890ba24\" returns successfully" Mar 4 08:53:53.194327 containerd[1618]: time="2026-03-04T08:53:53.194283535Z" level=error msg="Failed to destroy network for sandbox 
\"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.195506 containerd[1618]: time="2026-03-04T08:53:53.195476221Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsgm2,Uid:e9b093ad-c025-4929-90c7-213ec5cd3786,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.195923 kubelet[2894]: E0304 08:53:53.195693 2894 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 08:53:53.195923 kubelet[2894]: E0304 08:53:53.195745 2894 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qsgm2" Mar 4 08:53:53.195923 kubelet[2894]: E0304 08:53:53.195791 2894 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qsgm2" Mar 4 08:53:53.196055 kubelet[2894]: E0304 08:53:53.195832 2894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-qsgm2_kube-system(e9b093ad-c025-4929-90c7-213ec5cd3786)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qsgm2_kube-system(e9b093ad-c025-4929-90c7-213ec5cd3786)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"586b56ee58ed8138bd13e371027595d184e8a8b4c7a875c88c86e00691e87e3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qsgm2" podUID="e9b093ad-c025-4929-90c7-213ec5cd3786" Mar 4 08:53:53.295716 containerd[1618]: time="2026-03-04T08:53:53.295661089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d466c56f-v7njq,Uid:b7abfa40-2bcb-4a6d-9983-687f32ff2ccc,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.305231 containerd[1618]: time="2026-03-04T08:53:53.305179417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6qmwn,Uid:819d499d-6035-4023-9c4c-4eddc1d8fdc0,Namespace:kube-system,Attempt:0,}" Mar 4 08:53:53.317533 containerd[1618]: time="2026-03-04T08:53:53.317489440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c9fcf4458-gm722,Uid:30192d89-02dd-40c8-8733-2e00d914bf2b,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.338595 containerd[1618]: time="2026-03-04T08:53:53.338541866Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-b6d5c8957-hzhws,Uid:f128d3e7-c371-4b02-b4e6-418d24da20f2,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.342132 containerd[1618]: time="2026-03-04T08:53:53.342094844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-qsxfw,Uid:53822f89-b17a-4d51-b036-a94ebb7fdda1,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.363512 containerd[1618]: time="2026-03-04T08:53:53.362548148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b6d5c8957-vwh62,Uid:92dfd492-72b0-45e2-bd43-99c3f711766e,Namespace:calico-system,Attempt:0,}" Mar 4 08:53:53.589087 systemd-networkd[1441]: cali3eaa5c9803f: Link UP Mar 4 08:53:53.589580 systemd-networkd[1441]: cali3eaa5c9803f: Gained carrier Mar 4 08:53:53.603012 containerd[1618]: 2026-03-04 08:53:53.334 [ERROR][3827] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 4 08:53:53.603012 containerd[1618]: 2026-03-04 08:53:53.367 [INFO][3827] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0 whisker-74d466c56f- calico-system b7abfa40-2bcb-4a6d-9983-687f32ff2ccc 870 0 2026-03-04 08:53:30 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:74d466c56f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 whisker-74d466c56f-v7njq eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3eaa5c9803f [] [] }} ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-" Mar 4 08:53:53.603012 
containerd[1618]: 2026-03-04 08:53:53.367 [INFO][3827] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" Mar 4 08:53:53.603012 containerd[1618]: 2026-03-04 08:53:53.457 [INFO][3915] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.474 [INFO][3915] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002eb260), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"whisker-74d466c56f-v7njq", "timestamp":"2026-03-04 08:53:53.45761223 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000356f20)} Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.474 [INFO][3915] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.475 [INFO][3915] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.475 [INFO][3915] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9'
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.540 [INFO][3915] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.545 [INFO][3915] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.559 [INFO][3915] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.561 [INFO][3915] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603255 containerd[1618]: 2026-03-04 08:53:53.564 [INFO][3915] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.564 [INFO][3915] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.566 [INFO][3915] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.570 [INFO][3915] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.575 [INFO][3915] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.126.65/26] block=192.168.126.64/26 handle="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.576 [INFO][3915] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.65/26] handle="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.576 [INFO][3915] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 08:53:53.603444 containerd[1618]: 2026-03-04 08:53:53.576 [INFO][3915] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.65/26] IPv6=[] ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:53:53.603566 containerd[1618]: 2026-03-04 08:53:53.578 [INFO][3827] cni-plugin/k8s.go 418: Populated endpoint ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0", GenerateName:"whisker-74d466c56f-", Namespace:"calico-system", SelfLink:"", UID:"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74d466c56f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"whisker-74d466c56f-v7njq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3eaa5c9803f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.603566 containerd[1618]: 2026-03-04 08:53:53.578 [INFO][3827] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.65/32] ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:53:53.603630 containerd[1618]: 2026-03-04 08:53:53.578 [INFO][3827] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3eaa5c9803f ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:53:53.603630 containerd[1618]: 2026-03-04 08:53:53.590 [INFO][3827] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:53:53.603668 containerd[1618]: 2026-03-04 08:53:53.591 [INFO][3827] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0", GenerateName:"whisker-74d466c56f-", Namespace:"calico-system", SelfLink:"", UID:"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74d466c56f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea", Pod:"whisker-74d466c56f-v7njq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3eaa5c9803f", MAC:"5a:55:6e:c5:4f:cd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.603760 containerd[1618]: 2026-03-04 08:53:53.601 [INFO][3827] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Namespace="calico-system" Pod="whisker-74d466c56f-v7njq" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:53:53.624569 containerd[1618]: time="2026-03-04T08:53:53.624451835Z" level=info msg="connecting to shim 47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" address="unix:///run/containerd/s/f9874770ea0577a984b91fbdc9e4f5f6e48b12437fa50ff7e861b7c9cff2799c" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:53.644369 systemd[1]: Started cri-containerd-47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea.scope - libcontainer container 47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea.
Mar 4 08:53:53.680032 containerd[1618]: time="2026-03-04T08:53:53.679988556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d466c56f-v7njq,Uid:b7abfa40-2bcb-4a6d-9983-687f32ff2ccc,Namespace:calico-system,Attempt:0,} returns sandbox id \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\""
Mar 4 08:53:53.683313 containerd[1618]: time="2026-03-04T08:53:53.683134652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\""
Mar 4 08:53:53.686897 systemd-networkd[1441]: calif9d5df435ff: Link UP
Mar 4 08:53:53.687814 systemd-networkd[1441]: calif9d5df435ff: Gained carrier
Mar 4 08:53:53.703017 containerd[1618]: 2026-03-04 08:53:53.360 [ERROR][3852] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu"
Mar 4 08:53:53.703017 containerd[1618]: 2026-03-04 08:53:53.381 [INFO][3852] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0 calico-kube-controllers-7c9fcf4458- calico-system 30192d89-02dd-40c8-8733-2e00d914bf2b 853 0 2026-03-04 08:53:27 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c9fcf4458 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 calico-kube-controllers-7c9fcf4458-gm722 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif9d5df435ff [] [] }} ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-"
Mar 4 08:53:53.703017 containerd[1618]: 2026-03-04 08:53:53.381 [INFO][3852] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703017 containerd[1618]: 2026-03-04 08:53:53.456 [INFO][3923] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" HandleID="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.475 [INFO][3923] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" HandleID="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003e4a70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"calico-kube-controllers-7c9fcf4458-gm722", "timestamp":"2026-03-04 08:53:53.456361423 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002a4c60)}
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.475 [INFO][3923] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.576 [INFO][3923] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.576 [INFO][3923] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9'
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.641 [INFO][3923] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.646 [INFO][3923] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.660 [INFO][3923] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.663 [INFO][3923] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703258 containerd[1618]: 2026-03-04 08:53:53.665 [INFO][3923] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.665 [INFO][3923] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.667 [INFO][3923] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.671 [INFO][3923] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.678 [INFO][3923] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.126.66/26] block=192.168.126.64/26 handle="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.678 [INFO][3923] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.66/26] handle="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.678 [INFO][3923] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 08:53:53.703431 containerd[1618]: 2026-03-04 08:53:53.678 [INFO][3923] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.66/26] IPv6=[] ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" HandleID="k8s-pod-network.d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703557 containerd[1618]: 2026-03-04 08:53:53.681 [INFO][3852] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0", GenerateName:"calico-kube-controllers-7c9fcf4458-", Namespace:"calico-system", SelfLink:"", UID:"30192d89-02dd-40c8-8733-2e00d914bf2b", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c9fcf4458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"calico-kube-controllers-7c9fcf4458-gm722", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.126.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif9d5df435ff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.703604 containerd[1618]: 2026-03-04 08:53:53.681 [INFO][3852] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.66/32] ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703604 containerd[1618]: 2026-03-04 08:53:53.681 [INFO][3852] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9d5df435ff ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703604 containerd[1618]: 2026-03-04 08:53:53.688 [INFO][3852] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.703663 containerd[1618]: 2026-03-04 08:53:53.688 [INFO][3852] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0", GenerateName:"calico-kube-controllers-7c9fcf4458-", Namespace:"calico-system", SelfLink:"", UID:"30192d89-02dd-40c8-8733-2e00d914bf2b", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c9fcf4458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be", Pod:"calico-kube-controllers-7c9fcf4458-gm722", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.126.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif9d5df435ff", MAC:"be:86:af:42:e7:92", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.703709 containerd[1618]: 2026-03-04 08:53:53.701 [INFO][3852] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" Namespace="calico-system" Pod="calico-kube-controllers-7c9fcf4458-gm722" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--kube--controllers--7c9fcf4458--gm722-eth0"
Mar 4 08:53:53.719728 containerd[1618]: time="2026-03-04T08:53:53.719652277Z" level=info msg="connecting to shim d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be" address="unix:///run/containerd/s/f3b75ae6071af3a9d36a3dd4fd3d0388361d8095f1b032c5fa46e58c2b229061" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:53.742368 systemd[1]: Started cri-containerd-d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be.scope - libcontainer container d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be.
Mar 4 08:53:53.783303 systemd-networkd[1441]: calib43e61894fd: Link UP
Mar 4 08:53:53.784584 containerd[1618]: time="2026-03-04T08:53:53.783626962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c9fcf4458-gm722,Uid:30192d89-02dd-40c8-8733-2e00d914bf2b,Namespace:calico-system,Attempt:0,} returns sandbox id \"d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be\""
Mar 4 08:53:53.783545 systemd-networkd[1441]: calib43e61894fd: Gained carrier
Mar 4 08:53:53.799048 containerd[1618]: 2026-03-04 08:53:53.339 [ERROR][3839] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu"
Mar 4 08:53:53.799048 containerd[1618]: 2026-03-04 08:53:53.379 [INFO][3839] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0 coredns-674b8bbfcf- kube-system 819d499d-6035-4023-9c4c-4eddc1d8fdc0 852 0 2026-03-04 08:53:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 coredns-674b8bbfcf-6qmwn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib43e61894fd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-"
Mar 4 08:53:53.799048 containerd[1618]: 2026-03-04 08:53:53.379 [INFO][3839] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799048 containerd[1618]: 2026-03-04 08:53:53.459 [INFO][3921] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" HandleID="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.474 [INFO][3921] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" HandleID="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004762c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"coredns-674b8bbfcf-6qmwn", "timestamp":"2026-03-04 08:53:53.459010517 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400038a000)}
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.474 [INFO][3921] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.678 [INFO][3921] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.679 [INFO][3921] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9'
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.741 [INFO][3921] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.748 [INFO][3921] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.760 [INFO][3921] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.762 [INFO][3921] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799284 containerd[1618]: 2026-03-04 08:53:53.764 [INFO][3921] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.764 [INFO][3921] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.766 [INFO][3921] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.772 [INFO][3921] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.777 [INFO][3921] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.126.67/26] block=192.168.126.64/26 handle="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.778 [INFO][3921] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.67/26] handle="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" host="ci-4459-2-4-2-039fb286b9"
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.778 [INFO][3921] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 08:53:53.799448 containerd[1618]: 2026-03-04 08:53:53.778 [INFO][3921] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.67/26] IPv6=[] ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" HandleID="k8s-pod-network.c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.779 [INFO][3839] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"819d499d-6035-4023-9c4c-4eddc1d8fdc0", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"coredns-674b8bbfcf-6qmwn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib43e61894fd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.779 [INFO][3839] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.67/32] ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.779 [INFO][3839] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib43e61894fd ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.784 [INFO][3839] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.784 [INFO][3839] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"819d499d-6035-4023-9c4c-4eddc1d8fdc0", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302", Pod:"coredns-674b8bbfcf-6qmwn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib43e61894fd", MAC:"e2:cf:b9:f8:d9:57", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 08:53:53.799578 containerd[1618]: 2026-03-04 08:53:53.797 [INFO][3839] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" Namespace="kube-system" Pod="coredns-674b8bbfcf-6qmwn" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--6qmwn-eth0"
Mar 4 08:53:53.816261 containerd[1618]: time="2026-03-04T08:53:53.815699044Z" level=info msg="connecting to shim c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302" address="unix:///run/containerd/s/2a1610b6e2b9602796a4ff261158775bd908bb0f498fca1e67bca6b015993401" namespace=k8s.io protocol=ttrpc version=3
Mar 4 08:53:53.834337 systemd[1]: Started cri-containerd-c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302.scope - libcontainer container c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302.
Mar 4 08:53:53.868359 containerd[1618]: time="2026-03-04T08:53:53.868318551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6qmwn,Uid:819d499d-6035-4023-9c4c-4eddc1d8fdc0,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302\"" Mar 4 08:53:53.873463 containerd[1618]: time="2026-03-04T08:53:53.873414057Z" level=info msg="CreateContainer within sandbox \"c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 4 08:53:53.885278 systemd-networkd[1441]: calib0a43366f8c: Link UP Mar 4 08:53:53.886460 systemd-networkd[1441]: calib0a43366f8c: Gained carrier Mar 4 08:53:53.890419 containerd[1618]: time="2026-03-04T08:53:53.890381583Z" level=info msg="Container fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:53.899728 containerd[1618]: time="2026-03-04T08:53:53.899682910Z" level=info msg="CreateContainer within sandbox \"c4693943187a60fada36d3419c71b0075b6c4d1a71173d73d84a09fb16cc8302\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361\"" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.393 [ERROR][3884] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.420 [INFO][3884] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0 goldmane-5b85766d88- calico-system 53822f89-b17a-4d51-b036-a94ebb7fdda1 856 0 2026-03-04 08:53:26 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 goldmane-5b85766d88-qsxfw eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib0a43366f8c [] [] }} ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.420 [INFO][3884] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.471 [INFO][3940] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" HandleID="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Workload="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.494 [INFO][3940] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" HandleID="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Workload="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f0df0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"goldmane-5b85766d88-qsxfw", "timestamp":"2026-03-04 08:53:53.471391819 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000240580)} Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.494 [INFO][3940] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.778 [INFO][3940] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.778 [INFO][3940] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.842 [INFO][3940] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.847 [INFO][3940] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.861 [INFO][3940] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.864 [INFO][3940] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.866 [INFO][3940] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.866 [INFO][3940] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.867 [INFO][3940] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268 Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.872 [INFO][3940] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.878 [INFO][3940] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.126.68/26] block=192.168.126.64/26 handle="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.879 [INFO][3940] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.68/26] handle="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.879 [INFO][3940] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 4 08:53:53.899947 containerd[1618]: 2026-03-04 08:53:53.879 [INFO][3940] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.68/26] IPv6=[] ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" HandleID="k8s-pod-network.3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Workload="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.883 [INFO][3884] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"53822f89-b17a-4d51-b036-a94ebb7fdda1", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"goldmane-5b85766d88-qsxfw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.126.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"calib0a43366f8c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.883 [INFO][3884] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.68/32] ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.883 [INFO][3884] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0a43366f8c ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.886 [INFO][3884] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.887 [INFO][3884] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", 
UID:"53822f89-b17a-4d51-b036-a94ebb7fdda1", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268", Pod:"goldmane-5b85766d88-qsxfw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.126.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib0a43366f8c", MAC:"a2:26:f6:c4:9a:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:53.902458 containerd[1618]: 2026-03-04 08:53:53.897 [INFO][3884] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" Namespace="calico-system" Pod="goldmane-5b85766d88-qsxfw" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-goldmane--5b85766d88--qsxfw-eth0" Mar 4 08:53:53.902458 containerd[1618]: time="2026-03-04T08:53:53.901135317Z" level=info msg="StartContainer for \"fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361\"" Mar 4 08:53:53.903837 containerd[1618]: time="2026-03-04T08:53:53.903807091Z" level=info msg="connecting to shim fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361" 
address="unix:///run/containerd/s/2a1610b6e2b9602796a4ff261158775bd908bb0f498fca1e67bca6b015993401" protocol=ttrpc version=3 Mar 4 08:53:53.921033 containerd[1618]: time="2026-03-04T08:53:53.920928377Z" level=info msg="connecting to shim 3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268" address="unix:///run/containerd/s/a5d0c1a9536ad90526cc86b2df4a45f782a04b287774f36a81f8eab39671fa58" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:53.922335 systemd[1]: Started cri-containerd-fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361.scope - libcontainer container fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361. Mar 4 08:53:53.950419 systemd[1]: Started cri-containerd-3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268.scope - libcontainer container 3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268. Mar 4 08:53:53.962495 containerd[1618]: time="2026-03-04T08:53:53.962457708Z" level=info msg="StartContainer for \"fac15f1e17ccb8b15dfe31f96d49f46cf3468400e059e0e56908b470ae81d361\" returns successfully" Mar 4 08:53:53.989765 systemd-networkd[1441]: cali0f83317bb72: Link UP Mar 4 08:53:53.990639 systemd-networkd[1441]: cali0f83317bb72: Gained carrier Mar 4 08:53:54.002284 containerd[1618]: time="2026-03-04T08:53:54.002220349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-qsxfw,Uid:53822f89-b17a-4d51-b036-a94ebb7fdda1,Namespace:calico-system,Attempt:0,} returns sandbox id \"3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268\"" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.394 [ERROR][3870] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.421 [INFO][3870] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0 calico-apiserver-b6d5c8957- calico-system f128d3e7-c371-4b02-b4e6-418d24da20f2 855 0 2026-03-04 08:53:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b6d5c8957 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 calico-apiserver-b6d5c8957-hzhws eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali0f83317bb72 [] [] }} ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.421 [INFO][3870] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.482 [INFO][3945] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" HandleID="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.496 [INFO][3945] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" HandleID="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" 
Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004db90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"calico-apiserver-b6d5c8957-hzhws", "timestamp":"2026-03-04 08:53:53.482522916 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400019a6e0)} Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.498 [INFO][3945] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.879 [INFO][3945] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.880 [INFO][3945] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.942 [INFO][3945] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.951 [INFO][3945] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.963 [INFO][3945] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.966 [INFO][3945] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.970 [INFO][3945] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 
host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.970 [INFO][3945] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.972 [INFO][3945] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3 Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.977 [INFO][3945] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3945] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.126.69/26] block=192.168.126.64/26 handle="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3945] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.69/26] handle="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3945] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 4 08:53:54.009200 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3945] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.69/26] IPv6=[] ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" HandleID="k8s-pod-network.58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:53.988 [INFO][3870] cni-plugin/k8s.go 418: Populated endpoint ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0", GenerateName:"calico-apiserver-b6d5c8957-", Namespace:"calico-system", SelfLink:"", UID:"f128d3e7-c371-4b02-b4e6-418d24da20f2", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b6d5c8957", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"calico-apiserver-b6d5c8957-hzhws", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.126.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f83317bb72", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:53.988 [INFO][3870] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.69/32] ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:53.988 [INFO][3870] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f83317bb72 ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:53.991 [INFO][3870] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:53.992 [INFO][3870] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0", GenerateName:"calico-apiserver-b6d5c8957-", Namespace:"calico-system", SelfLink:"", UID:"f128d3e7-c371-4b02-b4e6-418d24da20f2", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b6d5c8957", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3", Pod:"calico-apiserver-b6d5c8957-hzhws", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f83317bb72", MAC:"52:95:41:db:01:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:54.009720 containerd[1618]: 2026-03-04 08:53:54.005 [INFO][3870] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-hzhws" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--hzhws-eth0" Mar 4 08:53:54.033208 containerd[1618]: time="2026-03-04T08:53:54.030847214Z" level=info msg="connecting to 
shim 58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3" address="unix:///run/containerd/s/4dd47ff01651dcf659cb4697ef97a5a787cb3d24d2719985a1ec4029e37923e9" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:54.035684 systemd[1]: run-netns-cni\x2d92237475\x2d582a\x2dfc99\x2d4287\x2d9e8b39995bf3.mount: Deactivated successfully. Mar 4 08:53:54.069852 kubelet[2894]: I0304 08:53:54.069784 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7svcw" podStartSLOduration=4.552972232 podStartE2EDuration="27.069765452s" podCreationTimestamp="2026-03-04 08:53:27 +0000 UTC" firstStartedPulling="2026-03-04 08:53:27.562002447 +0000 UTC m=+18.932331259" lastFinishedPulling="2026-03-04 08:53:50.078795667 +0000 UTC m=+41.449124479" observedRunningTime="2026-03-04 08:53:54.068634726 +0000 UTC m=+45.438963538" watchObservedRunningTime="2026-03-04 08:53:54.069765452 +0000 UTC m=+45.440094264" Mar 4 08:53:54.084896 kubelet[2894]: I0304 08:53:54.084383 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-6qmwn" podStartSLOduration=40.084365566 podStartE2EDuration="40.084365566s" podCreationTimestamp="2026-03-04 08:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:53:54.082655357 +0000 UTC m=+45.452984169" watchObservedRunningTime="2026-03-04 08:53:54.084365566 +0000 UTC m=+45.454694338" Mar 4 08:53:54.085649 systemd[1]: Started cri-containerd-58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3.scope - libcontainer container 58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3. 
Mar 4 08:53:54.105905 systemd-networkd[1441]: calif47dd769628: Link UP Mar 4 08:53:54.106740 systemd-networkd[1441]: calif47dd769628: Gained carrier Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.444 [ERROR][3902] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.470 [INFO][3902] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0 calico-apiserver-b6d5c8957- calico-system 92dfd492-72b0-45e2-bd43-99c3f711766e 857 0 2026-03-04 08:53:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b6d5c8957 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 calico-apiserver-b6d5c8957-vwh62 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calif47dd769628 [] [] }} ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.470 [INFO][3902] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.508 [INFO][3964] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" HandleID="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.517 [INFO][3964] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" HandleID="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002eb7c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"calico-apiserver-b6d5c8957-vwh62", "timestamp":"2026-03-04 08:53:53.508439407 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000484f20)} Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.518 [INFO][3964] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3964] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:53.985 [INFO][3964] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.055 [INFO][3964] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.062 [INFO][3964] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.070 [INFO][3964] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.074 [INFO][3964] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.077 [INFO][3964] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.077 [INFO][3964] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.081 [INFO][3964] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39 Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.089 [INFO][3964] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.098 [INFO][3964] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.126.70/26] block=192.168.126.64/26 handle="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.098 [INFO][3964] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.70/26] handle="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.098 [INFO][3964] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 08:53:54.123197 containerd[1618]: 2026-03-04 08:53:54.098 [INFO][3964] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.70/26] IPv6=[] ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" HandleID="k8s-pod-network.76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Workload="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.101 [INFO][3902] cni-plugin/k8s.go 418: Populated endpoint ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0", GenerateName:"calico-apiserver-b6d5c8957-", Namespace:"calico-system", SelfLink:"", UID:"92dfd492-72b0-45e2-bd43-99c3f711766e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b6d5c8957", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"calico-apiserver-b6d5c8957-vwh62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif47dd769628", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.101 [INFO][3902] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.70/32] ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.101 [INFO][3902] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif47dd769628 ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.108 [INFO][3902] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" 
WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.109 [INFO][3902] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0", GenerateName:"calico-apiserver-b6d5c8957-", Namespace:"calico-system", SelfLink:"", UID:"92dfd492-72b0-45e2-bd43-99c3f711766e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b6d5c8957", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39", Pod:"calico-apiserver-b6d5c8957-vwh62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif47dd769628", MAC:"ba:b8:55:95:73:7a", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:53:54.123727 containerd[1618]: 2026-03-04 08:53:54.120 [INFO][3902] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" Namespace="calico-system" Pod="calico-apiserver-b6d5c8957-vwh62" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-calico--apiserver--b6d5c8957--vwh62-eth0" Mar 4 08:53:54.145232 containerd[1618]: time="2026-03-04T08:53:54.145065113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b6d5c8957-hzhws,Uid:f128d3e7-c371-4b02-b4e6-418d24da20f2,Namespace:calico-system,Attempt:0,} returns sandbox id \"58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3\"" Mar 4 08:53:54.148304 containerd[1618]: time="2026-03-04T08:53:54.146775122Z" level=info msg="connecting to shim 76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39" address="unix:///run/containerd/s/2dfa1267e59db0eec3ce0ffc0ef6ef6fc2f5e83f3a65f20fb8f0b5e5265bb786" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:53:54.178513 systemd[1]: Started cri-containerd-76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39.scope - libcontainer container 76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39. 
Mar 4 08:53:54.209004 containerd[1618]: time="2026-03-04T08:53:54.208964957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b6d5c8957-vwh62,Uid:92dfd492-72b0-45e2-bd43-99c3f711766e,Namespace:calico-system,Attempt:0,} returns sandbox id \"76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39\"" Mar 4 08:53:54.635338 systemd-networkd[1441]: cali3eaa5c9803f: Gained IPv6LL Mar 4 08:53:55.146323 systemd-networkd[1441]: cali0f83317bb72: Gained IPv6LL Mar 4 08:53:55.163931 systemd-networkd[1441]: vxlan.calico: Link UP Mar 4 08:53:55.164116 systemd-networkd[1441]: vxlan.calico: Gained carrier Mar 4 08:53:55.274295 systemd-networkd[1441]: calif9d5df435ff: Gained IPv6LL Mar 4 08:53:55.456369 containerd[1618]: time="2026-03-04T08:53:55.456303478Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:55.458078 containerd[1618]: time="2026-03-04T08:53:55.458051927Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 4 08:53:55.459343 containerd[1618]: time="2026-03-04T08:53:55.459297253Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:55.461763 containerd[1618]: time="2026-03-04T08:53:55.461718905Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:55.462951 containerd[1618]: time="2026-03-04T08:53:55.462924791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.779736938s" Mar 4 08:53:55.463028 containerd[1618]: time="2026-03-04T08:53:55.462956112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 4 08:53:55.463936 containerd[1618]: time="2026-03-04T08:53:55.463915876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 4 08:53:55.468290 containerd[1618]: time="2026-03-04T08:53:55.468245658Z" level=info msg="CreateContainer within sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 4 08:53:55.479351 containerd[1618]: time="2026-03-04T08:53:55.479297274Z" level=info msg="Container 7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:55.486554 containerd[1618]: time="2026-03-04T08:53:55.486446711Z" level=info msg="CreateContainer within sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\"" Mar 4 08:53:55.487309 containerd[1618]: time="2026-03-04T08:53:55.487025914Z" level=info msg="StartContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\"" Mar 4 08:53:55.488082 containerd[1618]: time="2026-03-04T08:53:55.488049999Z" level=info msg="connecting to shim 7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5" address="unix:///run/containerd/s/f9874770ea0577a984b91fbdc9e4f5f6e48b12437fa50ff7e861b7c9cff2799c" protocol=ttrpc version=3 Mar 4 08:53:55.509396 systemd[1]: Started cri-containerd-7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5.scope - libcontainer container 
7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5. Mar 4 08:53:55.530280 systemd-networkd[1441]: calib0a43366f8c: Gained IPv6LL Mar 4 08:53:55.546458 containerd[1618]: time="2026-03-04T08:53:55.546310694Z" level=info msg="StartContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" returns successfully" Mar 4 08:53:55.658436 systemd-networkd[1441]: calib43e61894fd: Gained IPv6LL Mar 4 08:53:55.914343 systemd-networkd[1441]: calif47dd769628: Gained IPv6LL Mar 4 08:53:56.746428 systemd-networkd[1441]: vxlan.calico: Gained IPv6LL Mar 4 08:53:58.817054 containerd[1618]: time="2026-03-04T08:53:58.816988028Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:58.818400 containerd[1618]: time="2026-03-04T08:53:58.818369955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 4 08:53:58.819598 containerd[1618]: time="2026-03-04T08:53:58.819552801Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:58.822076 containerd[1618]: time="2026-03-04T08:53:58.822030654Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:53:58.823065 containerd[1618]: time="2026-03-04T08:53:58.823017659Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size 
\"50587448\" in 3.358989422s" Mar 4 08:53:58.823065 containerd[1618]: time="2026-03-04T08:53:58.823059019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 4 08:53:58.824232 containerd[1618]: time="2026-03-04T08:53:58.824171584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 4 08:53:58.834767 containerd[1618]: time="2026-03-04T08:53:58.834728318Z" level=info msg="CreateContainer within sandbox \"d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 4 08:53:58.844487 containerd[1618]: time="2026-03-04T08:53:58.844272406Z" level=info msg="Container f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:53:58.851814 containerd[1618]: time="2026-03-04T08:53:58.851760244Z" level=info msg="CreateContainer within sandbox \"d54c70376378f7e7ec44ae5efdd503dcea0f015a3ad9d18391613f4dd61eb7be\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae\"" Mar 4 08:53:58.852434 containerd[1618]: time="2026-03-04T08:53:58.852359247Z" level=info msg="StartContainer for \"f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae\"" Mar 4 08:53:58.854050 containerd[1618]: time="2026-03-04T08:53:58.853634774Z" level=info msg="connecting to shim f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae" address="unix:///run/containerd/s/f3b75ae6071af3a9d36a3dd4fd3d0388361d8095f1b032c5fa46e58c2b229061" protocol=ttrpc version=3 Mar 4 08:53:58.873526 systemd[1]: Started cri-containerd-f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae.scope - libcontainer container f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae. 
Mar 4 08:53:58.913113 containerd[1618]: time="2026-03-04T08:53:58.913059115Z" level=info msg="StartContainer for \"f6c245b6a1ea4668b6145375b0ca156556339749f157a0d71674f227f86e8aae\" returns successfully" Mar 4 08:53:59.084388 kubelet[2894]: I0304 08:53:59.084239 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7c9fcf4458-gm722" podStartSLOduration=27.045317887 podStartE2EDuration="32.084220662s" podCreationTimestamp="2026-03-04 08:53:27 +0000 UTC" firstStartedPulling="2026-03-04 08:53:53.785075249 +0000 UTC m=+45.155404061" lastFinishedPulling="2026-03-04 08:53:58.823978024 +0000 UTC m=+50.194306836" observedRunningTime="2026-03-04 08:53:59.083548259 +0000 UTC m=+50.453877071" watchObservedRunningTime="2026-03-04 08:53:59.084220662 +0000 UTC m=+50.454549474" Mar 4 08:54:01.115854 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3132881339.mount: Deactivated successfully. Mar 4 08:54:01.526283 containerd[1618]: time="2026-03-04T08:54:01.526240797Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:01.527604 containerd[1618]: time="2026-03-04T08:54:01.526935961Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 4 08:54:01.527849 containerd[1618]: time="2026-03-04T08:54:01.527823565Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:01.530993 containerd[1618]: time="2026-03-04T08:54:01.530910581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:01.531812 containerd[1618]: time="2026-03-04T08:54:01.531785905Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.70758448s" Mar 4 08:54:01.531896 containerd[1618]: time="2026-03-04T08:54:01.531883186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 4 08:54:01.534044 containerd[1618]: time="2026-03-04T08:54:01.533988036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 4 08:54:01.537709 containerd[1618]: time="2026-03-04T08:54:01.537676935Z" level=info msg="CreateContainer within sandbox \"3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 4 08:54:01.547979 containerd[1618]: time="2026-03-04T08:54:01.547125783Z" level=info msg="Container 09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:01.557079 containerd[1618]: time="2026-03-04T08:54:01.557026433Z" level=info msg="CreateContainer within sandbox \"3cf994ed39994ba8b499c0ad87b5a50d682d980f1a116d0073f3a825ea2b4268\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c\"" Mar 4 08:54:01.557644 containerd[1618]: time="2026-03-04T08:54:01.557567316Z" level=info msg="StartContainer for \"09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c\"" Mar 4 08:54:01.558815 containerd[1618]: time="2026-03-04T08:54:01.558731442Z" level=info msg="connecting to shim 09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c" 
address="unix:///run/containerd/s/a5d0c1a9536ad90526cc86b2df4a45f782a04b287774f36a81f8eab39671fa58" protocol=ttrpc version=3 Mar 4 08:54:01.584499 systemd[1]: Started cri-containerd-09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c.scope - libcontainer container 09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c. Mar 4 08:54:01.622435 containerd[1618]: time="2026-03-04T08:54:01.622400564Z" level=info msg="StartContainer for \"09566401aa241b61d969740d55a83da93b93ebc1c4c83c479629dea8d9a1ad8c\" returns successfully" Mar 4 08:54:02.092514 kubelet[2894]: I0304 08:54:02.092433 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-qsxfw" podStartSLOduration=28.562944991 podStartE2EDuration="36.092348026s" podCreationTimestamp="2026-03-04 08:53:26 +0000 UTC" firstStartedPulling="2026-03-04 08:53:54.003456796 +0000 UTC m=+45.373785608" lastFinishedPulling="2026-03-04 08:54:01.532859871 +0000 UTC m=+52.903188643" observedRunningTime="2026-03-04 08:54:02.090366656 +0000 UTC m=+53.460695468" watchObservedRunningTime="2026-03-04 08:54:02.092348026 +0000 UTC m=+53.462676798" Mar 4 08:54:04.552570 containerd[1618]: time="2026-03-04T08:54:04.552471333Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:04.553370 containerd[1618]: time="2026-03-04T08:54:04.553326737Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 4 08:54:04.554426 containerd[1618]: time="2026-03-04T08:54:04.554395902Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:04.557263 containerd[1618]: time="2026-03-04T08:54:04.557217717Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:04.558140 containerd[1618]: time="2026-03-04T08:54:04.558016041Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.023983724s" Mar 4 08:54:04.558140 containerd[1618]: time="2026-03-04T08:54:04.558050321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 4 08:54:04.560031 containerd[1618]: time="2026-03-04T08:54:04.559132366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 4 08:54:04.564102 containerd[1618]: time="2026-03-04T08:54:04.564073831Z" level=info msg="CreateContainer within sandbox \"58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 4 08:54:04.571488 containerd[1618]: time="2026-03-04T08:54:04.571452629Z" level=info msg="Container 9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:04.582158 containerd[1618]: time="2026-03-04T08:54:04.582117963Z" level=info msg="CreateContainer within sandbox \"58099b1c8d51b07492eafdcd21c82fd8cb506edb8fa869b6825c051734d2c3b3\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de\"" Mar 4 08:54:04.582921 containerd[1618]: time="2026-03-04T08:54:04.582879407Z" level=info msg="StartContainer for 
\"9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de\"" Mar 4 08:54:04.584202 containerd[1618]: time="2026-03-04T08:54:04.584151493Z" level=info msg="connecting to shim 9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de" address="unix:///run/containerd/s/4dd47ff01651dcf659cb4697ef97a5a787cb3d24d2719985a1ec4029e37923e9" protocol=ttrpc version=3 Mar 4 08:54:04.602330 systemd[1]: Started cri-containerd-9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de.scope - libcontainer container 9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de. Mar 4 08:54:04.635996 containerd[1618]: time="2026-03-04T08:54:04.635958396Z" level=info msg="StartContainer for \"9c02cf42a3d7b82f979dc2e3da3db26fb1dae2128e32101a35816dca0ebe84de\" returns successfully" Mar 4 08:54:04.901252 containerd[1618]: time="2026-03-04T08:54:04.900800018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hcf7z,Uid:ac86e3f5-5d3b-4978-8a8a-3c851693c8e7,Namespace:calico-system,Attempt:0,}" Mar 4 08:54:04.933017 containerd[1618]: time="2026-03-04T08:54:04.932518258Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:04.933300 containerd[1618]: time="2026-03-04T08:54:04.933020381Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 4 08:54:04.937148 containerd[1618]: time="2026-03-04T08:54:04.937079362Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 377.04159ms" Mar 4 08:54:04.937148 containerd[1618]: time="2026-03-04T08:54:04.937111362Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 4 08:54:04.938496 containerd[1618]: time="2026-03-04T08:54:04.938463129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 4 08:54:04.945760 containerd[1618]: time="2026-03-04T08:54:04.945719725Z" level=info msg="CreateContainer within sandbox \"76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 4 08:54:04.960332 containerd[1618]: time="2026-03-04T08:54:04.960211239Z" level=info msg="Container c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:04.974073 containerd[1618]: time="2026-03-04T08:54:04.973927028Z" level=info msg="CreateContainer within sandbox \"76c8b4358322bcd10c6852f503157c47af3188c9c76022c1b3f79c72169baf39\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae\"" Mar 4 08:54:04.975893 containerd[1618]: time="2026-03-04T08:54:04.974706712Z" level=info msg="StartContainer for \"c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae\"" Mar 4 08:54:04.976886 containerd[1618]: time="2026-03-04T08:54:04.976856483Z" level=info msg="connecting to shim c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae" address="unix:///run/containerd/s/2dfa1267e59db0eec3ce0ffc0ef6ef6fc2f5e83f3a65f20fb8f0b5e5265bb786" protocol=ttrpc version=3 Mar 4 08:54:05.001355 systemd[1]: Started cri-containerd-c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae.scope - libcontainer container c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae. 
Mar 4 08:54:05.041948 systemd-networkd[1441]: calif5905188029: Link UP Mar 4 08:54:05.042216 systemd-networkd[1441]: calif5905188029: Gained carrier Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.953 [INFO][4892] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0 csi-node-driver- calico-system ac86e3f5-5d3b-4978-8a8a-3c851693c8e7 701 0 2026-03-04 08:53:27 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 csi-node-driver-hcf7z eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif5905188029 [] [] }} ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.953 [INFO][4892] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.986 [INFO][4905] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" HandleID="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Workload="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.999 [INFO][4905] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" HandleID="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Workload="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002eba10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"csi-node-driver-hcf7z", "timestamp":"2026-03-04 08:54:04.986612653 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40005fcf20)} Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.999 [INFO][4905] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.999 [INFO][4905] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:04.999 [INFO][4905] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.002 [INFO][4905] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.010 [INFO][4905] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.016 [INFO][4905] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.019 [INFO][4905] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.021 [INFO][4905] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.021 [INFO][4905] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.023 [INFO][4905] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961 Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.029 [INFO][4905] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.037 [INFO][4905] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.126.71/26] block=192.168.126.64/26 handle="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.037 [INFO][4905] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.71/26] handle="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.037 [INFO][4905] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 08:54:05.063151 containerd[1618]: 2026-03-04 08:54:05.037 [INFO][4905] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.71/26] IPv6=[] ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" HandleID="k8s-pod-network.1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Workload="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 08:54:05.040 [INFO][4892] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"csi-node-driver-hcf7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.126.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif5905188029", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 08:54:05.040 [INFO][4892] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.71/32] ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 08:54:05.040 [INFO][4892] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif5905188029 ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 08:54:05.045 [INFO][4892] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 
08:54:05.046 [INFO][4892] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ac86e3f5-5d3b-4978-8a8a-3c851693c8e7", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961", Pod:"csi-node-driver-hcf7z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.126.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif5905188029", MAC:"ae:a4:e2:f2:02:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:05.063972 containerd[1618]: 2026-03-04 08:54:05.059 
[INFO][4892] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" Namespace="calico-system" Pod="csi-node-driver-hcf7z" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-csi--node--driver--hcf7z-eth0" Mar 4 08:54:05.063972 containerd[1618]: time="2026-03-04T08:54:05.063260201Z" level=info msg="StartContainer for \"c951c53b9aabb73c9c634d0b373d6944210c256f2d935362fd1a44ebec7b7bae\" returns successfully" Mar 4 08:54:05.099886 containerd[1618]: time="2026-03-04T08:54:05.099309344Z" level=info msg="connecting to shim 1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961" address="unix:///run/containerd/s/691952b35b35c5db4bda4256bd55b7eb8810df9530729d316bdaeda2bb1f9301" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:54:05.108195 kubelet[2894]: I0304 08:54:05.107458 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-b6d5c8957-vwh62" podStartSLOduration=29.379145419 podStartE2EDuration="40.107439745s" podCreationTimestamp="2026-03-04 08:53:25 +0000 UTC" firstStartedPulling="2026-03-04 08:53:54.210012642 +0000 UTC m=+45.580341454" lastFinishedPulling="2026-03-04 08:54:04.938306968 +0000 UTC m=+56.308635780" observedRunningTime="2026-03-04 08:54:05.102806481 +0000 UTC m=+56.473135293" watchObservedRunningTime="2026-03-04 08:54:05.107439745 +0000 UTC m=+56.477768557" Mar 4 08:54:05.119432 kubelet[2894]: I0304 08:54:05.118270 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-b6d5c8957-hzhws" podStartSLOduration=29.706195357 podStartE2EDuration="40.11824984s" podCreationTimestamp="2026-03-04 08:53:25 +0000 UTC" firstStartedPulling="2026-03-04 08:53:54.146931083 +0000 UTC m=+45.517259895" lastFinishedPulling="2026-03-04 08:54:04.558985566 +0000 UTC m=+55.929314378" observedRunningTime="2026-03-04 08:54:05.116851193 +0000 UTC m=+56.487180005" 
watchObservedRunningTime="2026-03-04 08:54:05.11824984 +0000 UTC m=+56.488578692" Mar 4 08:54:05.141384 systemd[1]: Started cri-containerd-1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961.scope - libcontainer container 1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961. Mar 4 08:54:05.173226 containerd[1618]: time="2026-03-04T08:54:05.173185638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hcf7z,Uid:ac86e3f5-5d3b-4978-8a8a-3c851693c8e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961\"" Mar 4 08:54:05.576878 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount421336288.mount: Deactivated successfully. Mar 4 08:54:06.099246 kubelet[2894]: I0304 08:54:06.099215 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 08:54:06.099630 kubelet[2894]: I0304 08:54:06.099606 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 08:54:06.474362 systemd-networkd[1441]: calif5905188029: Gained IPv6LL Mar 4 08:54:06.853681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount678873294.mount: Deactivated successfully. 
Mar 4 08:54:06.868569 containerd[1618]: time="2026-03-04T08:54:06.868489429Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:06.869834 containerd[1618]: time="2026-03-04T08:54:06.869765235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 4 08:54:06.870500 containerd[1618]: time="2026-03-04T08:54:06.870460439Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:06.872989 containerd[1618]: time="2026-03-04T08:54:06.872934931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 08:54:06.873416 containerd[1618]: time="2026-03-04T08:54:06.873380334Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.934882005s" Mar 4 08:54:06.873458 containerd[1618]: time="2026-03-04T08:54:06.873415574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 4 08:54:06.874834 containerd[1618]: time="2026-03-04T08:54:06.874801301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 4 08:54:06.877680 containerd[1618]: time="2026-03-04T08:54:06.877634275Z" level=info msg="CreateContainer within sandbox 
\"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 4 08:54:06.885196 containerd[1618]: time="2026-03-04T08:54:06.884644151Z" level=info msg="Container 00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:06.892412 containerd[1618]: time="2026-03-04T08:54:06.892366830Z" level=info msg="CreateContainer within sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\"" Mar 4 08:54:06.893806 containerd[1618]: time="2026-03-04T08:54:06.893773357Z" level=info msg="StartContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\"" Mar 4 08:54:06.895050 containerd[1618]: time="2026-03-04T08:54:06.895011243Z" level=info msg="connecting to shim 00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3" address="unix:///run/containerd/s/f9874770ea0577a984b91fbdc9e4f5f6e48b12437fa50ff7e861b7c9cff2799c" protocol=ttrpc version=3 Mar 4 08:54:06.923536 systemd[1]: Started cri-containerd-00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3.scope - libcontainer container 00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3. 
Mar 4 08:54:06.960888 containerd[1618]: time="2026-03-04T08:54:06.960851297Z" level=info msg="StartContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" returns successfully" Mar 4 08:54:07.106024 containerd[1618]: time="2026-03-04T08:54:07.105907632Z" level=info msg="StopContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" with timeout 30 (s)" Mar 4 08:54:07.106135 containerd[1618]: time="2026-03-04T08:54:07.106077873Z" level=info msg="StopContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" with timeout 30 (s)" Mar 4 08:54:07.106742 containerd[1618]: time="2026-03-04T08:54:07.106558795Z" level=info msg="Stop container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" with signal terminated" Mar 4 08:54:07.106742 containerd[1618]: time="2026-03-04T08:54:07.106585315Z" level=info msg="Stop container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" with signal terminated" Mar 4 08:54:07.115126 systemd[1]: cri-containerd-00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3.scope: Deactivated successfully. 
Mar 4 08:54:07.119729 kubelet[2894]: I0304 08:54:07.119551 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-74d466c56f-v7njq" podStartSLOduration=23.927951853 podStartE2EDuration="37.119533221s" podCreationTimestamp="2026-03-04 08:53:30 +0000 UTC" firstStartedPulling="2026-03-04 08:53:53.682539489 +0000 UTC m=+45.052868301" lastFinishedPulling="2026-03-04 08:54:06.874120857 +0000 UTC m=+58.244449669" observedRunningTime="2026-03-04 08:54:07.11929614 +0000 UTC m=+58.489624952" watchObservedRunningTime="2026-03-04 08:54:07.119533221 +0000 UTC m=+58.489862033" Mar 4 08:54:07.120862 containerd[1618]: time="2026-03-04T08:54:07.120801307Z" level=info msg="received container exit event container_id:\"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" id:\"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" pid:5060 exit_status:2 exited_at:{seconds:1772614447 nanos:119780182}" Mar 4 08:54:07.131144 systemd[1]: cri-containerd-7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5.scope: Deactivated successfully. Mar 4 08:54:07.132151 containerd[1618]: time="2026-03-04T08:54:07.132101365Z" level=info msg="received container exit event container_id:\"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" id:\"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" pid:4628 exited_at:{seconds:1772614447 nanos:131852083}" Mar 4 08:54:07.147800 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3-rootfs.mount: Deactivated successfully. Mar 4 08:54:07.695452 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5-rootfs.mount: Deactivated successfully. 
Mar 4 08:54:07.900272 containerd[1618]: time="2026-03-04T08:54:07.900074056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsgm2,Uid:e9b093ad-c025-4929-90c7-213ec5cd3786,Namespace:kube-system,Attempt:0,}" Mar 4 08:54:10.524140 containerd[1618]: time="2026-03-04T08:54:10.524080514Z" level=info msg="StopContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" returns successfully" Mar 4 08:54:10.525827 containerd[1618]: time="2026-03-04T08:54:10.525454201Z" level=info msg="StopContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" returns successfully" Mar 4 08:54:10.526048 containerd[1618]: time="2026-03-04T08:54:10.525934763Z" level=info msg="StopPodSandbox for \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\"" Mar 4 08:54:10.526151 containerd[1618]: time="2026-03-04T08:54:10.526124444Z" level=info msg="Container to stop \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 4 08:54:10.526360 containerd[1618]: time="2026-03-04T08:54:10.526176044Z" level=info msg="Container to stop \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 4 08:54:10.533121 systemd[1]: cri-containerd-47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea.scope: Deactivated successfully. 
Mar 4 08:54:10.537568 containerd[1618]: time="2026-03-04T08:54:10.537531542Z" level=info msg="received sandbox exit event container_id:\"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" id:\"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" exit_status:137 exited_at:{seconds:1772614450 nanos:537297701}" monitor_name=podsandbox Mar 4 08:54:10.562109 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea-rootfs.mount: Deactivated successfully. Mar 4 08:54:10.565838 containerd[1618]: time="2026-03-04T08:54:10.565802165Z" level=info msg="shim disconnected" id=47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea namespace=k8s.io Mar 4 08:54:10.565972 containerd[1618]: time="2026-03-04T08:54:10.565896845Z" level=warning msg="cleaning up after shim disconnected" id=47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea namespace=k8s.io Mar 4 08:54:10.565972 containerd[1618]: time="2026-03-04T08:54:10.565929326Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 4 08:54:10.578718 containerd[1618]: time="2026-03-04T08:54:10.578669790Z" level=info msg="received sandbox container exit event sandbox_id:\"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" exit_status:137 exited_at:{seconds:1772614450 nanos:537297701}" monitor_name=criService Mar 4 08:54:10.581386 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea-shm.mount: Deactivated successfully. 
Mar 4 08:54:10.636545 systemd-networkd[1441]: cali3eaa5c9803f: Link DOWN Mar 4 08:54:10.636551 systemd-networkd[1441]: cali3eaa5c9803f: Lost carrier Mar 4 08:54:10.660605 systemd-networkd[1441]: cali7f450fe3056: Link UP Mar 4 08:54:10.663049 systemd-networkd[1441]: cali7f450fe3056: Gained carrier Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.567 [INFO][5156] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0 coredns-674b8bbfcf- kube-system e9b093ad-c025-4929-90c7-213ec5cd3786 851 0 2026-03-04 08:53:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 coredns-674b8bbfcf-qsgm2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7f450fe3056 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.568 [INFO][5156] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.596 [INFO][5202] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" HandleID="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 
4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.606 [INFO][5202] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" HandleID="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f83b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"coredns-674b8bbfcf-qsgm2", "timestamp":"2026-03-04 08:54:10.596097598 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000544f20)} Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.607 [INFO][5202] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.607 [INFO][5202] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.607 [INFO][5202] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.611 [INFO][5202] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.615 [INFO][5202] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.619 [INFO][5202] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.621 [INFO][5202] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.625 [INFO][5202] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.625 [INFO][5202] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.627 [INFO][5202] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.634 [INFO][5202] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.652 [INFO][5202] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.126.72/26] block=192.168.126.64/26 handle="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.652 [INFO][5202] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.72/26] handle="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.652 [INFO][5202] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 08:54:10.679353 containerd[1618]: 2026-03-04 08:54:10.652 [INFO][5202] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.72/26] IPv6=[] ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" HandleID="k8s-pod-network.89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Workload="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.656 [INFO][5156] cni-plugin/k8s.go 418: Populated endpoint ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e9b093ad-c025-4929-90c7-213ec5cd3786", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"coredns-674b8bbfcf-qsgm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f450fe3056", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.656 [INFO][5156] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.72/32] ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.656 [INFO][5156] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f450fe3056 ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.663 [INFO][5156] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.667 [INFO][5156] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e9b093ad-c025-4929-90c7-213ec5cd3786", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 53, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef", Pod:"coredns-674b8bbfcf-qsgm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f450fe3056", 
MAC:"56:7d:1d:59:52:2b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:10.680563 containerd[1618]: 2026-03-04 08:54:10.677 [INFO][5156] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsgm2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-coredns--674b8bbfcf--qsgm2-eth0" Mar 4 08:54:10.710741 containerd[1618]: time="2026-03-04T08:54:10.710594539Z" level=info msg="connecting to shim 89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef" address="unix:///run/containerd/s/0f15df7031e0b8fe8ecb2cd4dc68994327a497c2696e7f071de7484ec2ca8358" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.635 [INFO][5220] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.635 [INFO][5220] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" iface="eth0" netns="/var/run/netns/cni-70f0b75f-1174-2147-931c-6e18c8423154" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.635 [INFO][5220] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" iface="eth0" netns="/var/run/netns/cni-70f0b75f-1174-2147-931c-6e18c8423154" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.654 [INFO][5220] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" after=17.988251ms iface="eth0" netns="/var/run/netns/cni-70f0b75f-1174-2147-931c-6e18c8423154" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.655 [INFO][5220] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.655 [INFO][5220] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.688 [INFO][5238] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.688 [INFO][5238] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.688 [INFO][5238] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.730 [INFO][5238] ipam/ipam_plugin.go 516: Released address using handleID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.730 [INFO][5238] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0" Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.732 [INFO][5238] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 08:54:10.736899 containerd[1618]: 2026-03-04 08:54:10.734 [INFO][5220] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Mar 4 08:54:10.737440 containerd[1618]: time="2026-03-04T08:54:10.737392194Z" level=info msg="TearDown network for sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" successfully" Mar 4 08:54:10.737440 containerd[1618]: time="2026-03-04T08:54:10.737424315Z" level=info msg="StopPodSandbox for \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" returns successfully" Mar 4 08:54:10.739365 systemd[1]: Started cri-containerd-89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef.scope - libcontainer container 89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef. 
Mar 4 08:54:10.783599 containerd[1618]: time="2026-03-04T08:54:10.783485028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsgm2,Uid:e9b093ad-c025-4929-90c7-213ec5cd3786,Namespace:kube-system,Attempt:0,} returns sandbox id \"89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef\"" Mar 4 08:54:10.789840 containerd[1618]: time="2026-03-04T08:54:10.789801820Z" level=info msg="CreateContainer within sandbox \"89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 4 08:54:10.800838 containerd[1618]: time="2026-03-04T08:54:10.800787876Z" level=info msg="Container cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:10.806792 containerd[1618]: time="2026-03-04T08:54:10.806753786Z" level=info msg="CreateContainer within sandbox \"89803c3573ea7ca01bfd86a174b65003aeda1c5338ced23e8abecc0f2eaeadef\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626\"" Mar 4 08:54:10.807408 containerd[1618]: time="2026-03-04T08:54:10.807364669Z" level=info msg="StartContainer for \"cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626\"" Mar 4 08:54:10.808572 containerd[1618]: time="2026-03-04T08:54:10.808537915Z" level=info msg="connecting to shim cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626" address="unix:///run/containerd/s/0f15df7031e0b8fe8ecb2cd4dc68994327a497c2696e7f071de7484ec2ca8358" protocol=ttrpc version=3 Mar 4 08:54:10.824945 kubelet[2894]: I0304 08:54:10.824867 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-nginx-config\") pod \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " Mar 4 08:54:10.824945 kubelet[2894]: 
I0304 08:54:10.824923 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkt9m\" (UniqueName: \"kubernetes.io/projected/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-kube-api-access-bkt9m\") pod \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " Mar 4 08:54:10.824945 kubelet[2894]: I0304 08:54:10.824956 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-ca-bundle\") pod \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " Mar 4 08:54:10.825540 kubelet[2894]: I0304 08:54:10.825004 2894 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-backend-key-pair\") pod \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\" (UID: \"b7abfa40-2bcb-4a6d-9983-687f32ff2ccc\") " Mar 4 08:54:10.825744 kubelet[2894]: I0304 08:54:10.825696 2894 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc" (UID: "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 4 08:54:10.825952 kubelet[2894]: I0304 08:54:10.825924 2894 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc" (UID: "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc"). InnerVolumeSpecName "nginx-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 4 08:54:10.827343 systemd[1]: Started cri-containerd-cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626.scope - libcontainer container cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626. Mar 4 08:54:10.827627 kubelet[2894]: I0304 08:54:10.827591 2894 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc" (UID: "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 4 08:54:10.828582 kubelet[2894]: I0304 08:54:10.828512 2894 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-kube-api-access-bkt9m" (OuterVolumeSpecName: "kube-api-access-bkt9m") pod "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc" (UID: "b7abfa40-2bcb-4a6d-9983-687f32ff2ccc"). InnerVolumeSpecName "kube-api-access-bkt9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 4 08:54:10.854893 containerd[1618]: time="2026-03-04T08:54:10.854855390Z" level=info msg="StartContainer for \"cb828f092024ebcef8ba0b54dc82dbfd0bbc0f654a3f47ba4d29a2c72ee06626\" returns successfully" Mar 4 08:54:10.906559 systemd[1]: Removed slice kubepods-besteffort-podb7abfa40_2bcb_4a6d_9983_687f32ff2ccc.slice - libcontainer container kubepods-besteffort-podb7abfa40_2bcb_4a6d_9983_687f32ff2ccc.slice. 
Mar 4 08:54:10.925759 kubelet[2894]: I0304 08:54:10.925718 2894 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-ca-bundle\") on node \"ci-4459-2-4-2-039fb286b9\" DevicePath \"\"" Mar 4 08:54:10.925759 kubelet[2894]: I0304 08:54:10.925749 2894 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-whisker-backend-key-pair\") on node \"ci-4459-2-4-2-039fb286b9\" DevicePath \"\"" Mar 4 08:54:10.925759 kubelet[2894]: I0304 08:54:10.925759 2894 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-nginx-config\") on node \"ci-4459-2-4-2-039fb286b9\" DevicePath \"\"" Mar 4 08:54:10.925759 kubelet[2894]: I0304 08:54:10.925768 2894 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkt9m\" (UniqueName: \"kubernetes.io/projected/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc-kube-api-access-bkt9m\") on node \"ci-4459-2-4-2-039fb286b9\" DevicePath \"\"" Mar 4 08:54:11.114291 kubelet[2894]: I0304 08:54:11.114109 2894 scope.go:117] "RemoveContainer" containerID="00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3" Mar 4 08:54:11.119187 containerd[1618]: time="2026-03-04T08:54:11.119054649Z" level=info msg="RemoveContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\"" Mar 4 08:54:11.125541 containerd[1618]: time="2026-03-04T08:54:11.125490721Z" level=info msg="RemoveContainer for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" returns successfully" Mar 4 08:54:11.125877 kubelet[2894]: I0304 08:54:11.125762 2894 scope.go:117] "RemoveContainer" containerID="7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5" Mar 4 08:54:11.129755 containerd[1618]: time="2026-03-04T08:54:11.129712023Z" 
level=info msg="RemoveContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\"" Mar 4 08:54:11.138186 containerd[1618]: time="2026-03-04T08:54:11.138072825Z" level=info msg="RemoveContainer for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" returns successfully" Mar 4 08:54:11.138370 kubelet[2894]: I0304 08:54:11.138340 2894 scope.go:117] "RemoveContainer" containerID="00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3" Mar 4 08:54:11.138792 containerd[1618]: time="2026-03-04T08:54:11.138634668Z" level=error msg="ContainerStatus for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": not found" Mar 4 08:54:11.140338 kubelet[2894]: E0304 08:54:11.140291 2894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": not found" containerID="00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3" Mar 4 08:54:11.140408 kubelet[2894]: I0304 08:54:11.140339 2894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3"} err="failed to get container status \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": rpc error: code = NotFound desc = an error occurred when try to find container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": not found" Mar 4 08:54:11.140408 kubelet[2894]: I0304 08:54:11.140375 2894 scope.go:117] "RemoveContainer" containerID="7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5" Mar 4 08:54:11.140700 containerd[1618]: time="2026-03-04T08:54:11.140634958Z" level=error msg="ContainerStatus for 
\"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": not found" Mar 4 08:54:11.140852 kubelet[2894]: E0304 08:54:11.140830 2894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": not found" containerID="7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5" Mar 4 08:54:11.140949 kubelet[2894]: I0304 08:54:11.140929 2894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5"} err="failed to get container status \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": rpc error: code = NotFound desc = an error occurred when try to find container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": not found" Mar 4 08:54:11.141016 kubelet[2894]: I0304 08:54:11.141005 2894 scope.go:117] "RemoveContainer" containerID="00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3" Mar 4 08:54:11.141298 containerd[1618]: time="2026-03-04T08:54:11.141251041Z" level=error msg="ContainerStatus for \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": not found" Mar 4 08:54:11.141446 kubelet[2894]: I0304 08:54:11.141367 2894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3"} err="failed to get container status \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": rpc error: code = 
NotFound desc = an error occurred when try to find container \"00a885a19f5d2928e774975beae43afe774c55aa2df685bfd87b6d3e63f6b4f3\": not found" Mar 4 08:54:11.141446 kubelet[2894]: I0304 08:54:11.141444 2894 scope.go:117] "RemoveContainer" containerID="7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5" Mar 4 08:54:11.141744 containerd[1618]: time="2026-03-04T08:54:11.141706083Z" level=error msg="ContainerStatus for \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": not found" Mar 4 08:54:11.141925 kubelet[2894]: I0304 08:54:11.141891 2894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5"} err="failed to get container status \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": rpc error: code = NotFound desc = an error occurred when try to find container \"7bc5cb60373846ac8fc8aa3a7d2dbda3e30c4c94beddc2b0f7ea149aabbe61e5\": not found" Mar 4 08:54:11.146459 kubelet[2894]: I0304 08:54:11.146070 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qsgm2" podStartSLOduration=57.146058705 podStartE2EDuration="57.146058705s" podCreationTimestamp="2026-03-04 08:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:54:11.130314746 +0000 UTC m=+62.500643558" watchObservedRunningTime="2026-03-04 08:54:11.146058705 +0000 UTC m=+62.516387517" Mar 4 08:54:11.200128 systemd[1]: Created slice kubepods-besteffort-pod9ca55130_90a4_4b1d_bd37_3e2f7b0e3b05.slice - libcontainer container kubepods-besteffort-pod9ca55130_90a4_4b1d_bd37_3e2f7b0e3b05.slice. 
Mar 4 08:54:11.327615 kubelet[2894]: I0304 08:54:11.327531 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbn9z\" (UniqueName: \"kubernetes.io/projected/9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05-kube-api-access-nbn9z\") pod \"whisker-bfc69b9bb-8s7c2\" (UID: \"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05\") " pod="calico-system/whisker-bfc69b9bb-8s7c2" Mar 4 08:54:11.327830 kubelet[2894]: I0304 08:54:11.327642 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05-whisker-backend-key-pair\") pod \"whisker-bfc69b9bb-8s7c2\" (UID: \"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05\") " pod="calico-system/whisker-bfc69b9bb-8s7c2" Mar 4 08:54:11.327830 kubelet[2894]: I0304 08:54:11.327696 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05-whisker-ca-bundle\") pod \"whisker-bfc69b9bb-8s7c2\" (UID: \"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05\") " pod="calico-system/whisker-bfc69b9bb-8s7c2" Mar 4 08:54:11.327830 kubelet[2894]: I0304 08:54:11.327761 2894 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05-nginx-config\") pod \"whisker-bfc69b9bb-8s7c2\" (UID: \"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05\") " pod="calico-system/whisker-bfc69b9bb-8s7c2" Mar 4 08:54:11.505299 containerd[1618]: time="2026-03-04T08:54:11.505254646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bfc69b9bb-8s7c2,Uid:9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05,Namespace:calico-system,Attempt:0,}" Mar 4 08:54:11.527282 systemd[1]: run-netns-cni\x2d70f0b75f\x2d1174\x2d2147\x2d931c\x2d6e18c8423154.mount: Deactivated successfully. 
Mar 4 08:54:11.527372 systemd[1]: var-lib-kubelet-pods-b7abfa40\x2d2bcb\x2d4a6d\x2d9983\x2d687f32ff2ccc-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbkt9m.mount: Deactivated successfully. Mar 4 08:54:11.527421 systemd[1]: var-lib-kubelet-pods-b7abfa40\x2d2bcb\x2d4a6d\x2d9983\x2d687f32ff2ccc-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 4 08:54:11.613823 systemd-networkd[1441]: cali77dfcf50224: Link UP Mar 4 08:54:11.614298 systemd-networkd[1441]: cali77dfcf50224: Gained carrier Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.547 [INFO][5364] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0 whisker-bfc69b9bb- calico-system 9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05 1038 0 2026-03-04 08:54:11 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:bfc69b9bb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-2-039fb286b9 whisker-bfc69b9bb-8s7c2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali77dfcf50224 [] [] }} ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.547 [INFO][5364] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.570 [INFO][5379] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" HandleID="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.579 [INFO][5379] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" HandleID="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000364330), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-2-039fb286b9", "pod":"whisker-bfc69b9bb-8s7c2", "timestamp":"2026-03-04 08:54:11.570035774 +0000 UTC"}, Hostname:"ci-4459-2-4-2-039fb286b9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400018c840)} Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.579 [INFO][5379] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.579 [INFO][5379] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.579 [INFO][5379] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-2-039fb286b9' Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.582 [INFO][5379] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.586 [INFO][5379] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.591 [INFO][5379] ipam/ipam.go 526: Trying affinity for 192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.592 [INFO][5379] ipam/ipam.go 160: Attempting to load block cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.595 [INFO][5379] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.595 [INFO][5379] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.597 [INFO][5379] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1 Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.601 [INFO][5379] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.609 [INFO][5379] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.126.73/26] block=192.168.126.64/26 handle="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.609 [INFO][5379] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.126.73/26] handle="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" host="ci-4459-2-4-2-039fb286b9" Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.609 [INFO][5379] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 08:54:11.632015 containerd[1618]: 2026-03-04 08:54:11.609 [INFO][5379] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.126.73/26] IPv6=[] ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" HandleID="k8s-pod-network.5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.610 [INFO][5364] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0", GenerateName:"whisker-bfc69b9bb-", Namespace:"calico-system", SelfLink:"", UID:"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05", ResourceVersion:"1038", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 54, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bfc69b9bb", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"", Pod:"whisker-bfc69b9bb-8s7c2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali77dfcf50224", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.611 [INFO][5364] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.73/32] ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.611 [INFO][5364] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77dfcf50224 ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.615 [INFO][5364] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.617 [INFO][5364] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0", GenerateName:"whisker-bfc69b9bb-", Namespace:"calico-system", SelfLink:"", UID:"9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05", ResourceVersion:"1038", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 8, 54, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bfc69b9bb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-2-039fb286b9", ContainerID:"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1", Pod:"whisker-bfc69b9bb-8s7c2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali77dfcf50224", MAC:"02:53:a3:20:36:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 08:54:11.632854 containerd[1618]: 2026-03-04 08:54:11.628 [INFO][5364] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" 
Namespace="calico-system" Pod="whisker-bfc69b9bb-8s7c2" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--bfc69b9bb--8s7c2-eth0" Mar 4 08:54:11.657729 containerd[1618]: time="2026-03-04T08:54:11.657680338Z" level=info msg="connecting to shim 5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1" address="unix:///run/containerd/s/f380309a84a80178655bd14d3384f2972e36a81ce53941740cd531cf77bdbf10" namespace=k8s.io protocol=ttrpc version=3 Mar 4 08:54:11.683358 systemd[1]: Started cri-containerd-5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1.scope - libcontainer container 5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1. Mar 4 08:54:11.723849 containerd[1618]: time="2026-03-04T08:54:11.723807953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bfc69b9bb-8s7c2,Uid:9ca55130-90a4-4b1d-bd37-3e2f7b0e3b05,Namespace:calico-system,Attempt:0,} returns sandbox id \"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1\"" Mar 4 08:54:11.732120 containerd[1618]: time="2026-03-04T08:54:11.732017955Z" level=info msg="CreateContainer within sandbox \"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 4 08:54:11.742200 containerd[1618]: time="2026-03-04T08:54:11.741780564Z" level=info msg="Container 7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d: CDI devices from CRI Config.CDIDevices: []" Mar 4 08:54:11.756751 containerd[1618]: time="2026-03-04T08:54:11.756633079Z" level=info msg="CreateContainer within sandbox \"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d\"" Mar 4 08:54:11.758899 containerd[1618]: time="2026-03-04T08:54:11.758385728Z" level=info msg="StartContainer for \"7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d\"" Mar 4 
08:54:11.764177 containerd[1618]: time="2026-03-04T08:54:11.761257343Z" level=info msg="connecting to shim 7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d" address="unix:///run/containerd/s/f380309a84a80178655bd14d3384f2972e36a81ce53941740cd531cf77bdbf10" protocol=ttrpc version=3
Mar 4 08:54:11.818456 systemd[1]: Started cri-containerd-7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d.scope - libcontainer container 7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d.
Mar 4 08:54:11.857996 containerd[1618]: time="2026-03-04T08:54:11.857948833Z" level=info msg="StartContainer for \"7030d06b833320de51d7ac9a7947037a3287f1ddcaee1b88b757aaf0bf82bf2d\" returns successfully"
Mar 4 08:54:11.864025 containerd[1618]: time="2026-03-04T08:54:11.863964663Z" level=info msg="CreateContainer within sandbox \"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 4 08:54:11.874129 containerd[1618]: time="2026-03-04T08:54:11.873967594Z" level=info msg="Container ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:54:11.887398 containerd[1618]: time="2026-03-04T08:54:11.887354062Z" level=info msg="CreateContainer within sandbox \"5a27089b8987ae183a7a9b8efd713ea712affb582f2724b7d4edfefb6bea16e1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f\""
Mar 4 08:54:11.888321 containerd[1618]: time="2026-03-04T08:54:11.888282747Z" level=info msg="StartContainer for \"ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f\""
Mar 4 08:54:11.890852 containerd[1618]: time="2026-03-04T08:54:11.890796119Z" level=info msg="connecting to shim ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f" address="unix:///run/containerd/s/f380309a84a80178655bd14d3384f2972e36a81ce53941740cd531cf77bdbf10" protocol=ttrpc version=3
Mar 4 08:54:11.895051 containerd[1618]: time="2026-03-04T08:54:11.895010581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:11.895715 containerd[1618]: time="2026-03-04T08:54:11.895684184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497"
Mar 4 08:54:11.896751 containerd[1618]: time="2026-03-04T08:54:11.896725109Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:11.899770 containerd[1618]: time="2026-03-04T08:54:11.899652004Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:11.900263 containerd[1618]: time="2026-03-04T08:54:11.900227887Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 5.025386986s"
Mar 4 08:54:11.900321 containerd[1618]: time="2026-03-04T08:54:11.900269767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\""
Mar 4 08:54:11.907833 containerd[1618]: time="2026-03-04T08:54:11.907785685Z" level=info msg="CreateContainer within sandbox \"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 4 08:54:11.916279 containerd[1618]: time="2026-03-04T08:54:11.916242768Z" level=info msg="Container 3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:54:11.918463 systemd[1]: Started cri-containerd-ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f.scope - libcontainer container ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f.
Mar 4 08:54:11.925038 containerd[1618]: time="2026-03-04T08:54:11.924979493Z" level=info msg="CreateContainer within sandbox \"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e\""
Mar 4 08:54:11.925727 containerd[1618]: time="2026-03-04T08:54:11.925700376Z" level=info msg="StartContainer for \"3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e\""
Mar 4 08:54:11.927207 containerd[1618]: time="2026-03-04T08:54:11.927130183Z" level=info msg="connecting to shim 3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e" address="unix:///run/containerd/s/691952b35b35c5db4bda4256bd55b7eb8810df9530729d316bdaeda2bb1f9301" protocol=ttrpc version=3
Mar 4 08:54:11.949490 systemd[1]: Started cri-containerd-3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e.scope - libcontainer container 3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e.
Mar 4 08:54:11.963367 containerd[1618]: time="2026-03-04T08:54:11.963323167Z" level=info msg="StartContainer for \"ba36e49b891be1ee7f2bccf0fbe016eccee28c70034cf646dca9c6fe57cd920f\" returns successfully"
Mar 4 08:54:12.021086 containerd[1618]: time="2026-03-04T08:54:12.020948939Z" level=info msg="StartContainer for \"3d16178d9838320f2fcf3eb49b71fa33f3938d7be5036860f070e2ca9ff4bf1e\" returns successfully"
Mar 4 08:54:12.023598 containerd[1618]: time="2026-03-04T08:54:12.023558592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 4 08:54:12.137981 kubelet[2894]: I0304 08:54:12.137921 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-bfc69b9bb-8s7c2" podStartSLOduration=1.137903892 podStartE2EDuration="1.137903892s" podCreationTimestamp="2026-03-04 08:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 08:54:12.13759945 +0000 UTC m=+63.507928302" watchObservedRunningTime="2026-03-04 08:54:12.137903892 +0000 UTC m=+63.508232704"
Mar 4 08:54:12.362434 systemd-networkd[1441]: cali7f450fe3056: Gained IPv6LL
Mar 4 08:54:12.903055 kubelet[2894]: I0304 08:54:12.902729 2894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7abfa40-2bcb-4a6d-9983-687f32ff2ccc" path="/var/lib/kubelet/pods/b7abfa40-2bcb-4a6d-9983-687f32ff2ccc/volumes"
Mar 4 08:54:13.322349 systemd-networkd[1441]: cali77dfcf50224: Gained IPv6LL
Mar 4 08:54:13.822575 containerd[1618]: time="2026-03-04T08:54:13.821962385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:13.823400 containerd[1618]: time="2026-03-04T08:54:13.823358993Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291"
Mar 4 08:54:13.824629 containerd[1618]: time="2026-03-04T08:54:13.824580399Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:13.827122 containerd[1618]: time="2026-03-04T08:54:13.827064931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 08:54:13.828449 containerd[1618]: time="2026-03-04T08:54:13.828419018Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.804818466s"
Mar 4 08:54:13.828503 containerd[1618]: time="2026-03-04T08:54:13.828452498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\""
Mar 4 08:54:13.832216 containerd[1618]: time="2026-03-04T08:54:13.832186557Z" level=info msg="CreateContainer within sandbox \"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 4 08:54:13.846836 containerd[1618]: time="2026-03-04T08:54:13.846780551Z" level=info msg="Container a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf: CDI devices from CRI Config.CDIDevices: []"
Mar 4 08:54:13.856756 containerd[1618]: time="2026-03-04T08:54:13.856701162Z" level=info msg="CreateContainer within sandbox \"1a884c25a2c47ee3d5f513513b1ca6c81d6137b26b9fc5e89667ed6db1fad961\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf\""
Mar 4 08:54:13.857510 containerd[1618]: time="2026-03-04T08:54:13.857473085Z" level=info msg="StartContainer for \"a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf\""
Mar 4 08:54:13.860548 containerd[1618]: time="2026-03-04T08:54:13.860518501Z" level=info msg="connecting to shim a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf" address="unix:///run/containerd/s/691952b35b35c5db4bda4256bd55b7eb8810df9530729d316bdaeda2bb1f9301" protocol=ttrpc version=3
Mar 4 08:54:13.879348 systemd[1]: Started cri-containerd-a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf.scope - libcontainer container a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf.
Mar 4 08:54:13.962086 containerd[1618]: time="2026-03-04T08:54:13.962044255Z" level=info msg="StartContainer for \"a506aca42e42308e6ef19795fac72c13ed9072b8dda5481e4780974a6cfb56bf\" returns successfully"
Mar 4 08:54:14.149974 kubelet[2894]: I0304 08:54:14.149366 2894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hcf7z" podStartSLOduration=38.495501272 podStartE2EDuration="47.149349125s" podCreationTimestamp="2026-03-04 08:53:27 +0000 UTC" firstStartedPulling="2026-03-04 08:54:05.175188888 +0000 UTC m=+56.545517700" lastFinishedPulling="2026-03-04 08:54:13.829036741 +0000 UTC m=+65.199365553" observedRunningTime="2026-03-04 08:54:14.14649395 +0000 UTC m=+65.516822722" watchObservedRunningTime="2026-03-04 08:54:14.149349125 +0000 UTC m=+65.519677977"
Mar 4 08:54:14.969731 kubelet[2894]: I0304 08:54:14.969549 2894 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 4 08:54:14.969731 kubelet[2894]: I0304 08:54:14.969581 2894 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 4 08:54:17.059042 kubelet[2894]: I0304 08:54:17.058990 2894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 4 08:54:24.750800 systemd[1]: Started sshd@11-10.0.9.143:22-87.236.176.43:51461.service - OpenSSH per-connection server daemon (87.236.176.43:51461).
Mar 4 08:54:26.733721 sshd[5635]: Connection closed by 87.236.176.43 port 51461
Mar 4 08:54:26.735098 systemd[1]: sshd@11-10.0.9.143:22-87.236.176.43:51461.service: Deactivated successfully.
Mar 4 08:54:26.775043 systemd[1]: Started sshd@12-10.0.9.143:22-87.236.176.43:35479.service - OpenSSH per-connection server daemon (87.236.176.43:35479).
Mar 4 08:54:26.862471 sshd[5665]: Connection closed by 87.236.176.43 port 35479 [preauth]
Mar 4 08:54:26.863973 systemd[1]: sshd@12-10.0.9.143:22-87.236.176.43:35479.service: Deactivated successfully.
Mar 4 08:55:08.889289 containerd[1618]: time="2026-03-04T08:55:08.889243439Z" level=info msg="StopPodSandbox for \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\""
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.924 [WARNING][5853] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.924 [INFO][5853] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.924 [INFO][5853] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" iface="eth0" netns=""
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.924 [INFO][5853] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.924 [INFO][5853] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.945 [INFO][5863] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.946 [INFO][5863] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.946 [INFO][5863] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.955 [WARNING][5863] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.956 [INFO][5863] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.958 [INFO][5863] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 08:55:08.961978 containerd[1618]: 2026-03-04 08:55:08.960 [INFO][5853] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:08.962565 containerd[1618]: time="2026-03-04T08:55:08.962007488Z" level=info msg="TearDown network for sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" successfully"
Mar 4 08:55:08.962565 containerd[1618]: time="2026-03-04T08:55:08.962031928Z" level=info msg="StopPodSandbox for \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" returns successfully"
Mar 4 08:55:08.962565 containerd[1618]: time="2026-03-04T08:55:08.962455570Z" level=info msg="RemovePodSandbox for \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\""
Mar 4 08:55:08.962565 containerd[1618]: time="2026-03-04T08:55:08.962484130Z" level=info msg="Forcibly stopping sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\""
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:08.996 [WARNING][5881] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" WorkloadEndpoint="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:08.996 [INFO][5881] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:08.997 [INFO][5881] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" iface="eth0" netns=""
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:08.997 [INFO][5881] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:08.997 [INFO][5881] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.015 [INFO][5891] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.015 [INFO][5891] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.015 [INFO][5891] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.026 [WARNING][5891] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.026 [INFO][5891] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" HandleID="k8s-pod-network.47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea" Workload="ci--4459--2--4--2--039fb286b9-k8s-whisker--74d466c56f--v7njq-eth0"
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.028 [INFO][5891] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 08:55:09.031842 containerd[1618]: 2026-03-04 08:55:09.030 [INFO][5881] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea"
Mar 4 08:55:09.032261 containerd[1618]: time="2026-03-04T08:55:09.031881122Z" level=info msg="TearDown network for sandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" successfully"
Mar 4 08:55:09.033573 containerd[1618]: time="2026-03-04T08:55:09.033531851Z" level=info msg="Ensure that sandbox 47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea in task-service has been cleanup successfully"
Mar 4 08:55:09.038242 containerd[1618]: time="2026-03-04T08:55:09.038199394Z" level=info msg="RemovePodSandbox \"47da3b0a044190a89f809b86bfa1247b41d1e23e331fde1c1c3e30f4d96296ea\" returns successfully"
Mar 4 08:55:55.287533 update_engine[1610]: I20260304 08:55:55.287402 1610 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 4 08:55:55.287533 update_engine[1610]: I20260304 08:55:55.287499 1610 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 4 08:55:55.288130 update_engine[1610]: I20260304 08:55:55.287869 1610 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 4 08:55:55.288290 update_engine[1610]: I20260304 08:55:55.288260 1610 omaha_request_params.cc:62] Current group set to stable
Mar 4 08:55:55.288403 update_engine[1610]: I20260304 08:55:55.288362 1610 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 4 08:55:55.288403 update_engine[1610]: I20260304 08:55:55.288376 1610 update_attempter.cc:643] Scheduling an action processor start.
Mar 4 08:55:55.288403 update_engine[1610]: I20260304 08:55:55.288393 1610 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 4 08:55:55.288483 update_engine[1610]: I20260304 08:55:55.288420 1610 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 4 08:55:55.288483 update_engine[1610]: I20260304 08:55:55.288471 1610 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 4 08:55:55.288483 update_engine[1610]: I20260304 08:55:55.288478 1610 omaha_request_action.cc:272] Request:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288483 update_engine[1610]:
Mar 4 08:55:55.288677 update_engine[1610]: I20260304 08:55:55.288484 1610 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 4 08:55:55.290485 locksmithd[1654]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 4 08:55:55.290682 update_engine[1610]: I20260304 08:55:55.290489 1610 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 4 08:55:55.291358 update_engine[1610]: I20260304 08:55:55.291309 1610 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 4 08:55:55.298454 update_engine[1610]: E20260304 08:55:55.298399 1610 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 4 08:55:55.298531 update_engine[1610]: I20260304 08:55:55.298484 1610 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 4 08:55:56.386745 systemd[1]: Started sshd@13-10.0.9.143:22-20.161.92.111:48882.service - OpenSSH per-connection server daemon (20.161.92.111:48882).
Mar 4 08:55:56.922661 sshd[6108]: Accepted publickey for core from 20.161.92.111 port 48882 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:55:56.924049 sshd-session[6108]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:55:56.929444 systemd-logind[1606]: New session 12 of user core.
Mar 4 08:55:56.935374 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 4 08:55:57.274913 sshd[6111]: Connection closed by 20.161.92.111 port 48882
Mar 4 08:55:57.275321 sshd-session[6108]: pam_unix(sshd:session): session closed for user core
Mar 4 08:55:57.280119 systemd[1]: sshd@13-10.0.9.143:22-20.161.92.111:48882.service: Deactivated successfully.
Mar 4 08:55:57.281860 systemd[1]: session-12.scope: Deactivated successfully.
Mar 4 08:55:57.283158 systemd-logind[1606]: Session 12 logged out. Waiting for processes to exit.
Mar 4 08:55:57.284709 systemd-logind[1606]: Removed session 12.
Mar 4 08:56:02.383659 systemd[1]: Started sshd@14-10.0.9.143:22-20.161.92.111:37918.service - OpenSSH per-connection server daemon (20.161.92.111:37918).
Mar 4 08:56:02.898726 sshd[6149]: Accepted publickey for core from 20.161.92.111 port 37918 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:02.900486 sshd-session[6149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:02.905687 systemd-logind[1606]: New session 13 of user core.
Mar 4 08:56:02.918374 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 4 08:56:03.249329 sshd[6152]: Connection closed by 20.161.92.111 port 37918
Mar 4 08:56:03.249748 sshd-session[6149]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:03.252651 systemd[1]: sshd@14-10.0.9.143:22-20.161.92.111:37918.service: Deactivated successfully.
Mar 4 08:56:03.254918 systemd[1]: session-13.scope: Deactivated successfully.
Mar 4 08:56:03.256149 systemd-logind[1606]: Session 13 logged out. Waiting for processes to exit.
Mar 4 08:56:03.257347 systemd-logind[1606]: Removed session 13.
Mar 4 08:56:05.221268 update_engine[1610]: I20260304 08:56:05.221191 1610 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 4 08:56:05.221268 update_engine[1610]: I20260304 08:56:05.221278 1610 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 4 08:56:05.221671 update_engine[1610]: I20260304 08:56:05.221632 1610 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 4 08:56:05.226375 update_engine[1610]: E20260304 08:56:05.226317 1610 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 4 08:56:05.226465 update_engine[1610]: I20260304 08:56:05.226415 1610 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 4 08:56:08.364610 systemd[1]: Started sshd@15-10.0.9.143:22-20.161.92.111:37928.service - OpenSSH per-connection server daemon (20.161.92.111:37928).
Mar 4 08:56:08.890146 sshd[6190]: Accepted publickey for core from 20.161.92.111 port 37928 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:08.891560 sshd-session[6190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:08.895290 systemd-logind[1606]: New session 14 of user core.
Mar 4 08:56:08.903323 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 4 08:56:09.246274 sshd[6195]: Connection closed by 20.161.92.111 port 37928
Mar 4 08:56:09.245485 sshd-session[6190]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:09.250864 systemd[1]: sshd@15-10.0.9.143:22-20.161.92.111:37928.service: Deactivated successfully.
Mar 4 08:56:09.252914 systemd[1]: session-14.scope: Deactivated successfully.
Mar 4 08:56:09.253676 systemd-logind[1606]: Session 14 logged out. Waiting for processes to exit.
Mar 4 08:56:09.255943 systemd-logind[1606]: Removed session 14.
Mar 4 08:56:14.351157 systemd[1]: Started sshd@16-10.0.9.143:22-20.161.92.111:37136.service - OpenSSH per-connection server daemon (20.161.92.111:37136).
Mar 4 08:56:14.866319 sshd[6248]: Accepted publickey for core from 20.161.92.111 port 37136 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:14.867662 sshd-session[6248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:14.871511 systemd-logind[1606]: New session 15 of user core.
Mar 4 08:56:14.887469 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 4 08:56:15.211757 sshd[6251]: Connection closed by 20.161.92.111 port 37136
Mar 4 08:56:15.212378 sshd-session[6248]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:15.216149 systemd[1]: sshd@16-10.0.9.143:22-20.161.92.111:37136.service: Deactivated successfully.
Mar 4 08:56:15.219015 systemd[1]: session-15.scope: Deactivated successfully.
Mar 4 08:56:15.219301 update_engine[1610]: I20260304 08:56:15.219230 1610 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 4 08:56:15.219301 update_engine[1610]: I20260304 08:56:15.219298 1610 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 4 08:56:15.219690 update_engine[1610]: I20260304 08:56:15.219588 1610 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 4 08:56:15.219887 systemd-logind[1606]: Session 15 logged out. Waiting for processes to exit.
Mar 4 08:56:15.221256 systemd-logind[1606]: Removed session 15.
Mar 4 08:56:15.224552 update_engine[1610]: E20260304 08:56:15.224500 1610 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 4 08:56:15.224671 update_engine[1610]: I20260304 08:56:15.224633 1610 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 4 08:56:15.317471 systemd[1]: Started sshd@17-10.0.9.143:22-20.161.92.111:37146.service - OpenSSH per-connection server daemon (20.161.92.111:37146).
Mar 4 08:56:15.836588 sshd[6266]: Accepted publickey for core from 20.161.92.111 port 37146 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:15.837893 sshd-session[6266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:15.842406 systemd-logind[1606]: New session 16 of user core.
Mar 4 08:56:15.853440 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 4 08:56:16.211668 sshd[6271]: Connection closed by 20.161.92.111 port 37146
Mar 4 08:56:16.212021 sshd-session[6266]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:16.216406 systemd[1]: sshd@17-10.0.9.143:22-20.161.92.111:37146.service: Deactivated successfully.
Mar 4 08:56:16.218209 systemd[1]: session-16.scope: Deactivated successfully.
Mar 4 08:56:16.219481 systemd-logind[1606]: Session 16 logged out. Waiting for processes to exit.
Mar 4 08:56:16.220681 systemd-logind[1606]: Removed session 16.
Mar 4 08:56:16.315769 systemd[1]: Started sshd@18-10.0.9.143:22-20.161.92.111:37152.service - OpenSSH per-connection server daemon (20.161.92.111:37152).
Mar 4 08:56:16.836134 sshd[6282]: Accepted publickey for core from 20.161.92.111 port 37152 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:16.837459 sshd-session[6282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:16.841874 systemd-logind[1606]: New session 17 of user core.
Mar 4 08:56:16.851345 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 4 08:56:17.182281 sshd[6285]: Connection closed by 20.161.92.111 port 37152
Mar 4 08:56:17.182577 sshd-session[6282]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:17.186080 systemd[1]: sshd@18-10.0.9.143:22-20.161.92.111:37152.service: Deactivated successfully.
Mar 4 08:56:17.188036 systemd[1]: session-17.scope: Deactivated successfully.
Mar 4 08:56:17.188942 systemd-logind[1606]: Session 17 logged out. Waiting for processes to exit.
Mar 4 08:56:17.190066 systemd-logind[1606]: Removed session 17.
Mar 4 08:56:22.287908 systemd[1]: Started sshd@19-10.0.9.143:22-20.161.92.111:60130.service - OpenSSH per-connection server daemon (20.161.92.111:60130).
Mar 4 08:56:22.815239 sshd[6298]: Accepted publickey for core from 20.161.92.111 port 60130 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:22.816555 sshd-session[6298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:22.821044 systemd-logind[1606]: New session 18 of user core.
Mar 4 08:56:22.834331 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 4 08:56:23.158436 sshd[6301]: Connection closed by 20.161.92.111 port 60130
Mar 4 08:56:23.158723 sshd-session[6298]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:23.162319 systemd[1]: sshd@19-10.0.9.143:22-20.161.92.111:60130.service: Deactivated successfully.
Mar 4 08:56:23.164159 systemd[1]: session-18.scope: Deactivated successfully.
Mar 4 08:56:23.165072 systemd-logind[1606]: Session 18 logged out. Waiting for processes to exit.
Mar 4 08:56:23.166214 systemd-logind[1606]: Removed session 18.
Mar 4 08:56:23.262533 systemd[1]: Started sshd@20-10.0.9.143:22-20.161.92.111:60132.service - OpenSSH per-connection server daemon (20.161.92.111:60132).
Mar 4 08:56:23.777858 sshd[6314]: Accepted publickey for core from 20.161.92.111 port 60132 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:23.779202 sshd-session[6314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:23.783151 systemd-logind[1606]: New session 19 of user core.
Mar 4 08:56:23.793537 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 4 08:56:24.167736 sshd[6317]: Connection closed by 20.161.92.111 port 60132
Mar 4 08:56:24.168305 sshd-session[6314]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:24.171860 systemd[1]: sshd@20-10.0.9.143:22-20.161.92.111:60132.service: Deactivated successfully.
Mar 4 08:56:24.173847 systemd[1]: session-19.scope: Deactivated successfully.
Mar 4 08:56:24.174586 systemd-logind[1606]: Session 19 logged out. Waiting for processes to exit.
Mar 4 08:56:24.175871 systemd-logind[1606]: Removed session 19.
Mar 4 08:56:24.275902 systemd[1]: Started sshd@21-10.0.9.143:22-20.161.92.111:60144.service - OpenSSH per-connection server daemon (20.161.92.111:60144).
Mar 4 08:56:24.795747 sshd[6328]: Accepted publickey for core from 20.161.92.111 port 60144 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:24.797076 sshd-session[6328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:24.800813 systemd-logind[1606]: New session 20 of user core.
Mar 4 08:56:24.809357 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 4 08:56:25.219971 update_engine[1610]: I20260304 08:56:25.219261 1610 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 4 08:56:25.219971 update_engine[1610]: I20260304 08:56:25.219403 1610 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 4 08:56:25.219971 update_engine[1610]: I20260304 08:56:25.219928 1610 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 4 08:56:25.224594 update_engine[1610]: E20260304 08:56:25.224526 1610 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 4 08:56:25.224766 update_engine[1610]: I20260304 08:56:25.224745 1610 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 4 08:56:25.224828 update_engine[1610]: I20260304 08:56:25.224813 1610 omaha_request_action.cc:617] Omaha request response:
Mar 4 08:56:25.224967 update_engine[1610]: E20260304 08:56:25.224947 1610 omaha_request_action.cc:636] Omaha request network transfer failed.
Mar 4 08:56:25.225033 update_engine[1610]: I20260304 08:56:25.225019 1610 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Mar 4 08:56:25.225224 update_engine[1610]: I20260304 08:56:25.225066 1610 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 4 08:56:25.225224 update_engine[1610]: I20260304 08:56:25.225077 1610 update_attempter.cc:306] Processing Done.
Mar 4 08:56:25.225224 update_engine[1610]: E20260304 08:56:25.225090 1610 update_attempter.cc:619] Update failed.
Mar 4 08:56:25.225224 update_engine[1610]: I20260304 08:56:25.225095 1610 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 4 08:56:25.225224 update_engine[1610]: I20260304 08:56:25.225098 1610 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 4 08:56:25.225224 update_engine[1610]: I20260304 08:56:25.225103 1610 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225426 1610 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225458 1610 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225463 1610 omaha_request_action.cc:272] Request:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]:
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225469 1610 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225488 1610 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 4 08:56:25.225807 update_engine[1610]: I20260304 08:56:25.225768 1610 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 4 08:56:25.226147 locksmithd[1654]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 4 08:56:25.233050 update_engine[1610]: E20260304 08:56:25.232676 1610 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232815 1610 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232839 1610 omaha_request_action.cc:617] Omaha request response:
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232856 1610 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232869 1610 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232882 1610 update_attempter.cc:306] Processing Done.
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232896 1610 update_attempter.cc:310] Error event sent.
Mar 4 08:56:25.233050 update_engine[1610]: I20260304 08:56:25.232954 1610 update_check_scheduler.cc:74] Next update check in 46m43s
Mar 4 08:56:25.233299 locksmithd[1654]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 4 08:56:25.647987 sshd[6331]: Connection closed by 20.161.92.111 port 60144
Mar 4 08:56:25.648288 sshd-session[6328]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:25.651948 systemd[1]: sshd@21-10.0.9.143:22-20.161.92.111:60144.service: Deactivated successfully.
Mar 4 08:56:25.654192 systemd[1]: session-20.scope: Deactivated successfully.
Mar 4 08:56:25.655666 systemd-logind[1606]: Session 20 logged out. Waiting for processes to exit.
Mar 4 08:56:25.656985 systemd-logind[1606]: Removed session 20.
Mar 4 08:56:25.755740 systemd[1]: Started sshd@22-10.0.9.143:22-20.161.92.111:60160.service - OpenSSH per-connection server daemon (20.161.92.111:60160).
Mar 4 08:56:26.281008 sshd[6385]: Accepted publickey for core from 20.161.92.111 port 60160 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:26.282585 sshd-session[6385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:26.286385 systemd-logind[1606]: New session 21 of user core.
Mar 4 08:56:26.294340 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 4 08:56:26.778183 sshd[6388]: Connection closed by 20.161.92.111 port 60160
Mar 4 08:56:26.777653 sshd-session[6385]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:26.781809 systemd[1]: sshd@22-10.0.9.143:22-20.161.92.111:60160.service: Deactivated successfully.
Mar 4 08:56:26.784678 systemd[1]: session-21.scope: Deactivated successfully.
Mar 4 08:56:26.785627 systemd-logind[1606]: Session 21 logged out. Waiting for processes to exit.
Mar 4 08:56:26.787070 systemd-logind[1606]: Removed session 21.
Mar 4 08:56:26.888551 systemd[1]: Started sshd@23-10.0.9.143:22-20.161.92.111:60170.service - OpenSSH per-connection server daemon (20.161.92.111:60170).
Mar 4 08:56:27.404965 sshd[6400]: Accepted publickey for core from 20.161.92.111 port 60170 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:27.406240 sshd-session[6400]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:27.410234 systemd-logind[1606]: New session 22 of user core.
Mar 4 08:56:27.417554 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 4 08:56:27.749590 sshd[6403]: Connection closed by 20.161.92.111 port 60170
Mar 4 08:56:27.750142 sshd-session[6400]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:27.753778 systemd[1]: sshd@23-10.0.9.143:22-20.161.92.111:60170.service: Deactivated successfully.
Mar 4 08:56:27.755454 systemd[1]: session-22.scope: Deactivated successfully.
Mar 4 08:56:27.756145 systemd-logind[1606]: Session 22 logged out. Waiting for processes to exit.
Mar 4 08:56:27.757107 systemd-logind[1606]: Removed session 22.
Mar 4 08:56:32.865812 systemd[1]: Started sshd@24-10.0.9.143:22-20.161.92.111:45994.service - OpenSSH per-connection server daemon (20.161.92.111:45994).
Mar 4 08:56:33.383327 sshd[6439]: Accepted publickey for core from 20.161.92.111 port 45994 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:33.384609 sshd-session[6439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:33.389273 systemd-logind[1606]: New session 23 of user core.
Mar 4 08:56:33.395418 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 4 08:56:33.724982 sshd[6465]: Connection closed by 20.161.92.111 port 45994
Mar 4 08:56:33.725652 sshd-session[6439]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:33.729079 systemd[1]: sshd@24-10.0.9.143:22-20.161.92.111:45994.service: Deactivated successfully.
Mar 4 08:56:33.730811 systemd[1]: session-23.scope: Deactivated successfully.
Mar 4 08:56:33.731870 systemd-logind[1606]: Session 23 logged out. Waiting for processes to exit.
Mar 4 08:56:33.733046 systemd-logind[1606]: Removed session 23.
Mar 4 08:56:38.834687 systemd[1]: Started sshd@25-10.0.9.143:22-20.161.92.111:46008.service - OpenSSH per-connection server daemon (20.161.92.111:46008).
Mar 4 08:56:39.358277 sshd[6488]: Accepted publickey for core from 20.161.92.111 port 46008 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:39.361783 sshd-session[6488]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:39.376382 systemd-logind[1606]: New session 24 of user core.
Mar 4 08:56:39.383387 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 4 08:56:39.699540 sshd[6491]: Connection closed by 20.161.92.111 port 46008
Mar 4 08:56:39.699932 sshd-session[6488]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:39.703806 systemd[1]: sshd@25-10.0.9.143:22-20.161.92.111:46008.service: Deactivated successfully.
Mar 4 08:56:39.705653 systemd[1]: session-24.scope: Deactivated successfully.
Mar 4 08:56:39.706492 systemd-logind[1606]: Session 24 logged out. Waiting for processes to exit.
Mar 4 08:56:39.707718 systemd-logind[1606]: Removed session 24.
Mar 4 08:56:44.807667 systemd[1]: Started sshd@26-10.0.9.143:22-20.161.92.111:45298.service - OpenSSH per-connection server daemon (20.161.92.111:45298).
Mar 4 08:56:45.323726 sshd[6529]: Accepted publickey for core from 20.161.92.111 port 45298 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:45.324937 sshd-session[6529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:45.328813 systemd-logind[1606]: New session 25 of user core.
Mar 4 08:56:45.336389 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 4 08:56:45.664145 sshd[6533]: Connection closed by 20.161.92.111 port 45298
Mar 4 08:56:45.664579 sshd-session[6529]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:45.668598 systemd[1]: sshd@26-10.0.9.143:22-20.161.92.111:45298.service: Deactivated successfully.
Mar 4 08:56:45.670371 systemd[1]: session-25.scope: Deactivated successfully.
Mar 4 08:56:45.672116 systemd-logind[1606]: Session 25 logged out. Waiting for processes to exit.
Mar 4 08:56:45.673480 systemd-logind[1606]: Removed session 25.
Mar 4 08:56:50.773597 systemd[1]: Started sshd@27-10.0.9.143:22-20.161.92.111:39854.service - OpenSSH per-connection server daemon (20.161.92.111:39854).
Mar 4 08:56:51.292007 sshd[6548]: Accepted publickey for core from 20.161.92.111 port 39854 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:51.293540 sshd-session[6548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:51.301296 systemd-logind[1606]: New session 26 of user core.
Mar 4 08:56:51.306494 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 4 08:56:51.637376 sshd[6553]: Connection closed by 20.161.92.111 port 39854
Mar 4 08:56:51.638106 sshd-session[6548]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:51.642844 systemd[1]: sshd@27-10.0.9.143:22-20.161.92.111:39854.service: Deactivated successfully.
Mar 4 08:56:51.644628 systemd[1]: session-26.scope: Deactivated successfully.
Mar 4 08:56:51.645395 systemd-logind[1606]: Session 26 logged out. Waiting for processes to exit.
Mar 4 08:56:51.646524 systemd-logind[1606]: Removed session 26.
Mar 4 08:56:56.751399 systemd[1]: Started sshd@28-10.0.9.143:22-20.161.92.111:39860.service - OpenSSH per-connection server daemon (20.161.92.111:39860).
Mar 4 08:56:57.261886 sshd[6598]: Accepted publickey for core from 20.161.92.111 port 39860 ssh2: RSA SHA256:na+AC0ZXHqKEfpCsWMOZUJeYWnyBsv4hTIpBZOho1QE
Mar 4 08:56:57.263482 sshd-session[6598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 08:56:57.267643 systemd-logind[1606]: New session 27 of user core.
Mar 4 08:56:57.277518 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 4 08:56:57.611975 sshd[6602]: Connection closed by 20.161.92.111 port 39860
Mar 4 08:56:57.612554 sshd-session[6598]: pam_unix(sshd:session): session closed for user core
Mar 4 08:56:57.615994 systemd[1]: sshd@28-10.0.9.143:22-20.161.92.111:39860.service: Deactivated successfully.
Mar 4 08:56:57.617903 systemd[1]: session-27.scope: Deactivated successfully.
Mar 4 08:56:57.620321 systemd-logind[1606]: Session 27 logged out. Waiting for processes to exit.
Mar 4 08:56:57.622060 systemd-logind[1606]: Removed session 27.
Mar 4 08:57:13.493207 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec