Jan 13 23:45:38.435288 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 13 23:45:38.435312 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 13 22:00:26 -00 2026
Jan 13 23:45:38.435323 kernel: KASLR enabled
Jan 13 23:45:38.435329 kernel: efi: EFI v2.7 by EDK II
Jan 13 23:45:38.435335 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 13 23:45:38.435341 kernel: random: crng init done
Jan 13 23:45:38.435348 kernel: secureboot: Secure boot disabled
Jan 13 23:45:38.435354 kernel: ACPI: Early table checksum verification disabled
Jan 13 23:45:38.435363 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 13 23:45:38.435370 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 13 23:45:38.435376 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435382 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435388 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435394 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435403 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435410 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435417 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435424 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435430 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435437 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 23:45:38.435444 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 13 23:45:38.435451 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 13 23:45:38.435457 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 13 23:45:38.435465 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 13 23:45:38.435474 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 13 23:45:38.435480 kernel: Zone ranges:
Jan 13 23:45:38.435489 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 13 23:45:38.435495 kernel: DMA32 empty
Jan 13 23:45:38.435501 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 13 23:45:38.435508 kernel: Device empty
Jan 13 23:45:38.435514 kernel: Movable zone start for each node
Jan 13 23:45:38.435520 kernel: Early memory node ranges
Jan 13 23:45:38.435527 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 13 23:45:38.435533 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 13 23:45:38.435540 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 13 23:45:38.435547 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 13 23:45:38.435554 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 13 23:45:38.435560 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 13 23:45:38.435567 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 13 23:45:38.435573 kernel: psci: probing for conduit method from ACPI.
Jan 13 23:45:38.435582 kernel: psci: PSCIv1.3 detected in firmware.
Jan 13 23:45:38.435591 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 13 23:45:38.435598 kernel: psci: Trusted OS migration not required
Jan 13 23:45:38.435604 kernel: psci: SMC Calling Convention v1.1
Jan 13 23:45:38.435611 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 13 23:45:38.435618 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 13 23:45:38.435625 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 13 23:45:38.435632 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 13 23:45:38.435639 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 13 23:45:38.435647 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 13 23:45:38.435653 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 13 23:45:38.435661 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 13 23:45:38.435668 kernel: Detected PIPT I-cache on CPU0
Jan 13 23:45:38.435675 kernel: CPU features: detected: GIC system register CPU interface
Jan 13 23:45:38.435682 kernel: CPU features: detected: Spectre-v4
Jan 13 23:45:38.435689 kernel: CPU features: detected: Spectre-BHB
Jan 13 23:45:38.435695 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 13 23:45:38.435702 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 13 23:45:38.435709 kernel: CPU features: detected: ARM erratum 1418040
Jan 13 23:45:38.435716 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 13 23:45:38.435724 kernel: alternatives: applying boot alternatives
Jan 13 23:45:38.435732 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=3d3f73de8d2693594dfefd279d2c8d77c282a05a4cbc54177503d31784261f6b
Jan 13 23:45:38.435739 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 13 23:45:38.435746 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 13 23:45:38.435753 kernel: Fallback order for Node 0: 0
Jan 13 23:45:38.435760 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 13 23:45:38.435766 kernel: Policy zone: Normal
Jan 13 23:45:38.435773 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 23:45:38.435780 kernel: software IO TLB: area num 4.
Jan 13 23:45:38.435787 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 13 23:45:38.435795 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 13 23:45:38.435802 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 23:45:38.435809 kernel: rcu: RCU event tracing is enabled.
Jan 13 23:45:38.435816 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 13 23:45:38.435823 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 23:45:38.435830 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 23:45:38.435837 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 23:45:38.435844 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 13 23:45:38.435851 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 13 23:45:38.435858 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 13 23:45:38.435865 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 13 23:45:38.435873 kernel: GICv3: 256 SPIs implemented
Jan 13 23:45:38.435879 kernel: GICv3: 0 Extended SPIs implemented
Jan 13 23:45:38.435886 kernel: Root IRQ handler: gic_handle_irq
Jan 13 23:45:38.435893 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 13 23:45:38.435900 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 13 23:45:38.435906 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 13 23:45:38.435913 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 13 23:45:38.435920 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 13 23:45:38.435927 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 13 23:45:38.435934 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 13 23:45:38.435941 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 13 23:45:38.435948 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 13 23:45:38.435956 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 23:45:38.435963 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 13 23:45:38.435970 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 13 23:45:38.435977 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 13 23:45:38.435984 kernel: arm-pv: using stolen time PV
Jan 13 23:45:38.435991 kernel: Console: colour dummy device 80x25
Jan 13 23:45:38.435999 kernel: ACPI: Core revision 20240827
Jan 13 23:45:38.436006 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 13 23:45:38.436015 kernel: pid_max: default: 32768 minimum: 301
Jan 13 23:45:38.436022 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 13 23:45:38.436029 kernel: landlock: Up and running.
Jan 13 23:45:38.436036 kernel: SELinux: Initializing.
Jan 13 23:45:38.436044 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 23:45:38.436051 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 23:45:38.436058 kernel: rcu: Hierarchical SRCU implementation.
Jan 13 23:45:38.436075 kernel: rcu: Max phase no-delay instances is 400.
Jan 13 23:45:38.436084 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 13 23:45:38.436091 kernel: Remapping and enabling EFI services.
Jan 13 23:45:38.436099 kernel: smp: Bringing up secondary CPUs ...
Jan 13 23:45:38.436106 kernel: Detected PIPT I-cache on CPU1
Jan 13 23:45:38.436114 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 13 23:45:38.436121 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 13 23:45:38.436128 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 23:45:38.436137 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 13 23:45:38.436144 kernel: Detected PIPT I-cache on CPU2
Jan 13 23:45:38.436156 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 13 23:45:38.436166 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 13 23:45:38.436174 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 23:45:38.436181 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 13 23:45:38.436189 kernel: Detected PIPT I-cache on CPU3
Jan 13 23:45:38.436196 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 13 23:45:38.436206 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 13 23:45:38.436213 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 13 23:45:38.436220 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 13 23:45:38.436234 kernel: smp: Brought up 1 node, 4 CPUs
Jan 13 23:45:38.436242 kernel: SMP: Total of 4 processors activated.
Jan 13 23:45:38.436249 kernel: CPU: All CPU(s) started at EL1
Jan 13 23:45:38.436259 kernel: CPU features: detected: 32-bit EL0 Support
Jan 13 23:45:38.436267 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 13 23:45:38.436274 kernel: CPU features: detected: Common not Private translations
Jan 13 23:45:38.436282 kernel: CPU features: detected: CRC32 instructions
Jan 13 23:45:38.436289 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 13 23:45:38.436296 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 13 23:45:38.436304 kernel: CPU features: detected: LSE atomic instructions
Jan 13 23:45:38.436313 kernel: CPU features: detected: Privileged Access Never
Jan 13 23:45:38.436321 kernel: CPU features: detected: RAS Extension Support
Jan 13 23:45:38.436328 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 13 23:45:38.436335 kernel: alternatives: applying system-wide alternatives
Jan 13 23:45:38.436343 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 13 23:45:38.436351 kernel: Memory: 16324432K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 12480K init, 1038K bss, 430000K reserved, 16384K cma-reserved)
Jan 13 23:45:38.436359 kernel: devtmpfs: initialized
Jan 13 23:45:38.436366 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 13 23:45:38.436375 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 13 23:45:38.436383 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 13 23:45:38.436390 kernel: 0 pages in range for non-PLT usage
Jan 13 23:45:38.436398 kernel: 515168 pages in range for PLT usage
Jan 13 23:45:38.436405 kernel: pinctrl core: initialized pinctrl subsystem
Jan 13 23:45:38.436412 kernel: SMBIOS 3.0.0 present.
Jan 13 23:45:38.436420 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 13 23:45:38.436428 kernel: DMI: Memory slots populated: 1/1
Jan 13 23:45:38.436436 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 13 23:45:38.436444 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 13 23:45:38.436451 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 13 23:45:38.436459 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 13 23:45:38.436466 kernel: audit: initializing netlink subsys (disabled)
Jan 13 23:45:38.436474 kernel: audit: type=2000 audit(0.038:1): state=initialized audit_enabled=0 res=1
Jan 13 23:45:38.436493 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 13 23:45:38.436501 kernel: cpuidle: using governor menu
Jan 13 23:45:38.436508 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 13 23:45:38.436516 kernel: ASID allocator initialised with 32768 entries
Jan 13 23:45:38.436523 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 13 23:45:38.436531 kernel: Serial: AMBA PL011 UART driver
Jan 13 23:45:38.436539 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 13 23:45:38.436547 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 13 23:45:38.436556 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 13 23:45:38.436563 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 13 23:45:38.436571 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 13 23:45:38.436578 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 13 23:45:38.436586 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 13 23:45:38.436593 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 13 23:45:38.436600 kernel: ACPI: Added _OSI(Module Device)
Jan 13 23:45:38.436609 kernel: ACPI: Added _OSI(Processor Device)
Jan 13 23:45:38.436617 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 13 23:45:38.436624 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 13 23:45:38.436632 kernel: ACPI: Interpreter enabled
Jan 13 23:45:38.436639 kernel: ACPI: Using GIC for interrupt routing
Jan 13 23:45:38.436647 kernel: ACPI: MCFG table detected, 1 entries
Jan 13 23:45:38.436654 kernel: ACPI: CPU0 has been hot-added
Jan 13 23:45:38.436663 kernel: ACPI: CPU1 has been hot-added
Jan 13 23:45:38.436670 kernel: ACPI: CPU2 has been hot-added
Jan 13 23:45:38.436678 kernel: ACPI: CPU3 has been hot-added
Jan 13 23:45:38.436685 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 13 23:45:38.436693 kernel: printk: legacy console [ttyAMA0] enabled
Jan 13 23:45:38.436701 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 13 23:45:38.436869 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 13 23:45:38.436965 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 13 23:45:38.437048 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 13 23:45:38.437152 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 13 23:45:38.437246 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 13 23:45:38.437256 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 13 23:45:38.437264 kernel: PCI host bridge to bus 0000:00
Jan 13 23:45:38.437368 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 13 23:45:38.437446 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 13 23:45:38.437521 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 13 23:45:38.437594 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 13 23:45:38.437690 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 13 23:45:38.437785 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.437875 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 13 23:45:38.437959 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 23:45:38.438042 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 13 23:45:38.438163 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 13 23:45:38.438261 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.438349 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 13 23:45:38.438431 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 13 23:45:38.438513 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 13 23:45:38.438617 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.438700 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 13 23:45:38.438781 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 13 23:45:38.438866 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 13 23:45:38.438948 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 13 23:45:38.439037 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.439148 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 13 23:45:38.439232 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 13 23:45:38.439316 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 13 23:45:38.439409 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.439511 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 13 23:45:38.439624 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 13 23:45:38.439706 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 13 23:45:38.439787 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 13 23:45:38.439876 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.439961 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 13 23:45:38.440043 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 13 23:45:38.440136 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 13 23:45:38.440226 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 13 23:45:38.440315 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.440402 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 13 23:45:38.440496 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 13 23:45:38.440592 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.440676 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 13 23:45:38.440765 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 13 23:45:38.440856 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.440942 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 13 23:45:38.441023 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 13 23:45:38.441134 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.441221 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 13 23:45:38.441305 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 13 23:45:38.441401 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.441487 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 13 23:45:38.441568 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 13 23:45:38.441658 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.441746 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 13 23:45:38.441841 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 13 23:45:38.441933 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.442016 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 13 23:45:38.442133 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 13 23:45:38.442228 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.442311 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 13 23:45:38.442397 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 13 23:45:38.442486 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.442567 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 13 23:45:38.442647 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 13 23:45:38.442751 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.442839 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 13 23:45:38.442923 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 13 23:45:38.443024 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.443140 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 13 23:45:38.443224 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 13 23:45:38.443313 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.443398 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 13 23:45:38.443479 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 13 23:45:38.443560 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 13 23:45:38.443641 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 13 23:45:38.443727 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.443808 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 13 23:45:38.443891 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 13 23:45:38.443980 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 13 23:45:38.444081 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 13 23:45:38.444213 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.444299 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 13 23:45:38.444380 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 13 23:45:38.444467 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 13 23:45:38.444565 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 13 23:45:38.444657 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.444740 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 13 23:45:38.444821 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 13 23:45:38.444933 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 13 23:45:38.445025 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 13 23:45:38.445134 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.445221 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 13 23:45:38.445309 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 13 23:45:38.445392 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 13 23:45:38.445476 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 13 23:45:38.445568 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.445651 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 13 23:45:38.445734 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 13 23:45:38.445816 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 13 23:45:38.445897 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 13 23:45:38.445987 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.446082 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 13 23:45:38.446168 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 13 23:45:38.446249 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 13 23:45:38.446331 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 13 23:45:38.446420 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.446503 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 13 23:45:38.446600 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 13 23:45:38.446692 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 13 23:45:38.446777 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 13 23:45:38.446888 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.446982 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 13 23:45:38.447087 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 13 23:45:38.447192 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 13 23:45:38.447290 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 13 23:45:38.447381 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.447466 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 13 23:45:38.447548 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 13 23:45:38.447628 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 13 23:45:38.447711 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 13 23:45:38.447799 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.447887 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 13 23:45:38.447969 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 13 23:45:38.448049 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 13 23:45:38.448146 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 13 23:45:38.448245 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.448330 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 13 23:45:38.448412 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 13 23:45:38.448511 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 13 23:45:38.448595 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 13 23:45:38.448685 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.448767 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 13 23:45:38.448847 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 13 23:45:38.448928 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 13 23:45:38.449013 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 13 23:45:38.449141 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.449233 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 13 23:45:38.449316 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 13 23:45:38.449401 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 13 23:45:38.449481 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 13 23:45:38.449576 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.449661 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 13 23:45:38.449743 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 13 23:45:38.449825 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 13 23:45:38.449909 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 13 23:45:38.450002 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 13 23:45:38.450108 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 13 23:45:38.450196 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 13 23:45:38.450281 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 13 23:45:38.450365 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 13 23:45:38.450459 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 13 23:45:38.450543 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 13 23:45:38.450630 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 13 23:45:38.450713 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 13 23:45:38.450805 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 13 23:45:38.450889 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 13 23:45:38.450981 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 13 23:45:38.451076 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 13 23:45:38.451190 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 13 23:45:38.451292 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 13 23:45:38.451379 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 13 23:45:38.451475 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 13 23:45:38.451568 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 13 23:45:38.451658 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 13 23:45:38.451750 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 13 23:45:38.451845 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 13 23:45:38.451932 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 13 23:45:38.452020 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 13 23:45:38.452146 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 13 23:45:38.452236 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 13 23:45:38.452327 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 13 23:45:38.452437 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 13 23:45:38.452542 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 13 23:45:38.452631 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 13 23:45:38.452713 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 13 23:45:38.452795 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 13 23:45:38.452880 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 13 23:45:38.452962 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 13 23:45:38.453048 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 13 23:45:38.453161 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 13 23:45:38.453248 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 13 23:45:38.453335 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 13 23:45:38.453432 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 23:45:38.453522 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 13 23:45:38.453608 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 13 23:45:38.453709 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 23:45:38.453792 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 13 23:45:38.453875 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 13 23:45:38.453976 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 23:45:38.454077 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 13 23:45:38.454184 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 13 23:45:38.454273 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 23:45:38.454356 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 13 23:45:38.454439 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 13 23:45:38.454526 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 13 23:45:38.454614 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 13 23:45:38.454698 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 13 23:45:38.454788 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 13 23:45:38.454874 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 13 23:45:38.454983 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 13 23:45:38.455104 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 13 23:45:38.455200 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 13 23:45:38.455284 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 13 23:45:38.455371 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 13 23:45:38.455452 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 13 23:45:38.455537 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 13
23:45:38.455631 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 23:45:38.455743 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 13 23:45:38.455829 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 13 23:45:38.455918 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 23:45:38.456009 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 13 23:45:38.456111 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 13 23:45:38.456208 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 23:45:38.456290 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 13 23:45:38.456371 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 13 23:45:38.456455 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 23:45:38.456553 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 13 23:45:38.456658 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 13 23:45:38.456745 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 23:45:38.456839 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 13 23:45:38.456927 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 13 23:45:38.457016 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 13 23:45:38.457136 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 13 23:45:38.457230 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 13 23:45:38.457318 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 13 23:45:38.457401 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 13 23:45:38.457483 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 13 23:45:38.457569 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 13 23:45:38.457653 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 13 23:45:38.457735 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 13 23:45:38.457820 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 23:45:38.457903 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 13 23:45:38.457984 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 13 23:45:38.458083 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 23:45:38.458171 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 13 23:45:38.458253 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 13 23:45:38.458338 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 13 23:45:38.458420 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 13 23:45:38.458501 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 13 23:45:38.458587 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 23:45:38.458672 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 13 23:45:38.458753 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 13 23:45:38.458837 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 23:45:38.458919 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 13 23:45:38.459000 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 13 23:45:38.459099 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 13 23:45:38.459184 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 13 23:45:38.459265 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 13 23:45:38.459350 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 13 23:45:38.459431 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 13 23:45:38.459512 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 13 23:45:38.459606 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 23:45:38.459688 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 13 23:45:38.459769 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 13 23:45:38.459854 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 23:45:38.459936 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 13 23:45:38.460020 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 13 23:45:38.460116 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 23:45:38.460203 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 13 23:45:38.460288 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 13 23:45:38.460375 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 23:45:38.460458 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 13 23:45:38.460563 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 13 23:45:38.460660 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 23:45:38.460743 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 13 23:45:38.460824 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 13 23:45:38.460908 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 13 23:45:38.460990 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 13 23:45:38.461121 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 13 23:45:38.461209 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 13 23:45:38.461294 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 13 23:45:38.461375 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 13 23:45:38.461460 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 13 23:45:38.461541 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 13 23:45:38.461629 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 13 23:45:38.461711 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 13 23:45:38.461794 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 13 23:45:38.461875 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 13 23:45:38.461958 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 13 23:45:38.462040 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 13 23:45:38.462134 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 13 23:45:38.462235 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 13 23:45:38.462321 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 13 23:45:38.462403 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 13 23:45:38.462487 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 13 23:45:38.462577 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 13 23:45:38.462662 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 13 23:45:38.462754 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 13 23:45:38.462839 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 13 23:45:38.462921 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 13 23:45:38.463004 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 13 23:45:38.463106 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 13 23:45:38.463192 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 13 23:45:38.463282 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 13 23:45:38.463365 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 13 23:45:38.463449 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 13 23:45:38.463533 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 13 23:45:38.463621 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 13 23:45:38.463721 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 13 23:45:38.463812 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 13 23:45:38.463900 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 13 23:45:38.463984 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 13 23:45:38.464076 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 13 23:45:38.464177 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 13 23:45:38.464261 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 13 23:45:38.464343 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 13 23:45:38.464429 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 13 23:45:38.464531 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 13 23:45:38.464617 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 13 23:45:38.464698 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 13 23:45:38.464780 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 13 23:45:38.464861 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 13 23:45:38.464947 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 13 23:45:38.465028 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 13 23:45:38.465133 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 13 23:45:38.465221 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 13 23:45:38.465305 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 13 23:45:38.465391 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 13 23:45:38.465478 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 13 23:45:38.465561 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 13 23:45:38.465646 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 13 23:45:38.465729 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 13 23:45:38.465814 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 13 23:45:38.465906 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 13 23:45:38.465993 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 13 23:45:38.466096 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 13 23:45:38.466213 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 13 23:45:38.466304 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 13 23:45:38.466394 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 13 23:45:38.466479 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 13 23:45:38.466585 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 13 23:45:38.466672 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 13 23:45:38.466761 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 13 23:45:38.466845 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 13 23:45:38.466931 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 13 23:45:38.467020 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 13 23:45:38.467143 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 13 23:45:38.467231 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 13 23:45:38.467320 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 13 23:45:38.467401 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 13 23:45:38.467483 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 13 23:45:38.467569 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 13 23:45:38.467655 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 13 23:45:38.467741 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 13 23:45:38.467827 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 13 23:45:38.467915 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 13 23:45:38.467999 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 13 23:45:38.468110 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 13 23:45:38.468203 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 13 23:45:38.468306 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 13 23:45:38.468393 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 13 23:45:38.468476 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 13 23:45:38.468579 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 13 23:45:38.468663 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 13 23:45:38.468757 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 13 23:45:38.468839 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 13 23:45:38.468922 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 13 23:45:38.469008 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 13 23:45:38.469121 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 13 23:45:38.469209 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 13 23:45:38.469295 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 13 23:45:38.469382 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 13 23:45:38.469467 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 13 23:45:38.469555 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 13 
23:45:38.469639 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.469723 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 13 23:45:38.469804 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.469887 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.469973 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 13 23:45:38.470055 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.470170 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.470256 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 13 23:45:38.470339 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.470421 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.470508 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 13 23:45:38.470599 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.470683 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.470769 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 13 23:45:38.470858 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.470947 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.471037 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 13 23:45:38.471155 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.471242 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.471325 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 13 23:45:38.471408 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.471489 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.471573 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 13 23:45:38.471659 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.471743 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.471827 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 13 23:45:38.471910 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.471993 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.472094 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 13 23:45:38.472185 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.472269 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.472355 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 13 23:45:38.472440 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.472546 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.472635 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 13 23:45:38.472725 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.472819 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.472924 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 13 23:45:38.473010 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.473127 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.473221 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 13 23:45:38.473306 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.473392 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.473477 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 13 23:45:38.473560 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.473642 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.473726 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 13 23:45:38.473807 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.474012 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.474146 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 13 23:45:38.474232 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.474314 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.474397 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 13 23:45:38.474481 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 13 23:45:38.474568 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 13 23:45:38.474654 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 13 23:45:38.474737 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 13 23:45:38.474820 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 13 23:45:38.474905 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 13 23:45:38.474988 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 13 23:45:38.475101 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 13 23:45:38.475194 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 13 23:45:38.475290 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 13 23:45:38.475380 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 13 23:45:38.475466 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 13 23:45:38.475555 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 13 23:45:38.475637 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 13 23:45:38.475722 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.475829 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.475921 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476006 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476119 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476209 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476293 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476374 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476466 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476569 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476653 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476735 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476819 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.476906 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.476991 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 13 23:45:38.477104 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.477203 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.477289 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.477374 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.477456 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.477545 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.477629 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.477724 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.477808 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.477891 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.477987 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478109 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.478203 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478288 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.478371 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478459 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.478541 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478641 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.478736 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478827 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 13 23:45:38.478909 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 13 23:45:38.478999 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 13 23:45:38.479100 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 13 23:45:38.479193 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 13 23:45:38.479278 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 23:45:38.479361 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 13 23:45:38.479444 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 13 23:45:38.479544 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 13 23:45:38.479630 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 13 23:45:38.479719 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 13 23:45:38.479802 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 13 23:45:38.479890 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 13 23:45:38.479976 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 13 23:45:38.480070 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 13 23:45:38.480175 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 13 23:45:38.480261 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 13 23:45:38.480361 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 13 23:45:38.480447 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 13 23:45:38.480542 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 13 23:45:38.480626 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 13 23:45:38.480716 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 13 23:45:38.480805 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 13 23:45:38.480897 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 13 23:45:38.480981 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 13 23:45:38.481078 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 13 23:45:38.481171 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 13 23:45:38.481257 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 13 23:45:38.481346 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 13 23:45:38.481437 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 13 23:45:38.481520 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 13 23:45:38.481603 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 13 23:45:38.481693 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 13 23:45:38.481776 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 13 23:45:38.481862 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 13 23:45:38.481944 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 13 23:45:38.482028 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 13 23:45:38.482121 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 13 23:45:38.482208 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 13 23:45:38.482292 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 13 23:45:38.482377 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 13 23:45:38.482460 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 13 23:45:38.482547 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 13 23:45:38.482632 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 13 23:45:38.482713 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 13 23:45:38.482802 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 13 23:45:38.482887 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 13 23:45:38.482969 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 13 23:45:38.483050 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 13 23:45:38.483147 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 13 23:45:38.483235 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 13 23:45:38.483320 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 13 23:45:38.483404 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 13 23:45:38.483486 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 13 23:45:38.483566 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 13 23:45:38.483650 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 13 23:45:38.483733 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 13 23:45:38.483814 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 13 23:45:38.483897 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 13 23:45:38.483979 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 13 23:45:38.484076 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 13 23:45:38.484169 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 13 23:45:38.484284 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 13 23:45:38.484372 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 13 23:45:38.484455 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 13 23:45:38.484562 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 13 23:45:38.484650 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 13 23:45:38.484742 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 13 23:45:38.484825 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 13 23:45:38.484906 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 13 23:45:38.484987 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 13 23:45:38.485112 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 13 23:45:38.485203 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 13 23:45:38.485285 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 13 23:45:38.485373 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 13 23:45:38.485458 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 13 23:45:38.485540 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 13 23:45:38.485635 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 13 23:45:38.485719 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 13 23:45:38.485812 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 13 23:45:38.485896 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 13 23:45:38.485981 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 13 23:45:38.486072 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 13 23:45:38.486192 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 13 23:45:38.486279 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 13 23:45:38.486362 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 13 23:45:38.486444 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 13 23:45:38.486533 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 13 23:45:38.486616 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 13 23:45:38.486698 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 13 23:45:38.486789 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 13 23:45:38.486874 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 13 23:45:38.486963 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 13 23:45:38.487051 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 13 23:45:38.487165 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 13 23:45:38.487255 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 13 23:45:38.487342 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 13 23:45:38.487426 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 13 23:45:38.487528 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 13 23:45:38.487615 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 13 23:45:38.487700 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 13 23:45:38.487784 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 13 23:45:38.487866 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 13 23:45:38.487949 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 13 23:45:38.488031 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 13 23:45:38.488126 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 13 23:45:38.488211 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 13 23:45:38.488299 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 13 23:45:38.488381 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 13 23:45:38.488462 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 13 23:45:38.488572 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 13 23:45:38.488670 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 13 23:45:38.488758 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 13 23:45:38.488840 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 13 23:45:38.488923 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 13 23:45:38.489008 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 13 23:45:38.489107 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 13 23:45:38.489193 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 13 23:45:38.489293 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 13 23:45:38.489378 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 13 23:45:38.489461 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 13 23:45:38.489545 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 13 23:45:38.489626 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 13 23:45:38.489711 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 13 23:45:38.489793 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 13 23:45:38.489874 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 13 23:45:38.489954 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 13 23:45:38.490039 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 13 23:45:38.490143 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 13 23:45:38.490220 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 13 23:45:38.490308 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 13 23:45:38.490386 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 13 23:45:38.490471 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 13 23:45:38.490551 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 13 23:45:38.490635 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 13 23:45:38.490712 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 13 23:45:38.490795 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 13 23:45:38.490872 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 13 23:45:38.490966 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 13 23:45:38.491047 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 13 23:45:38.491153 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 13 23:45:38.491233 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 13 23:45:38.491325 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 13 23:45:38.491407 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 13 23:45:38.491490 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 13 23:45:38.491566 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 13 23:45:38.491649 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 13 23:45:38.491726 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 13 23:45:38.491811 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 13 23:45:38.491887 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 13 23:45:38.491971 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 13 23:45:38.492048 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 13 23:45:38.492172 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 13 23:45:38.492260 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 13 23:45:38.492345 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 13 23:45:38.492422 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 13 23:45:38.492529 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 13 23:45:38.492609 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 13 23:45:38.492694 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 13 23:45:38.492784 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 13 23:45:38.492877 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 13 23:45:38.492954 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 13 23:45:38.493040 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 13 23:45:38.493138 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 13 23:45:38.493229 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 13 23:45:38.493306 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 13 23:45:38.493389 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 13 23:45:38.493466 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 13 23:45:38.493544 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 13 23:45:38.493627 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 13 23:45:38.493704 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 13 23:45:38.493791 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 13 23:45:38.493878 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 13 23:45:38.493955 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 13 23:45:38.494045 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 13 23:45:38.494168 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 13 23:45:38.494250 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 13 23:45:38.494327 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 13 23:45:38.494409 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 13 23:45:38.494502 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 13 23:45:38.494582 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 13 23:45:38.494672 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 13 23:45:38.494757 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 13 23:45:38.494835 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 13 23:45:38.494918 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 13 23:45:38.494997 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 13 23:45:38.495094 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 13 23:45:38.495185 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 13 23:45:38.495262 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 13 23:45:38.495338 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 13 23:45:38.495421 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 23:45:38.495500 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 13 23:45:38.495576 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 13 23:45:38.495658 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 13 23:45:38.495743 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 13 23:45:38.495824 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 13 23:45:38.495909 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 13 23:45:38.495988 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 13 23:45:38.496086 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 13 23:45:38.496187 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 13 23:45:38.496266 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 13 23:45:38.496342 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 13 23:45:38.496429 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 13 23:45:38.496525 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 13 23:45:38.496604 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 13 23:45:38.496686 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 13 23:45:38.496773 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 13 23:45:38.496850 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 13 23:45:38.496943 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 13 23:45:38.497021 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 13 23:45:38.497126 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 13 23:45:38.497139 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 13 23:45:38.497147 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 13 23:45:38.497156 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 13 23:45:38.497166 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 13 23:45:38.497174 kernel: iommu: Default domain type: Translated Jan 13 23:45:38.497182 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 13 23:45:38.497190 kernel: efivars: Registered efivars operations Jan 13 23:45:38.497199 kernel: vgaarb: loaded Jan 13 23:45:38.497207 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 13 23:45:38.497215 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 23:45:38.497224 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 23:45:38.497232 kernel: pnp: PnP ACPI 
init Jan 13 23:45:38.497336 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 13 23:45:38.497349 kernel: pnp: PnP ACPI: found 1 devices Jan 13 23:45:38.497357 kernel: NET: Registered PF_INET protocol family Jan 13 23:45:38.497365 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 23:45:38.497374 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 13 23:45:38.497384 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 23:45:38.497392 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 23:45:38.497400 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 13 23:45:38.497408 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 13 23:45:38.497417 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 13 23:45:38.497425 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 13 23:45:38.497433 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 23:45:38.497525 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 13 23:45:38.497537 kernel: PCI: CLS 0 bytes, default 64 Jan 13 23:45:38.497545 kernel: kvm [1]: HYP mode not available Jan 13 23:45:38.497553 kernel: Initialise system trusted keyrings Jan 13 23:45:38.497561 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 13 23:45:38.497569 kernel: Key type asymmetric registered Jan 13 23:45:38.497577 kernel: Asymmetric key parser 'x509' registered Jan 13 23:45:38.497587 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 13 23:45:38.497596 kernel: io scheduler mq-deadline registered Jan 13 23:45:38.497604 kernel: io scheduler kyber registered Jan 13 23:45:38.497611 kernel: io scheduler bfq registered Jan 13 23:45:38.497620 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 13 
23:45:38.497703 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 13 23:45:38.497793 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 13 23:45:38.497878 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.497962 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 13 23:45:38.498044 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 13 23:45:38.498157 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.498244 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 13 23:45:38.498327 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 13 23:45:38.498420 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.498510 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 13 23:45:38.498593 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 13 23:45:38.498674 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.498762 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 13 23:45:38.498845 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 13 23:45:38.498926 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.499013 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 13 23:45:38.499117 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 13 23:45:38.499202 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.499288 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 13 23:45:38.499370 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 13 23:45:38.499451 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.499538 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 13 23:45:38.499619 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 13 23:45:38.499700 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.499710 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 13 23:45:38.499790 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 13 23:45:38.499871 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 13 23:45:38.499954 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.500038 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 13 23:45:38.500135 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 13 23:45:38.500218 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.500307 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 13 23:45:38.500390 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 13 23:45:38.500472 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.500582 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 13 23:45:38.500671 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 13 23:45:38.500764 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.500848 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 13 23:45:38.500930 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 13 23:45:38.501011 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.501134 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 13 23:45:38.501222 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 13 23:45:38.501303 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.501386 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 13 23:45:38.501467 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 13 23:45:38.501548 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.501634 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 13 23:45:38.501716 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 13 23:45:38.501808 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.501821 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 13 23:45:38.501903 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 13 23:45:38.501985 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 13 23:45:38.502087 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.502179 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 13 23:45:38.502262 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 13 23:45:38.502349 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.502437 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 13 23:45:38.502518 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 13 23:45:38.502599 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.502684 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 13 23:45:38.502772 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 13 23:45:38.502854 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.502938 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 13 23:45:38.503019 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 13 23:45:38.503133 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.503225 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 13 23:45:38.503307 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 13 23:45:38.503388 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.503471 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 13 23:45:38.503552 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 13 23:45:38.503633 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.503716 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 13 23:45:38.503812 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 13 23:45:38.503895 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.503907 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 13 23:45:38.503987 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 13 23:45:38.504091 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 13 23:45:38.504180 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.504268 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 13 23:45:38.504353 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 13 23:45:38.504435 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.504538 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 13 23:45:38.504630 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 13 23:45:38.504714 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.504811 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 13 23:45:38.504895 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 13 23:45:38.504977 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.505071 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 13 23:45:38.505159 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 13 23:45:38.505242 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.505327 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 13 23:45:38.505413 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 13 23:45:38.505495 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.505580 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 13 23:45:38.505663 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 13 23:45:38.505744 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.505829 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 13 23:45:38.505933 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 13 23:45:38.506020 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.506118 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 13 23:45:38.506205 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 13 23:45:38.506308 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 13 23:45:38.506318 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 13 23:45:38.506329 kernel: ACPI: button: Power Button [PWRB] Jan 13 23:45:38.506418 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 13 23:45:38.506506 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 13 23:45:38.506517 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 23:45:38.506525 kernel: thunder_xcv, ver 1.0 Jan 13 23:45:38.506533 kernel: thunder_bgx, ver 1.0 Jan 13 23:45:38.506541 kernel: nicpf, ver 1.0 Jan 13 23:45:38.506551 kernel: nicvf, ver 1.0 Jan 13 23:45:38.506645 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 13 23:45:38.506729 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-13T23:45:37 UTC (1768347937) Jan 13 23:45:38.506740 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 13 23:45:38.506749 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 13 23:45:38.506757 kernel: watchdog: NMI not fully supported Jan 13 23:45:38.506765 kernel: watchdog: Hard watchdog permanently disabled Jan 13 23:45:38.506776 kernel: NET: Registered PF_INET6 protocol family Jan 13 23:45:38.506784 kernel: Segment Routing with IPv6 Jan 13 23:45:38.506792 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 23:45:38.506801 kernel: NET: Registered PF_PACKET protocol family Jan 13 23:45:38.506808 kernel: Key type dns_resolver registered Jan 13 23:45:38.506816 kernel: registered taskstats version 1 Jan 13 23:45:38.506825 kernel: Loading compiled-in X.509 certificates Jan 13 23:45:38.506834 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: d16d100cda59d8093883df975a5384fda36b7d35' Jan 13 23:45:38.506842 kernel: Demotion targets for Node 0: null Jan 13 23:45:38.506850 kernel: Key type .fscrypt registered Jan 13 23:45:38.506858 kernel: Key type fscrypt-provisioning registered Jan 13 23:45:38.506866 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 23:45:38.506874 kernel: ima: Allocated hash algorithm: sha1 Jan 13 23:45:38.506883 kernel: ima: No architecture policies found Jan 13 23:45:38.506893 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 13 23:45:38.506901 kernel: clk: Disabling unused clocks Jan 13 23:45:38.506908 kernel: PM: genpd: Disabling unused power domains Jan 13 23:45:38.506916 kernel: Freeing unused kernel memory: 12480K Jan 13 23:45:38.506925 kernel: Run /init as init process Jan 13 23:45:38.506932 kernel: with arguments: Jan 13 23:45:38.506940 kernel: /init Jan 13 23:45:38.506948 kernel: with environment: Jan 13 23:45:38.506958 kernel: HOME=/ Jan 13 23:45:38.506965 kernel: TERM=linux Jan 13 23:45:38.506973 kernel: ACPI: bus type USB registered Jan 13 23:45:38.506981 kernel: usbcore: registered new interface driver usbfs Jan 13 23:45:38.506990 kernel: usbcore: registered new interface driver hub Jan 13 23:45:38.506998 kernel: usbcore: registered new device driver usb Jan 13 23:45:38.507104 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 13 23:45:38.507196 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 13 23:45:38.507282 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 13 23:45:38.507377 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 13 23:45:38.507465 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 13 23:45:38.507549 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 13 23:45:38.507674 kernel: hub 1-0:1.0: USB hub found Jan 13 23:45:38.507792 kernel: hub 1-0:1.0: 4 ports detected Jan 13 23:45:38.507900 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Jan 13 23:45:38.508010 kernel: hub 2-0:1.0: USB hub found Jan 13 23:45:38.508124 kernel: hub 2-0:1.0: 4 ports detected Jan 13 23:45:38.508226 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 13 23:45:38.508315 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 13 23:45:38.508326 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 13 23:45:38.508335 kernel: GPT:25804799 != 104857599 Jan 13 23:45:38.508343 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 13 23:45:38.508351 kernel: GPT:25804799 != 104857599 Jan 13 23:45:38.508359 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 13 23:45:38.508370 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 13 23:45:38.508378 kernel: SCSI subsystem initialized Jan 13 23:45:38.508387 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 13 23:45:38.508396 kernel: device-mapper: uevent: version 1.0.3 Jan 13 23:45:38.508405 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 13 23:45:38.508413 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 13 23:45:38.508422 kernel: raid6: neonx8 gen() 15695 MB/s Jan 13 23:45:38.508431 kernel: raid6: neonx4 gen() 15638 MB/s Jan 13 23:45:38.508440 kernel: raid6: neonx2 gen() 13331 MB/s Jan 13 23:45:38.508448 kernel: raid6: neonx1 gen() 10494 MB/s Jan 13 23:45:38.508456 kernel: raid6: int64x8 gen() 6798 MB/s Jan 13 23:45:38.508465 kernel: raid6: int64x4 gen() 7334 MB/s Jan 13 23:45:38.508473 kernel: raid6: int64x2 gen() 6102 MB/s Jan 13 23:45:38.508493 kernel: raid6: int64x1 gen() 5049 MB/s Jan 13 23:45:38.508505 kernel: raid6: using algorithm neonx8 gen() 15695 MB/s Jan 13 23:45:38.508513 kernel: raid6: .... 
xor() 12041 MB/s, rmw enabled Jan 13 23:45:38.508521 kernel: raid6: using neon recovery algorithm Jan 13 23:45:38.508530 kernel: xor: measuring software checksum speed Jan 13 23:45:38.508540 kernel: 8regs : 21596 MB/sec Jan 13 23:45:38.508548 kernel: 32regs : 21722 MB/sec Jan 13 23:45:38.508557 kernel: arm64_neon : 28157 MB/sec Jan 13 23:45:38.508573 kernel: xor: using function: arm64_neon (28157 MB/sec) Jan 13 23:45:38.508691 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 13 23:45:38.508705 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 23:45:38.508715 kernel: BTRFS: device fsid 68b1ce8e-a637-4e91-acf8-5a2e05e289e5 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (277) Jan 13 23:45:38.508724 kernel: BTRFS info (device dm-0): first mount of filesystem 68b1ce8e-a637-4e91-acf8-5a2e05e289e5 Jan 13 23:45:38.508734 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 13 23:45:38.508745 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 23:45:38.508753 kernel: BTRFS info (device dm-0): enabling free space tree Jan 13 23:45:38.508762 kernel: loop: module loaded Jan 13 23:45:38.508770 kernel: loop0: detected capacity change from 0 to 91832 Jan 13 23:45:38.508778 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 13 23:45:38.508887 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 13 23:45:38.508903 systemd[1]: Successfully made /usr/ read-only. Jan 13 23:45:38.508915 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 13 23:45:38.508924 systemd[1]: Detected virtualization kvm. 
Jan 13 23:45:38.508932 systemd[1]: Detected architecture arm64. Jan 13 23:45:38.508941 systemd[1]: Running in initrd. Jan 13 23:45:38.508949 systemd[1]: No hostname configured, using default hostname. Jan 13 23:45:38.508960 systemd[1]: Hostname set to . Jan 13 23:45:38.508969 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 13 23:45:38.508978 systemd[1]: Queued start job for default target initrd.target. Jan 13 23:45:38.508987 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 13 23:45:38.508996 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 23:45:38.509004 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 23:45:38.509014 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 23:45:38.509025 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 23:45:38.509034 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 23:45:38.509043 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 23:45:38.509052 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 23:45:38.509087 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 23:45:38.509100 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 13 23:45:38.509109 systemd[1]: Reached target paths.target - Path Units. Jan 13 23:45:38.509118 systemd[1]: Reached target slices.target - Slice Units. Jan 13 23:45:38.509127 systemd[1]: Reached target swap.target - Swaps. Jan 13 23:45:38.509136 systemd[1]: Reached target timers.target - Timer Units. Jan 13 23:45:38.509145 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Jan 13 23:45:38.509153 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 23:45:38.509164 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 13 23:45:38.509173 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 23:45:38.509182 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 13 23:45:38.509190 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 23:45:38.509202 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 23:45:38.509212 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 23:45:38.509223 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 23:45:38.509236 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 13 23:45:38.509246 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 23:45:38.509256 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 23:45:38.509265 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 23:45:38.509275 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 13 23:45:38.509283 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 23:45:38.509294 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 23:45:38.509303 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 23:45:38.509312 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 23:45:38.509321 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Jan 13 23:45:38.509332 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 23:45:38.509341 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 23:45:38.509350 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 23:45:38.509384 systemd-journald[417]: Collecting audit messages is enabled. Jan 13 23:45:38.509408 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 23:45:38.509417 kernel: Bridge firewalling registered Jan 13 23:45:38.509425 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 23:45:38.509435 kernel: audit: type=1130 audit(1768347938.444:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.509443 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 23:45:38.509452 kernel: audit: type=1130 audit(1768347938.449:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.509463 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 23:45:38.509472 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 23:45:38.509481 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 23:45:38.509491 kernel: audit: type=1130 audit(1768347938.459:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:38.509499 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 23:45:38.509509 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 23:45:38.509520 kernel: audit: type=1130 audit(1768347938.475:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.509529 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 23:45:38.509538 kernel: audit: type=1130 audit(1768347938.482:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.509546 kernel: audit: type=1334 audit(1768347938.484:7): prog-id=6 op=LOAD Jan 13 23:45:38.509555 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 23:45:38.509564 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 23:45:38.509575 kernel: audit: type=1130 audit(1768347938.492:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.509584 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 23:45:38.509593 systemd-journald[417]: Journal started Jan 13 23:45:38.509612 systemd-journald[417]: Runtime Journal (/run/log/journal/dde8ce1b05f64a6c8e71eebbabf5dbd7) is 8M, max 319.5M, 311.5M free. Jan 13 23:45:38.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:38.449000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.475000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.482000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.484000 audit: BPF prog-id=6 op=LOAD Jan 13 23:45:38.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.440084 systemd-modules-load[418]: Inserted module 'br_netfilter' Jan 13 23:45:38.511298 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 23:45:38.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.515091 kernel: audit: type=1130 audit(1768347938.512:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.515642 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 13 23:45:38.522918 dracut-cmdline[447]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=3d3f73de8d2693594dfefd279d2c8d77c282a05a4cbc54177503d31784261f6b Jan 13 23:45:38.523894 systemd-tmpfiles[460]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 13 23:45:38.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.529219 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 23:45:38.533467 kernel: audit: type=1130 audit(1768347938.530:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.553561 systemd-resolved[443]: Positive Trust Anchors: Jan 13 23:45:38.553578 systemd-resolved[443]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 23:45:38.553582 systemd-resolved[443]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 13 23:45:38.553612 systemd-resolved[443]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 23:45:38.578912 systemd-resolved[443]: Defaulting to hostname 'linux'. Jan 13 23:45:38.580000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.579799 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 23:45:38.580839 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 23:45:38.613099 kernel: Loading iSCSI transport class v2.0-870. Jan 13 23:45:38.625107 kernel: iscsi: registered transport (tcp) Jan 13 23:45:38.639102 kernel: iscsi: registered transport (qla4xxx) Jan 13 23:45:38.639134 kernel: QLogic iSCSI HBA Driver Jan 13 23:45:38.661372 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 13 23:45:38.682367 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 23:45:38.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.684391 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jan 13 23:45:38.728699 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 23:45:38.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.731796 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 23:45:38.733359 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 23:45:38.766600 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 23:45:38.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.768000 audit: BPF prog-id=7 op=LOAD Jan 13 23:45:38.768000 audit: BPF prog-id=8 op=LOAD Jan 13 23:45:38.769086 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 23:45:38.808282 systemd-udevd[693]: Using default interface naming scheme 'v257'. Jan 13 23:45:38.816215 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 23:45:38.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.818830 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 23:45:38.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.832027 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 13 23:45:38.834000 audit: BPF prog-id=9 op=LOAD Jan 13 23:45:38.834823 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 23:45:38.842785 dracut-pre-trigger[782]: rd.md=0: removing MD RAID activation Jan 13 23:45:38.865250 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 23:45:38.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.867491 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 23:45:38.877302 systemd-networkd[801]: lo: Link UP Jan 13 23:45:38.877310 systemd-networkd[801]: lo: Gained carrier Jan 13 23:45:38.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.877754 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 23:45:38.879388 systemd[1]: Reached target network.target - Network. Jan 13 23:45:38.960118 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 23:45:38.961000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:38.962799 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 23:45:39.040459 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 13 23:45:39.054871 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. 
Jan 13 23:45:39.057078 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 13 23:45:39.060083 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 13 23:45:39.063078 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 13 23:45:39.064449 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 13 23:45:39.066650 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 13 23:45:39.076576 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 13 23:45:39.084268 disk-uuid[877]: Primary Header is updated. Jan 13 23:45:39.084268 disk-uuid[877]: Secondary Entries is updated. Jan 13 23:45:39.084268 disk-uuid[877]: Secondary Header is updated. Jan 13 23:45:39.103873 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 23:45:39.105544 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 23:45:39.108000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:39.108767 systemd-networkd[801]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 13 23:45:39.108782 systemd-networkd[801]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 13 23:45:39.118278 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 13 23:45:39.118488 kernel: usbcore: registered new interface driver usbhid Jan 13 23:45:39.108874 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 23:45:39.109705 systemd-networkd[801]: eth0: Link UP Jan 13 23:45:39.110151 systemd-networkd[801]: eth0: Gained carrier Jan 13 23:45:39.110163 systemd-networkd[801]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 13 23:45:39.113746 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 23:45:39.124761 kernel: usbhid: USB HID core driver Jan 13 23:45:39.148170 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 23:45:39.149000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:39.174188 systemd-networkd[801]: eth0: DHCPv4 address 10.0.21.248/25, gateway 10.0.21.129 acquired from 10.0.21.129 Jan 13 23:45:39.184534 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 23:45:39.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:39.185957 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 23:45:39.187171 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 23:45:39.188950 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 23:45:39.191554 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... 
Jan 13 23:45:39.217092 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 23:45:39.217000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.127987 disk-uuid[878]: Warning: The kernel is still using the old partition table. Jan 13 23:45:40.127987 disk-uuid[878]: The new table will be used at the next reboot or after you Jan 13 23:45:40.127987 disk-uuid[878]: run partprobe(8) or kpartx(8) Jan 13 23:45:40.127987 disk-uuid[878]: The operation has completed successfully. Jan 13 23:45:40.133303 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 23:45:40.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.134000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.133407 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 23:45:40.135350 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 23:45:40.167104 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (911) Jan 13 23:45:40.170304 kernel: BTRFS info (device vda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 13 23:45:40.170342 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 13 23:45:40.178094 kernel: BTRFS info (device vda6): turning on async discard Jan 13 23:45:40.178114 kernel: BTRFS info (device vda6): enabling free space tree Jan 13 23:45:40.183100 kernel: BTRFS info (device vda6): last unmount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 13 23:45:40.184336 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 23:45:40.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.188167 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 23:45:40.338162 ignition[930]: Ignition 2.24.0 Jan 13 23:45:40.338178 ignition[930]: Stage: fetch-offline Jan 13 23:45:40.338217 ignition[930]: no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:40.338227 ignition[930]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:40.338387 ignition[930]: parsed url from cmdline: "" Jan 13 23:45:40.338390 ignition[930]: no config URL provided Jan 13 23:45:40.338395 ignition[930]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 23:45:40.338402 ignition[930]: no config at "/usr/lib/ignition/user.ign" Jan 13 23:45:40.344000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.342622 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Jan 13 23:45:40.338408 ignition[930]: failed to fetch config: resource requires networking Jan 13 23:45:40.345235 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 13 23:45:40.338639 ignition[930]: Ignition finished successfully Jan 13 23:45:40.379292 ignition[940]: Ignition 2.24.0 Jan 13 23:45:40.379312 ignition[940]: Stage: fetch Jan 13 23:45:40.379561 ignition[940]: no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:40.379570 ignition[940]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:40.379649 ignition[940]: parsed url from cmdline: "" Jan 13 23:45:40.379652 ignition[940]: no config URL provided Jan 13 23:45:40.379659 ignition[940]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 23:45:40.379665 ignition[940]: no config at "/usr/lib/ignition/user.ign" Jan 13 23:45:40.380019 ignition[940]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 13 23:45:40.380033 ignition[940]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Jan 13 23:45:40.380365 ignition[940]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 13 23:45:40.772105 ignition[940]: GET result: OK Jan 13 23:45:40.772343 ignition[940]: parsing config with SHA512: 43c75ffc094c68ec23a8d21f5bb5f298a997d902ec312d49c0ed59151018039be91930b95320a6bb6ae75a1ea8065db2aab737e198223ed87eddc94ea0b26db1 Jan 13 23:45:40.777256 unknown[940]: fetched base config from "system" Jan 13 23:45:40.777267 unknown[940]: fetched base config from "system" Jan 13 23:45:40.777590 ignition[940]: fetch: fetch complete Jan 13 23:45:40.777272 unknown[940]: fetched user config from "openstack" Jan 13 23:45:40.777594 ignition[940]: fetch: fetch passed Jan 13 23:45:40.777636 ignition[940]: Ignition finished successfully Jan 13 23:45:40.785685 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 13 23:45:40.785708 kernel: audit: type=1130 audit(1768347940.782:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.781336 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 13 23:45:40.783482 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 23:45:40.810970 ignition[948]: Ignition 2.24.0 Jan 13 23:45:40.810992 ignition[948]: Stage: kargs Jan 13 23:45:40.811157 ignition[948]: no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:40.811166 ignition[948]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:40.811888 ignition[948]: kargs: kargs passed Jan 13 23:45:40.811929 ignition[948]: Ignition finished successfully Jan 13 23:45:40.815959 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 13 23:45:40.817988 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 23:45:40.821726 kernel: audit: type=1130 audit(1768347940.816:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.816000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.839583 ignition[955]: Ignition 2.24.0 Jan 13 23:45:40.839604 ignition[955]: Stage: disks Jan 13 23:45:40.839745 ignition[955]: no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:40.839753 ignition[955]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:40.840514 ignition[955]: disks: disks passed Jan 13 23:45:40.842776 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 23:45:40.843000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.840557 ignition[955]: Ignition finished successfully Jan 13 23:45:40.848415 kernel: audit: type=1130 audit(1768347940.843:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.844112 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 23:45:40.847843 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 23:45:40.849352 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 23:45:40.850810 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 23:45:40.852503 systemd[1]: Reached target basic.target - Basic System. 
Jan 13 23:45:40.854729 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 23:45:40.869196 systemd-networkd[801]: eth0: Gained IPv6LL Jan 13 23:45:40.908705 systemd-fsck[963]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 13 23:45:40.913861 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 23:45:40.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:40.918653 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 23:45:40.920831 kernel: audit: type=1130 audit(1768347940.916:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.037095 kernel: EXT4-fs (vda9): mounted filesystem db887ae3-d64c-46de-9f1e-de51a801ae44 r/w with ordered data mode. Quota mode: none. Jan 13 23:45:41.037328 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 23:45:41.038461 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 23:45:41.043445 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 23:45:41.045363 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 23:45:41.046252 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 23:45:41.046866 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 13 23:45:41.050018 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 23:45:41.050049 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 13 23:45:41.062105 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 23:45:41.064975 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 13 23:45:41.073124 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (971) Jan 13 23:45:41.076372 kernel: BTRFS info (device vda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 13 23:45:41.076440 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 13 23:45:41.081425 kernel: BTRFS info (device vda6): turning on async discard Jan 13 23:45:41.081460 kernel: BTRFS info (device vda6): enabling free space tree Jan 13 23:45:41.082465 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 23:45:41.128110 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:41.260360 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 23:45:41.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.262642 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 23:45:41.265867 kernel: audit: type=1130 audit(1768347941.261:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.265815 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 23:45:41.283655 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 23:45:41.285617 kernel: BTRFS info (device vda6): last unmount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 13 23:45:41.305914 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 13 23:45:41.306000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.310095 kernel: audit: type=1130 audit(1768347941.306:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.314460 ignition[1076]: INFO : Ignition 2.24.0 Jan 13 23:45:41.314460 ignition[1076]: INFO : Stage: mount Jan 13 23:45:41.315812 ignition[1076]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:41.315812 ignition[1076]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:41.315812 ignition[1076]: INFO : mount: mount passed Jan 13 23:45:41.315812 ignition[1076]: INFO : Ignition finished successfully Jan 13 23:45:41.321790 kernel: audit: type=1130 audit(1768347941.318:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:41.317003 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 13 23:45:42.179098 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:44.184156 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:48.188172 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:48.192752 coreos-metadata[973]: Jan 13 23:45:48.192 WARN failed to locate config-drive, using the metadata service API instead Jan 13 23:45:48.211686 coreos-metadata[973]: Jan 13 23:45:48.211 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 13 23:45:48.350149 coreos-metadata[973]: Jan 13 23:45:48.350 INFO Fetch successful Jan 13 23:45:48.350989 coreos-metadata[973]: Jan 13 23:45:48.350 INFO wrote hostname ci-4547-0-0-n-660efdb355 to /sysroot/etc/hostname Jan 13 23:45:48.353024 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 13 23:45:48.355098 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 13 23:45:48.356000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:48.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:48.357173 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 13 23:45:48.363504 kernel: audit: type=1130 audit(1768347948.356:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:48.363529 kernel: audit: type=1131 audit(1768347948.356:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 13 23:45:48.378734 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 23:45:48.416087 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1092) Jan 13 23:45:48.419101 kernel: BTRFS info (device vda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 13 23:45:48.419133 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 13 23:45:48.423492 kernel: BTRFS info (device vda6): turning on async discard Jan 13 23:45:48.423565 kernel: BTRFS info (device vda6): enabling free space tree Jan 13 23:45:48.424948 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 23:45:48.451692 ignition[1110]: INFO : Ignition 2.24.0 Jan 13 23:45:48.451692 ignition[1110]: INFO : Stage: files Jan 13 23:45:48.453271 ignition[1110]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:48.453271 ignition[1110]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:48.453271 ignition[1110]: DEBUG : files: compiled without relabeling support, skipping Jan 13 23:45:48.456331 ignition[1110]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 23:45:48.456331 ignition[1110]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 23:45:48.458649 ignition[1110]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 23:45:48.459787 ignition[1110]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 23:45:48.459787 ignition[1110]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 23:45:48.459214 unknown[1110]: wrote ssh authorized keys file for user: core Jan 13 23:45:48.463948 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 13 23:45:48.463948 ignition[1110]: INFO : files: 
createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Jan 13 23:45:48.637513 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 13 23:45:48.777307 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 13 23:45:48.777307 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 13 23:45:48.780733 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Jan 13 23:45:49.321530 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 13 23:45:50.577583 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 13 23:45:50.577583 ignition[1110]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 13 23:45:50.580940 ignition[1110]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 23:45:50.583367 ignition[1110]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 23:45:50.583367 ignition[1110]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 13 23:45:50.583367 ignition[1110]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 13 23:45:50.589310 ignition[1110]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 13 23:45:50.589310 ignition[1110]: INFO : files: createResultFile: createFiles: op(e): [started] writing file 
"/sysroot/etc/.ignition-result.json" Jan 13 23:45:50.589310 ignition[1110]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 23:45:50.589310 ignition[1110]: INFO : files: files passed Jan 13 23:45:50.589310 ignition[1110]: INFO : Ignition finished successfully Jan 13 23:45:50.599418 kernel: audit: type=1130 audit(1768347950.590:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.588566 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 13 23:45:50.591095 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 13 23:45:50.592621 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 23:45:50.608643 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 23:45:50.608738 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 23:45:50.615419 kernel: audit: type=1130 audit(1768347950.610:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.615451 kernel: audit: type=1131 audit(1768347950.610:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.610000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:50.610000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.620117 initrd-setup-root-after-ignition[1143]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 23:45:50.620117 initrd-setup-root-after-ignition[1143]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 23:45:50.622830 initrd-setup-root-after-ignition[1147]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 23:45:50.623437 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 23:45:50.625000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.625562 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 13 23:45:50.630363 kernel: audit: type=1130 audit(1768347950.625:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.630321 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 23:45:50.675821 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 23:45:50.675947 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 23:45:50.683380 kernel: audit: type=1130 audit(1768347950.677:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:50.683408 kernel: audit: type=1131 audit(1768347950.677:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.677000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.678022 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 23:45:50.684192 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 23:45:50.685949 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 23:45:50.686868 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 23:45:50.727237 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 23:45:50.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.729516 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 13 23:45:50.733030 kernel: audit: type=1130 audit(1768347950.728:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.758545 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 13 23:45:50.758755 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Jan 13 23:45:50.760774 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 23:45:50.762516 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 23:45:50.763950 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 23:45:50.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.764088 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 23:45:50.769174 kernel: audit: type=1131 audit(1768347950.765:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.768558 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 23:45:50.769992 systemd[1]: Stopped target basic.target - Basic System. Jan 13 23:45:50.771460 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 23:45:50.773109 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 23:45:50.774851 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 23:45:50.776582 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 13 23:45:50.778158 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 13 23:45:50.779835 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 23:45:50.781561 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 13 23:45:50.783183 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 13 23:45:50.784694 systemd[1]: Stopped target swap.target - Swaps. Jan 13 23:45:50.785949 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
Jan 13 23:45:50.787000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.786090 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 13 23:45:50.788075 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 13 23:45:50.789868 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 23:45:50.791541 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 13 23:45:50.791623 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 23:45:50.794000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.793344 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 13 23:45:50.793458 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 13 23:45:50.797000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.795977 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 13 23:45:50.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.796110 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 23:45:50.797867 systemd[1]: ignition-files.service: Deactivated successfully. Jan 13 23:45:50.797965 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Jan 13 23:45:50.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.800232 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 13 23:45:50.801849 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 13 23:45:50.801974 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 23:45:50.807000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.804420 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 13 23:45:50.805795 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 13 23:45:50.809000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.805922 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 23:45:50.807624 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 13 23:45:50.807727 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 23:45:50.809431 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 13 23:45:50.809524 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 23:45:50.820313 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Jan 13 23:45:50.821246 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 13 23:45:50.822000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.822000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.823316 ignition[1167]: INFO : Ignition 2.24.0 Jan 13 23:45:50.823316 ignition[1167]: INFO : Stage: umount Jan 13 23:45:50.823316 ignition[1167]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 23:45:50.823316 ignition[1167]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 23:45:50.826000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.828123 ignition[1167]: INFO : umount: umount passed Jan 13 23:45:50.828123 ignition[1167]: INFO : Ignition finished successfully Jan 13 23:45:50.829000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.830000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.825335 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 13 23:45:50.825438 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 13 23:45:50.832000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 13 23:45:50.826949 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 13 23:45:50.835000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.826995 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 13 23:45:50.829124 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 13 23:45:50.829174 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 13 23:45:50.830872 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 13 23:45:50.830920 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 13 23:45:50.832345 systemd[1]: Stopped target network.target - Network. Jan 13 23:45:50.833739 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 13 23:45:50.833790 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 23:45:50.835364 systemd[1]: Stopped target paths.target - Path Units. Jan 13 23:45:50.836719 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 13 23:45:50.840095 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 23:45:50.841496 systemd[1]: Stopped target slices.target - Slice Units. Jan 13 23:45:50.850000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.842854 systemd[1]: Stopped target sockets.target - Socket Units. Jan 13 23:45:50.852000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:50.844381 systemd[1]: iscsid.socket: Deactivated successfully. Jan 13 23:45:50.844418 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 23:45:50.845889 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 13 23:45:50.845920 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 23:45:50.847294 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 13 23:45:50.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.860000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.847315 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 13 23:45:50.848936 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 13 23:45:50.848986 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 13 23:45:50.850940 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 13 23:45:50.850981 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 13 23:45:50.852503 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 13 23:45:50.853957 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 13 23:45:50.856889 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 13 23:45:50.857366 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 13 23:45:50.857456 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 13 23:45:50.859610 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 13 23:45:50.859698 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
Jan 13 23:45:50.868281 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 13 23:45:50.872000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.870918 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 13 23:45:50.874556 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 13 23:45:50.874651 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 13 23:45:50.876000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.878000 audit: BPF prog-id=6 op=UNLOAD Jan 13 23:45:50.878577 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 13 23:45:50.880459 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 13 23:45:50.880525 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 13 23:45:50.882971 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 13 23:45:50.883791 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 13 23:45:50.883850 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 23:45:50.885000 audit: BPF prog-id=9 op=UNLOAD Jan 13 23:45:50.885000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.887000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:50.885694 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 13 23:45:50.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.885738 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 13 23:45:50.887204 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 13 23:45:50.887245 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 13 23:45:50.889153 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 23:45:50.904568 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 13 23:45:50.906103 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 23:45:50.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.907453 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 13 23:45:50.907492 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 13 23:45:50.908989 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 13 23:45:50.911000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.909018 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 23:45:50.910556 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 13 23:45:50.914000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:50.910604 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 13 23:45:50.912856 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 13 23:45:50.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.912908 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 13 23:45:50.915149 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 23:45:50.915198 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 23:45:50.929896 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 13 23:45:50.930907 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 13 23:45:50.932000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.930972 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 23:45:50.934000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.932994 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 13 23:45:50.936000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.933042 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Jan 13 23:45:50.938000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.934880 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 23:45:50.934928 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 23:45:50.941000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:50.937541 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 13 23:45:50.937656 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 13 23:45:50.939842 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 13 23:45:50.939946 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 13 23:45:50.941973 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 13 23:45:50.943947 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 13 23:45:50.974719 systemd[1]: Switching root. Jan 13 23:45:51.022821 systemd-journald[417]: Journal stopped Jan 13 23:45:51.983535 systemd-journald[417]: Received SIGTERM from PID 1 (systemd). 
Jan 13 23:45:51.983619 kernel: SELinux: policy capability network_peer_controls=1 Jan 13 23:45:51.983641 kernel: SELinux: policy capability open_perms=1 Jan 13 23:45:51.983655 kernel: SELinux: policy capability extended_socket_class=1 Jan 13 23:45:51.983666 kernel: SELinux: policy capability always_check_network=0 Jan 13 23:45:51.983684 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 13 23:45:51.983695 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 13 23:45:51.983706 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 13 23:45:51.983716 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 13 23:45:51.983726 kernel: SELinux: policy capability userspace_initial_context=0 Jan 13 23:45:51.983739 systemd[1]: Successfully loaded SELinux policy in 66.043ms. Jan 13 23:45:51.983762 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.854ms. Jan 13 23:45:51.983777 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 13 23:45:51.983789 systemd[1]: Detected virtualization kvm. Jan 13 23:45:51.983800 systemd[1]: Detected architecture arm64. Jan 13 23:45:51.983812 systemd[1]: Detected first boot. Jan 13 23:45:51.983823 systemd[1]: Hostname set to . Jan 13 23:45:51.983834 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 13 23:45:51.983846 zram_generator::config[1215]: No configuration found. Jan 13 23:45:51.983861 kernel: NET: Registered PF_VSOCK protocol family Jan 13 23:45:51.983872 systemd[1]: Populated /etc with preset unit settings. Jan 13 23:45:51.983883 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 13 23:45:51.983894 systemd[1]: Stopped initrd-switch-root.service - Switch Root. 
Jan 13 23:45:51.983905 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 13 23:45:51.983916 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 13 23:45:51.984007 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 13 23:45:51.984022 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 13 23:45:51.984033 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 13 23:45:51.984044 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 13 23:45:51.984090 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 13 23:45:51.984105 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 13 23:45:51.984122 systemd[1]: Created slice user.slice - User and Session Slice. Jan 13 23:45:51.984133 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 23:45:51.984145 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 23:45:51.984156 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 13 23:45:51.984168 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 13 23:45:51.984179 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 13 23:45:51.984190 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 23:45:51.984200 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 13 23:45:51.984213 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 23:45:51.984224 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 13 23:45:51.984235 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 13 23:45:51.984246 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 13 23:45:51.984259 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 13 23:45:51.984270 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 13 23:45:51.984281 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 23:45:51.984292 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 23:45:51.984303 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 13 23:45:51.984314 systemd[1]: Reached target slices.target - Slice Units. Jan 13 23:45:51.984325 systemd[1]: Reached target swap.target - Swaps. Jan 13 23:45:51.984336 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 13 23:45:51.984361 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 13 23:45:51.984376 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 13 23:45:51.984388 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 13 23:45:51.984400 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 13 23:45:51.984411 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 23:45:51.984423 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 13 23:45:51.984434 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 13 23:45:51.984458 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 23:45:51.984470 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 23:45:51.984482 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. 
Jan 13 23:45:51.984495 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 13 23:45:51.984506 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 13 23:45:51.984516 systemd[1]: Mounting media.mount - External Media Directory... Jan 13 23:45:51.984527 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 13 23:45:51.984544 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 13 23:45:51.984557 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 13 23:45:51.984569 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 13 23:45:51.984583 systemd[1]: Reached target machines.target - Containers. Jan 13 23:45:51.984594 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 13 23:45:51.984605 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 23:45:51.984617 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 23:45:51.984629 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 13 23:45:51.984640 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 23:45:51.984651 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 23:45:51.984664 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 23:45:51.984675 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 13 23:45:51.984686 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 23:45:51.984697 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). 
Jan 13 23:45:51.984707 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 13 23:45:51.984719 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 13 23:45:51.984730 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 13 23:45:51.984743 systemd[1]: Stopped systemd-fsck-usr.service. Jan 13 23:45:51.984754 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 13 23:45:51.984768 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 23:45:51.984780 kernel: fuse: init (API version 7.41) Jan 13 23:45:51.984791 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 23:45:51.984801 kernel: ACPI: bus type drm_connector registered Jan 13 23:45:51.984812 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 13 23:45:51.984825 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 13 23:45:51.984836 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 13 23:45:51.984847 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 23:45:51.984858 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 13 23:45:51.984869 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 13 23:45:51.984880 systemd[1]: Mounted media.mount - External Media Directory. Jan 13 23:45:51.984891 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 13 23:45:51.984903 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 13 23:45:51.984914 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Jan 13 23:45:51.984929 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 23:45:51.984941 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 13 23:45:51.984957 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 13 23:45:51.984994 systemd-journald[1282]: Collecting audit messages is enabled. Jan 13 23:45:51.985024 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 23:45:51.985035 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 23:45:51.985046 systemd-journald[1282]: Journal started Jan 13 23:45:51.985091 systemd-journald[1282]: Runtime Journal (/run/log/journal/dde8ce1b05f64a6c8e71eebbabf5dbd7) is 8M, max 319.5M, 311.5M free. Jan 13 23:45:51.832000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 13 23:45:51.922000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.924000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.927000 audit: BPF prog-id=14 op=UNLOAD Jan 13 23:45:51.927000 audit: BPF prog-id=13 op=UNLOAD Jan 13 23:45:51.928000 audit: BPF prog-id=15 op=LOAD Jan 13 23:45:51.928000 audit: BPF prog-id=16 op=LOAD Jan 13 23:45:51.928000 audit: BPF prog-id=17 op=LOAD Jan 13 23:45:51.978000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:51.980000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 13 23:45:51.980000 audit[1282]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffc5f6f040 a2=4000 a3=0 items=0 ppid=1 pid=1282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:45:51.980000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 13 23:45:51.982000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.985000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.985000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.743184 systemd[1]: Queued start job for default target multi-user.target. Jan 13 23:45:51.764484 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 13 23:45:51.764977 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 13 23:45:51.987122 systemd[1]: Started systemd-journald.service - Journal Service. 
Jan 13 23:45:51.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.987953 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 23:45:51.988177 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 23:45:51.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.989000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.989376 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 23:45:51.989544 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 23:45:51.990000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.990000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.990949 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 13 23:45:51.991120 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 13 23:45:51.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:51.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.992534 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 23:45:51.992688 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 23:45:51.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.993000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.994196 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 23:45:51.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:51.997196 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 23:45:51.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.000143 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 13 23:45:52.001000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:52.001877 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 13 23:45:52.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.003461 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 13 23:45:52.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.015712 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 13 23:45:52.017640 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 13 23:45:52.019798 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 13 23:45:52.021792 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 13 23:45:52.022868 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 13 23:45:52.022898 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 23:45:52.024769 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 13 23:45:52.026045 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 23:45:52.026161 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 13 23:45:52.037227 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Jan 13 23:45:52.039146 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 13 23:45:52.040172 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 23:45:52.041245 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 13 23:45:52.042229 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 23:45:52.043591 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 23:45:52.046300 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 13 23:45:52.053000 systemd-journald[1282]: Time spent on flushing to /var/log/journal/dde8ce1b05f64a6c8e71eebbabf5dbd7 is 27.250ms for 1810 entries. Jan 13 23:45:52.053000 systemd-journald[1282]: System Journal (/var/log/journal/dde8ce1b05f64a6c8e71eebbabf5dbd7) is 8M, max 588.1M, 580.1M free. Jan 13 23:45:52.090157 systemd-journald[1282]: Received client request to flush runtime journal. Jan 13 23:45:52.090208 kernel: loop1: detected capacity change from 0 to 45344 Jan 13 23:45:52.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.065000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:45:52.050396 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 13 23:45:52.054077 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 23:45:52.056000 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 13 23:45:52.057517 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 13 23:45:52.064592 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 13 23:45:52.065920 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 13 23:45:52.068414 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 13 23:45:52.086337 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 23:45:52.092000 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 13 23:45:52.093000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.106031 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 13 23:45:52.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.108000 audit: BPF prog-id=18 op=LOAD Jan 13 23:45:52.109000 audit: BPF prog-id=19 op=LOAD Jan 13 23:45:52.109000 audit: BPF prog-id=20 op=LOAD Jan 13 23:45:52.110288 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 13 23:45:52.112000 audit: BPF prog-id=21 op=LOAD Jan 13 23:45:52.113468 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 23:45:52.117245 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 23:45:52.119083 kernel: loop2: detected capacity change from 0 to 207008 Jan 13 23:45:52.119636 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 13 23:45:52.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.123000 audit: BPF prog-id=22 op=LOAD Jan 13 23:45:52.123000 audit: BPF prog-id=23 op=LOAD Jan 13 23:45:52.123000 audit: BPF prog-id=24 op=LOAD Jan 13 23:45:52.124898 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 13 23:45:52.126000 audit: BPF prog-id=25 op=LOAD Jan 13 23:45:52.126000 audit: BPF prog-id=26 op=LOAD Jan 13 23:45:52.126000 audit: BPF prog-id=27 op=LOAD Jan 13 23:45:52.127355 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 23:45:52.153109 kernel: loop3: detected capacity change from 0 to 100192 Jan 13 23:45:52.165530 systemd-tmpfiles[1354]: ACLs are not supported, ignoring. Jan 13 23:45:52.165552 systemd-tmpfiles[1354]: ACLs are not supported, ignoring. Jan 13 23:45:52.170143 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 23:45:52.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:45:52.172793 systemd-nsresourced[1357]: Not setting up BPF subsystem, as functionality has been disabled at compile time. 
Jan 13 23:45:52.175000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.176000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.174438 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Jan 13 23:45:52.175951 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 13 23:45:52.213808 kernel: loop4: detected capacity change from 0 to 1648
Jan 13 23:45:52.234860 systemd-oomd[1351]: No swap; memory pressure usage will be degraded
Jan 13 23:45:52.235335 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Jan 13 23:45:52.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.238145 systemd-resolved[1352]: Positive Trust Anchors:
Jan 13 23:45:52.238167 systemd-resolved[1352]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 23:45:52.238170 systemd-resolved[1352]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 13 23:45:52.238201 systemd-resolved[1352]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 23:45:52.246084 kernel: loop5: detected capacity change from 0 to 45344
Jan 13 23:45:52.247273 systemd-resolved[1352]: Using system hostname 'ci-4547-0-0-n-660efdb355'.
Jan 13 23:45:52.248580 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 23:45:52.249000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.249642 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 23:45:52.264088 kernel: loop6: detected capacity change from 0 to 207008
Jan 13 23:45:52.282096 kernel: loop7: detected capacity change from 0 to 100192
Jan 13 23:45:52.296091 kernel: loop1: detected capacity change from 0 to 1648
Jan 13 23:45:52.300512 (sd-merge)[1379]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.
Jan 13 23:45:52.303543 (sd-merge)[1379]: Merged extensions into '/usr'.
Jan 13 23:45:52.307671 systemd[1]: Reload requested from client PID 1335 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 13 23:45:52.307686 systemd[1]: Reloading...
Jan 13 23:45:52.365119 zram_generator::config[1409]: No configuration found.
Jan 13 23:45:52.518661 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 13 23:45:52.518821 systemd[1]: Reloading finished in 210 ms.
Jan 13 23:45:52.556119 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 13 23:45:52.557000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.557446 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 13 23:45:52.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.571474 systemd[1]: Starting ensure-sysext.service...
Jan 13 23:45:52.573209 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 23:45:52.574000 audit: BPF prog-id=8 op=UNLOAD
Jan 13 23:45:52.574000 audit: BPF prog-id=7 op=UNLOAD
Jan 13 23:45:52.574000 audit: BPF prog-id=28 op=LOAD
Jan 13 23:45:52.574000 audit: BPF prog-id=29 op=LOAD
Jan 13 23:45:52.575570 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 23:45:52.577000 audit: BPF prog-id=30 op=LOAD
Jan 13 23:45:52.577000 audit: BPF prog-id=18 op=UNLOAD
Jan 13 23:45:52.577000 audit: BPF prog-id=31 op=LOAD
Jan 13 23:45:52.577000 audit: BPF prog-id=32 op=LOAD
Jan 13 23:45:52.577000 audit: BPF prog-id=19 op=UNLOAD
Jan 13 23:45:52.577000 audit: BPF prog-id=20 op=UNLOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=33 op=LOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=25 op=UNLOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=34 op=LOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=35 op=LOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=26 op=UNLOAD
Jan 13 23:45:52.578000 audit: BPF prog-id=27 op=UNLOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=36 op=LOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=22 op=UNLOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=37 op=LOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=38 op=LOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=23 op=UNLOAD
Jan 13 23:45:52.579000 audit: BPF prog-id=24 op=UNLOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=39 op=LOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=15 op=UNLOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=40 op=LOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=41 op=LOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=16 op=UNLOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=17 op=UNLOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=42 op=LOAD
Jan 13 23:45:52.580000 audit: BPF prog-id=21 op=UNLOAD
Jan 13 23:45:52.585084 systemd[1]: Reload requested from client PID 1446 ('systemctl') (unit ensure-sysext.service)...
Jan 13 23:45:52.585104 systemd[1]: Reloading...
Jan 13 23:45:52.589928 systemd-tmpfiles[1447]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jan 13 23:45:52.589967 systemd-tmpfiles[1447]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jan 13 23:45:52.590266 systemd-tmpfiles[1447]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 13 23:45:52.591238 systemd-tmpfiles[1447]: ACLs are not supported, ignoring.
Jan 13 23:45:52.591293 systemd-tmpfiles[1447]: ACLs are not supported, ignoring.
Jan 13 23:45:52.597234 systemd-tmpfiles[1447]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 23:45:52.597249 systemd-tmpfiles[1447]: Skipping /boot
Jan 13 23:45:52.599935 systemd-udevd[1448]: Using default interface naming scheme 'v257'.
Jan 13 23:45:52.603588 systemd-tmpfiles[1447]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 23:45:52.603607 systemd-tmpfiles[1447]: Skipping /boot
Jan 13 23:45:52.646112 zram_generator::config[1480]: No configuration found.
Jan 13 23:45:52.782088 kernel: mousedev: PS/2 mouse device common for all mice
Jan 13 23:45:52.834813 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 13 23:45:52.838087 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Jan 13 23:45:52.838162 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 13 23:45:52.838176 kernel: [drm] features: -context_init
Jan 13 23:45:52.836445 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Jan 13 23:45:52.836563 systemd[1]: Reloading finished in 251 ms.
Jan 13 23:45:52.844128 kernel: [drm] number of scanouts: 1
Jan 13 23:45:52.844205 kernel: [drm] number of cap sets: 0
Jan 13 23:45:52.848082 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Jan 13 23:45:52.852112 kernel: Console: switching to colour frame buffer device 160x50
Jan 13 23:45:52.864100 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 23:45:52.868088 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 13 23:45:52.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.871000 audit: BPF prog-id=43 op=LOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=36 op=UNLOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=44 op=LOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=45 op=LOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=37 op=UNLOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=38 op=UNLOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=46 op=LOAD
Jan 13 23:45:52.871000 audit: BPF prog-id=33 op=UNLOAD
Jan 13 23:45:52.872000 audit: BPF prog-id=47 op=LOAD
Jan 13 23:45:52.872000 audit: BPF prog-id=48 op=LOAD
Jan 13 23:45:52.872000 audit: BPF prog-id=34 op=UNLOAD
Jan 13 23:45:52.872000 audit: BPF prog-id=35 op=UNLOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=49 op=LOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=39 op=UNLOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=50 op=LOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=51 op=LOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=40 op=UNLOAD
Jan 13 23:45:52.873000 audit: BPF prog-id=41 op=UNLOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=52 op=LOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=30 op=UNLOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=53 op=LOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=54 op=LOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=31 op=UNLOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=32 op=UNLOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=55 op=LOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=56 op=LOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=28 op=UNLOAD
Jan 13 23:45:52.874000 audit: BPF prog-id=29 op=UNLOAD
Jan 13 23:45:52.875000 audit: BPF prog-id=57 op=LOAD
Jan 13 23:45:52.875000 audit: BPF prog-id=42 op=UNLOAD
Jan 13 23:45:52.881662 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 23:45:52.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.909120 systemd[1]: Finished ensure-sysext.service.
Jan 13 23:45:52.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.926458 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 23:45:52.929012 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 13 23:45:52.930222 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 13 23:45:52.939235 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 23:45:52.941193 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 23:45:52.943929 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 23:45:52.946055 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 23:45:52.948463 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Jan 13 23:45:52.949663 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 23:45:52.949770 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 13 23:45:52.951015 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 13 23:45:52.953284 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 13 23:45:52.954435 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 13 23:45:52.957376 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 13 23:45:52.966000 audit: BPF prog-id=58 op=LOAD
Jan 13 23:45:52.967017 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 23:45:52.968002 systemd[1]: Reached target time-set.target - System Time Set.
Jan 13 23:45:52.971048 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 13 23:45:52.973165 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 23:45:52.976710 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 23:45:52.979988 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 13 23:45:52.980090 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Jan 13 23:45:52.982296 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 23:45:52.984000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.984681 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 23:45:52.984876 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 23:45:52.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.986468 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 23:45:52.987608 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 23:45:52.988000 audit[1592]: SYSTEM_BOOT pid=1592 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.989000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.991095 kernel: PTP clock support registered
Jan 13 23:45:52.990488 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 23:45:52.992088 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 23:45:52.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.993000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.993707 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 13 23:45:52.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.996501 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Jan 13 23:45:52.996701 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Jan 13 23:45:52.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:52.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:53.007196 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 23:45:53.007344 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 23:45:53.012801 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 13 23:45:53.014000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:53.014701 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jan 13 23:45:53.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:45:53.023000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1
Jan 13 23:45:53.023000 audit[1613]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc27d8830 a2=420 a3=0 items=0 ppid=1568 pid=1613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:45:53.023000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 13 23:45:53.023416 augenrules[1613]: No rules
Jan 13 23:45:53.024918 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 23:45:53.025268 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 23:45:53.059401 systemd-networkd[1591]: lo: Link UP
Jan 13 23:45:53.059410 systemd-networkd[1591]: lo: Gained carrier
Jan 13 23:45:53.060683 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 23:45:53.061092 systemd-networkd[1591]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 13 23:45:53.061103 systemd-networkd[1591]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 13 23:45:53.061558 systemd-networkd[1591]: eth0: Link UP
Jan 13 23:45:53.061762 systemd-networkd[1591]: eth0: Gained carrier
Jan 13 23:45:53.061781 systemd-networkd[1591]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 13 23:45:53.063187 systemd[1]: Reached target network.target - Network.
Jan 13 23:45:53.065778 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jan 13 23:45:53.068373 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 13 23:45:53.078285 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 23:45:53.082167 systemd-networkd[1591]: eth0: DHCPv4 address 10.0.21.248/25, gateway 10.0.21.129 acquired from 10.0.21.129
Jan 13 23:45:53.090368 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jan 13 23:45:53.093183 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 13 23:45:53.095265 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 13 23:45:53.482134 ldconfig[1579]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 13 23:45:53.486119 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 13 23:45:53.488396 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 13 23:45:53.509648 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 13 23:45:53.510865 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 23:45:53.513258 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 13 23:45:53.514245 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 13 23:45:53.515365 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 13 23:45:53.516784 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 13 23:45:53.517931 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update.
Jan 13 23:45:53.519143 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update.
Jan 13 23:45:53.520017 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 13 23:45:53.521154 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 13 23:45:53.521186 systemd[1]: Reached target paths.target - Path Units.
Jan 13 23:45:53.521893 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 23:45:53.525117 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 13 23:45:53.528287 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 13 23:45:53.531055 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jan 13 23:45:53.532233 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jan 13 23:45:53.533291 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jan 13 23:45:53.540048 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 13 23:45:53.541219 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jan 13 23:45:53.542720 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 13 23:45:53.543748 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 23:45:53.544610 systemd[1]: Reached target basic.target - Basic System.
Jan 13 23:45:53.545428 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 13 23:45:53.545456 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 13 23:45:53.550087 systemd[1]: Starting chronyd.service - NTP client/server...
Jan 13 23:45:53.551722 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 13 23:45:53.553777 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jan 13 23:45:53.557222 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 13 23:45:53.558993 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 13 23:45:53.560986 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 13 23:45:53.562083 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 13 23:45:53.565253 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 13 23:45:53.566179 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 13 23:45:53.578413 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 13 23:45:53.582617 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 13 23:45:53.583659 jq[1637]: false
Jan 13 23:45:53.586583 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 13 23:45:53.591236 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 13 23:45:53.595096 extend-filesystems[1639]: Found /dev/vda6
Jan 13 23:45:53.595157 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 13 23:45:53.596037 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 13 23:45:53.596642 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 13 23:45:53.597780 systemd[1]: Starting update-engine.service - Update Engine...
Jan 13 23:45:53.602665 extend-filesystems[1639]: Found /dev/vda9
Jan 13 23:45:53.606729 extend-filesystems[1639]: Checking size of /dev/vda9
Jan 13 23:45:53.603529 chronyd[1631]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Jan 13 23:45:53.603956 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 13 23:45:53.606528 chronyd[1631]: Loaded seccomp filter (level 2)
Jan 13 23:45:53.610829 systemd[1]: Started chronyd.service - NTP client/server.
Jan 13 23:45:53.612907 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 13 23:45:53.614416 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 13 23:45:53.616495 jq[1655]: true
Jan 13 23:45:53.616117 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 13 23:45:53.616424 systemd[1]: motdgen.service: Deactivated successfully.
Jan 13 23:45:53.616629 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 13 23:45:53.618973 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 13 23:45:53.619198 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 13 23:45:53.634859 extend-filesystems[1639]: Resized partition /dev/vda9
Jan 13 23:45:53.638309 jq[1670]: true
Jan 13 23:45:53.647211 extend-filesystems[1683]: resize2fs 1.47.3 (8-Jul-2025)
Jan 13 23:45:53.656976 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks
Jan 13 23:45:53.657125 update_engine[1652]: I20260113 23:45:53.651873 1652 main.cc:92] Flatcar Update Engine starting
Jan 13 23:45:53.657837 tar[1668]: linux-arm64/LICENSE
Jan 13 23:45:53.657837 tar[1668]: linux-arm64/helm
Jan 13 23:45:53.684756 dbus-daemon[1634]: [system] SELinux support is enabled
Jan 13 23:45:53.685250 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 13 23:45:53.689397 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 13 23:45:53.689441 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 13 23:45:53.690795 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 13 23:45:53.690826 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 13 23:45:53.700096 update_engine[1652]: I20260113 23:45:53.698164 1652 update_check_scheduler.cc:74] Next update check in 6m0s
Jan 13 23:45:53.698745 systemd-logind[1650]: New seat seat0.
Jan 13 23:45:53.700593 systemd-logind[1650]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 13 23:45:53.700622 systemd-logind[1650]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Jan 13 23:45:53.701261 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 13 23:45:53.709401 dbus-daemon[1634]: [system] Successfully activated service 'org.freedesktop.systemd1'
Jan 13 23:45:53.709613 systemd[1]: Started update-engine.service - Update Engine.
Jan 13 23:45:53.714204 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 13 23:45:53.777349 locksmithd[1702]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 13 23:45:53.805374 containerd[1673]: time="2026-01-13T23:45:53Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Jan 13 23:45:53.808630 containerd[1673]: time="2026-01-13T23:45:53.808586160Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Jan 13 23:45:53.819593 containerd[1673]: time="2026-01-13T23:45:53.819537320Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.68µs"
Jan 13 23:45:53.819593 containerd[1673]: time="2026-01-13T23:45:53.819581680Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Jan 13 23:45:53.819593 containerd[1673]: time="2026-01-13T23:45:53.819630120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Jan 13 23:45:53.819593 containerd[1673]: time="2026-01-13T23:45:53.819641600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Jan 13 23:45:53.820751 containerd[1673]: time="2026-01-13T23:45:53.819867680Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Jan 13 23:45:53.820751 containerd[1673]: time="2026-01-13T23:45:53.819885160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820751 containerd[1673]: time="2026-01-13T23:45:53.820401920Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820751 containerd[1673]: time="2026-01-13T23:45:53.820421320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820837 containerd[1673]: time="2026-01-13T23:45:53.820787920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820837 containerd[1673]: time="2026-01-13T23:45:53.820804400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820837 containerd[1673]: time="2026-01-13T23:45:53.820815240Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 13 23:45:53.820837 containerd[1673]: time="2026-01-13T23:45:53.820824200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.820969080Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.820991640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.821082800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.821250440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.821277200Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.821286520Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Jan 13 23:45:53.821357 containerd[1673]: time="2026-01-13T23:45:53.821325680Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Jan 13 23:45:53.821506 bash[1703]: Updated "/home/core/.ssh/authorized_keys"
Jan 13 23:45:53.821689 containerd[1673]: time="2026-01-13T23:45:53.821534600Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Jan 13 23:45:53.821689 containerd[1673]: time="2026-01-13T23:45:53.821623440Z" level=info msg="metadata content store policy set" policy=shared
Jan 13 23:45:53.823639 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 13 23:45:53.827282 systemd[1]: Starting sshkeys.service...
Jan 13 23:45:53.848712 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Jan 13 23:45:53.851908 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Jan 13 23:45:53.858413 containerd[1673]: time="2026-01-13T23:45:53.858358360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 13 23:45:53.858482 containerd[1673]: time="2026-01-13T23:45:53.858425200Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 13 23:45:53.858536 containerd[1673]: time="2026-01-13T23:45:53.858511880Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 13 23:45:53.858536 containerd[1673]: time="2026-01-13T23:45:53.858532360Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 13 23:45:53.858585 containerd[1673]: time="2026-01-13T23:45:53.858548320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 13 23:45:53.858585 containerd[1673]: time="2026-01-13T23:45:53.858561000Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 13 23:45:53.858585 containerd[1673]: time="2026-01-13T23:45:53.858574480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 13 23:45:53.858585 containerd[1673]: time="2026-01-13T23:45:53.858584560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858596520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858608360Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858627320Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858640200Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858650520Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 13 23:45:53.858667 containerd[1673]: time="2026-01-13T23:45:53.858662560Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 13 23:45:53.858810 containerd[1673]: time="2026-01-13T23:45:53.858788280Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 13 23:45:53.858838 containerd[1673]: time="2026-01-13T23:45:53.858815920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 13 23:45:53.858838 containerd[1673]: time="2026-01-13T23:45:53.858838880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 13 23:45:53.858893 containerd[1673]: time="2026-01-13T23:45:53.858855400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 13 23:45:53.858893 containerd[1673]: time="2026-01-13T23:45:53.858866320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 13 23:45:53.858893 containerd[1673]: time="2026-01-13T23:45:53.858875800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 13 23:45:53.858893 containerd[1673]: time="2026-01-13T23:45:53.858886800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 13 23:45:53.858975 containerd[1673]: time="2026-01-13T23:45:53.858898360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Jan 13 23:45:53.858975 containerd[1673]: time="2026-01-13T23:45:53.858910560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 13 23:45:53.858975 containerd[1673]: time="2026-01-13T23:45:53.858920880Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 13 23:45:53.858975 containerd[1673]: time="2026-01-13T23:45:53.858937840Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 13 23:45:53.858975 containerd[1673]: time="2026-01-13T23:45:53.858962960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 13 23:45:53.859257 containerd[1673]: time="2026-01-13T23:45:53.859001200Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 13 23:45:53.859257 containerd[1673]: time="2026-01-13T23:45:53.859016560Z" level=info msg="Start snapshots syncer" Jan 13 23:45:53.859257 containerd[1673]: time="2026-01-13T23:45:53.859048040Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 13 23:45:53.859415 containerd[1673]: time="2026-01-13T23:45:53.859310520Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 13 23:45:53.859415 containerd[1673]: time="2026-01-13T23:45:53.859358000Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 13 23:45:53.859619 containerd[1673]: 
time="2026-01-13T23:45:53.859413120Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859513520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859533920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859543360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859553400Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859565480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859576840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859593360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859606480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 13 23:45:53.859619 containerd[1673]: time="2026-01-13T23:45:53.859617120Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859663440Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 13 23:45:53.860001 containerd[1673]: 
time="2026-01-13T23:45:53.859679880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859688560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859697760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859705720Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859715520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859728920Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859901560Z" level=info msg="runtime interface created" Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859907280Z" level=info msg="created NRI interface" Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859915360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859925400Z" level=info msg="Connect containerd service" Jan 13 23:45:53.860001 containerd[1673]: time="2026-01-13T23:45:53.859951800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 13 23:45:53.860693 containerd[1673]: time="2026-01-13T23:45:53.860651040Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 23:45:53.867300 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:53.952637 containerd[1673]: time="2026-01-13T23:45:53.952531600Z" level=info msg="Start subscribing containerd event" Jan 13 23:45:53.952637 containerd[1673]: time="2026-01-13T23:45:53.952622040Z" level=info msg="Start recovering state" Jan 13 23:45:53.952881 containerd[1673]: time="2026-01-13T23:45:53.952751520Z" level=info msg="Start event monitor" Jan 13 23:45:53.952881 containerd[1673]: time="2026-01-13T23:45:53.952769800Z" level=info msg="Start cni network conf syncer for default" Jan 13 23:45:53.952881 containerd[1673]: time="2026-01-13T23:45:53.952777760Z" level=info msg="Start streaming server" Jan 13 23:45:53.952881 containerd[1673]: time="2026-01-13T23:45:53.952787360Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 13 23:45:53.953079 containerd[1673]: time="2026-01-13T23:45:53.952799920Z" level=info msg="runtime interface starting up..." Jan 13 23:45:53.953112 containerd[1673]: time="2026-01-13T23:45:53.953087320Z" level=info msg="starting plugins..." Jan 13 23:45:53.953131 containerd[1673]: time="2026-01-13T23:45:53.953116160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 13 23:45:53.954182 containerd[1673]: time="2026-01-13T23:45:53.954042280Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 13 23:45:53.954565 containerd[1673]: time="2026-01-13T23:45:53.954546800Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 13 23:45:53.954807 containerd[1673]: time="2026-01-13T23:45:53.954790920Z" level=info msg="containerd successfully booted in 0.149821s" Jan 13 23:45:53.954984 systemd[1]: Started containerd.service - containerd container runtime. 
Jan 13 23:45:53.964292 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 13 23:45:53.980257 extend-filesystems[1683]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 13 23:45:53.980257 extend-filesystems[1683]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 13 23:45:53.980257 extend-filesystems[1683]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 13 23:45:53.983845 extend-filesystems[1639]: Resized filesystem in /dev/vda9 Jan 13 23:45:53.983337 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 13 23:45:53.984290 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 13 23:45:54.097154 tar[1668]: linux-arm64/README.md Jan 13 23:45:54.118110 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 13 23:45:54.207367 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 13 23:45:54.371521 sshd_keygen[1656]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 13 23:45:54.391025 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 13 23:45:54.395158 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 13 23:45:54.397164 systemd[1]: Started sshd@0-10.0.21.248:22-20.161.92.111:47582.service - OpenSSH per-connection server daemon (20.161.92.111:47582). Jan 13 23:45:54.415987 systemd[1]: issuegen.service: Deactivated successfully. Jan 13 23:45:54.416316 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 13 23:45:54.419049 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 13 23:45:54.439114 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 13 23:45:54.442835 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 13 23:45:54.446140 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 13 23:45:54.447992 systemd[1]: Reached target getty.target - Login Prompts. 
Jan 13 23:45:54.571118 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:54.628332 systemd-networkd[1591]: eth0: Gained IPv6LL Jan 13 23:45:54.631574 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 23:45:54.633789 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 23:45:54.636551 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:45:54.638757 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 23:45:54.678621 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 23:45:54.806715 systemd[1]: Started sshd@1-10.0.21.248:22-45.140.17.124:44996.service - OpenSSH per-connection server daemon (45.140.17.124:44996). Jan 13 23:45:54.875114 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:54.971805 sshd[1750]: Accepted publickey for core from 20.161.92.111 port 47582 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:45:54.972467 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:45:54.982770 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 23:45:54.984955 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 23:45:54.988009 systemd-logind[1650]: New session 1 of user core. Jan 13 23:45:55.007558 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 23:45:55.011081 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 13 23:45:55.030314 (systemd)[1780]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:45:55.033971 systemd-logind[1650]: New session 2 of user core. Jan 13 23:45:55.147818 systemd[1780]: Queued start job for default target default.target. 
Jan 13 23:45:55.167267 systemd[1780]: Created slice app.slice - User Application Slice. Jan 13 23:45:55.167303 systemd[1780]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 13 23:45:55.167316 systemd[1780]: Reached target paths.target - Paths. Jan 13 23:45:55.167366 systemd[1780]: Reached target timers.target - Timers. Jan 13 23:45:55.168550 systemd[1780]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 23:45:55.169291 systemd[1780]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 13 23:45:55.178619 systemd[1780]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 23:45:55.178673 systemd[1780]: Reached target sockets.target - Sockets. Jan 13 23:45:55.180315 systemd[1780]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 13 23:45:55.180469 systemd[1780]: Reached target basic.target - Basic System. Jan 13 23:45:55.180516 systemd[1780]: Reached target default.target - Main User Target. Jan 13 23:45:55.180541 systemd[1780]: Startup finished in 141ms. Jan 13 23:45:55.180627 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 23:45:55.182876 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 13 23:45:55.497331 systemd[1]: Started sshd@2-10.0.21.248:22-20.161.92.111:47594.service - OpenSSH per-connection server daemon (20.161.92.111:47594). Jan 13 23:45:55.508254 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 23:45:55.512167 (kubelet)[1801]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 23:45:56.015177 sshd[1797]: Accepted publickey for core from 20.161.92.111 port 47594 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:45:56.018393 sshd-session[1797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:45:56.023405 systemd-logind[1650]: New session 3 of user core. Jan 13 23:45:56.034276 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 23:45:56.051750 kubelet[1801]: E0113 23:45:56.051689 1801 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 23:45:56.054490 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 23:45:56.054624 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 23:45:56.054974 systemd[1]: kubelet.service: Consumed 771ms CPU time, 256.7M memory peak. Jan 13 23:45:56.308760 sshd[1811]: Connection closed by 20.161.92.111 port 47594 Jan 13 23:45:56.309302 sshd-session[1797]: pam_unix(sshd:session): session closed for user core Jan 13 23:45:56.314117 systemd[1]: sshd@2-10.0.21.248:22-20.161.92.111:47594.service: Deactivated successfully. Jan 13 23:45:56.315807 systemd[1]: session-3.scope: Deactivated successfully. Jan 13 23:45:56.318518 systemd-logind[1650]: Session 3 logged out. Waiting for processes to exit. Jan 13 23:45:56.319536 systemd-logind[1650]: Removed session 3. Jan 13 23:45:56.428811 systemd[1]: Started sshd@3-10.0.21.248:22-20.161.92.111:47608.service - OpenSSH per-connection server daemon (20.161.92.111:47608). 
Jan 13 23:45:56.583130 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:56.856129 unix_chkpwd[1823]: password check failed for user (root) Jan 13 23:45:56.856549 sshd-session[1814]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.140.17.124 user=root Jan 13 23:45:56.881099 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:45:56.945949 sshd[1819]: Accepted publickey for core from 20.161.92.111 port 47608 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:45:56.947299 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:45:56.952877 systemd-logind[1650]: New session 4 of user core. Jan 13 23:45:56.965485 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 13 23:45:57.239232 sshd[1826]: Connection closed by 20.161.92.111 port 47608 Jan 13 23:45:57.239545 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Jan 13 23:45:57.243444 systemd[1]: sshd@3-10.0.21.248:22-20.161.92.111:47608.service: Deactivated successfully. Jan 13 23:45:57.247605 systemd[1]: session-4.scope: Deactivated successfully. Jan 13 23:45:57.249176 systemd-logind[1650]: Session 4 logged out. Waiting for processes to exit. Jan 13 23:45:57.250113 systemd-logind[1650]: Removed session 4. Jan 13 23:45:59.297091 sshd[1774]: PAM: Authentication failure for root from 45.140.17.124 Jan 13 23:45:59.578926 sshd[1774]: Connection reset by authenticating user root 45.140.17.124 port 44996 [preauth] Jan 13 23:45:59.580688 systemd[1]: sshd@1-10.0.21.248:22-45.140.17.124:44996.service: Deactivated successfully. Jan 13 23:45:59.662698 systemd[1]: Started sshd@4-10.0.21.248:22-45.140.17.124:43352.service - OpenSSH per-connection server daemon (45.140.17.124:43352). 
Jan 13 23:46:00.593092 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:46:00.599082 coreos-metadata[1633]: Jan 13 23:46:00.598 WARN failed to locate config-drive, using the metadata service API instead Jan 13 23:46:00.616580 coreos-metadata[1633]: Jan 13 23:46:00.616 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 13 23:46:00.664933 sshd[1836]: Invalid user kali from 45.140.17.124 port 43352 Jan 13 23:46:00.892150 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 13 23:46:00.902486 coreos-metadata[1717]: Jan 13 23:46:00.902 WARN failed to locate config-drive, using the metadata service API instead Jan 13 23:46:00.915797 coreos-metadata[1717]: Jan 13 23:46:00.915 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 13 23:46:00.927376 sshd[1836]: PAM user mismatch Jan 13 23:46:00.929933 systemd[1]: sshd@4-10.0.21.248:22-45.140.17.124:43352.service: Deactivated successfully. Jan 13 23:46:01.015803 coreos-metadata[1633]: Jan 13 23:46:01.015 INFO Fetch successful Jan 13 23:46:01.015803 coreos-metadata[1633]: Jan 13 23:46:01.015 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 13 23:46:01.181847 coreos-metadata[1717]: Jan 13 23:46:01.181 INFO Fetch successful Jan 13 23:46:01.181847 coreos-metadata[1717]: Jan 13 23:46:01.181 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 13 23:46:01.279774 coreos-metadata[1633]: Jan 13 23:46:01.279 INFO Fetch successful Jan 13 23:46:01.279774 coreos-metadata[1633]: Jan 13 23:46:01.279 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 13 23:46:01.447157 coreos-metadata[1717]: Jan 13 23:46:01.446 INFO Fetch successful Jan 13 23:46:01.452070 unknown[1717]: wrote ssh authorized keys file for user: core Jan 13 23:46:01.455411 coreos-metadata[1633]: Jan 13 23:46:01.455 INFO Fetch successful Jan 13 23:46:01.455411 
coreos-metadata[1633]: Jan 13 23:46:01.455 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 13 23:46:01.481911 update-ssh-keys[1848]: Updated "/home/core/.ssh/authorized_keys" Jan 13 23:46:01.482962 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 13 23:46:01.486135 systemd[1]: Finished sshkeys.service. Jan 13 23:46:01.595907 coreos-metadata[1633]: Jan 13 23:46:01.595 INFO Fetch successful Jan 13 23:46:01.595907 coreos-metadata[1633]: Jan 13 23:46:01.595 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 13 23:46:01.736354 coreos-metadata[1633]: Jan 13 23:46:01.736 INFO Fetch successful Jan 13 23:46:01.736354 coreos-metadata[1633]: Jan 13 23:46:01.736 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 13 23:46:01.877376 coreos-metadata[1633]: Jan 13 23:46:01.877 INFO Fetch successful Jan 13 23:46:01.906024 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 13 23:46:01.907185 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 23:46:01.908147 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 23:46:01.908605 systemd[1]: Startup finished in 2.449s (kernel) + 13.022s (initrd) + 10.783s (userspace) = 26.255s. Jan 13 23:46:06.248530 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 23:46:06.250307 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:06.393500 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 23:46:06.397032 (kubelet)[1864]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 23:46:06.435959 kubelet[1864]: E0113 23:46:06.435895 1864 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 23:46:06.438881 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 23:46:06.439021 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 23:46:06.439399 systemd[1]: kubelet.service: Consumed 146ms CPU time, 107.8M memory peak. Jan 13 23:46:07.354392 systemd[1]: Started sshd@5-10.0.21.248:22-20.161.92.111:55696.service - OpenSSH per-connection server daemon (20.161.92.111:55696). Jan 13 23:46:07.904052 sshd[1874]: Accepted publickey for core from 20.161.92.111 port 55696 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:46:07.905284 sshd-session[1874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:46:07.909358 systemd-logind[1650]: New session 5 of user core. Jan 13 23:46:07.918235 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 23:46:08.207186 sshd[1878]: Connection closed by 20.161.92.111 port 55696 Jan 13 23:46:08.207922 sshd-session[1874]: pam_unix(sshd:session): session closed for user core Jan 13 23:46:08.211983 systemd[1]: sshd@5-10.0.21.248:22-20.161.92.111:55696.service: Deactivated successfully. Jan 13 23:46:08.214898 systemd[1]: session-5.scope: Deactivated successfully. Jan 13 23:46:08.215621 systemd-logind[1650]: Session 5 logged out. Waiting for processes to exit. Jan 13 23:46:08.216791 systemd-logind[1650]: Removed session 5. 
Jan 13 23:46:08.326407 systemd[1]: Started sshd@6-10.0.21.248:22-20.161.92.111:55704.service - OpenSSH per-connection server daemon (20.161.92.111:55704). Jan 13 23:46:08.868114 sshd[1884]: Accepted publickey for core from 20.161.92.111 port 55704 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:46:08.869108 sshd-session[1884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:46:08.873651 systemd-logind[1650]: New session 6 of user core. Jan 13 23:46:08.883412 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 23:46:09.157743 sshd[1888]: Connection closed by 20.161.92.111 port 55704 Jan 13 23:46:09.158405 sshd-session[1884]: pam_unix(sshd:session): session closed for user core Jan 13 23:46:09.162356 systemd-logind[1650]: Session 6 logged out. Waiting for processes to exit. Jan 13 23:46:09.162504 systemd[1]: sshd@6-10.0.21.248:22-20.161.92.111:55704.service: Deactivated successfully. Jan 13 23:46:09.165531 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 23:46:09.166950 systemd-logind[1650]: Removed session 6. Jan 13 23:46:09.268506 systemd[1]: Started sshd@7-10.0.21.248:22-20.161.92.111:55708.service - OpenSSH per-connection server daemon (20.161.92.111:55708). Jan 13 23:46:09.808942 sshd[1894]: Accepted publickey for core from 20.161.92.111 port 55708 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:46:09.809819 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:46:09.813954 systemd-logind[1650]: New session 7 of user core. Jan 13 23:46:09.824441 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 13 23:46:10.102193 sshd[1898]: Connection closed by 20.161.92.111 port 55708 Jan 13 23:46:10.102727 sshd-session[1894]: pam_unix(sshd:session): session closed for user core Jan 13 23:46:10.107330 systemd[1]: sshd@7-10.0.21.248:22-20.161.92.111:55708.service: Deactivated successfully. 
Jan 13 23:46:10.108922 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 23:46:10.109719 systemd-logind[1650]: Session 7 logged out. Waiting for processes to exit. Jan 13 23:46:10.111009 systemd-logind[1650]: Removed session 7. Jan 13 23:46:10.208289 systemd[1]: Started sshd@8-10.0.21.248:22-20.161.92.111:55710.service - OpenSSH per-connection server daemon (20.161.92.111:55710). Jan 13 23:46:10.743120 sshd[1904]: Accepted publickey for core from 20.161.92.111 port 55710 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c Jan 13 23:46:10.743896 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 23:46:10.747737 systemd-logind[1650]: New session 8 of user core. Jan 13 23:46:10.759312 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 13 23:46:10.955647 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 23:46:10.955912 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 23:46:10.979209 sudo[1909]: pam_unix(sudo:session): session closed for user root Jan 13 23:46:11.076091 sshd[1908]: Connection closed by 20.161.92.111 port 55710 Jan 13 23:46:11.076645 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jan 13 23:46:11.082201 systemd[1]: sshd@8-10.0.21.248:22-20.161.92.111:55710.service: Deactivated successfully. Jan 13 23:46:11.083792 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 23:46:11.084620 systemd-logind[1650]: Session 8 logged out. Waiting for processes to exit. Jan 13 23:46:11.085750 systemd-logind[1650]: Removed session 8. Jan 13 23:46:11.189480 systemd[1]: Started sshd@9-10.0.21.248:22-20.161.92.111:55722.service - OpenSSH per-connection server daemon (20.161.92.111:55722). 
Jan 13 23:46:11.731036 sshd[1916]: Accepted publickey for core from 20.161.92.111 port 55722 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c
Jan 13 23:46:11.731991 sshd-session[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 23:46:11.735761 systemd-logind[1650]: New session 9 of user core.
Jan 13 23:46:11.748420 systemd[1]: Started session-9.scope - Session 9 of User core.
Jan 13 23:46:11.935239 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jan 13 23:46:11.935504 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 23:46:11.937873 sudo[1922]: pam_unix(sudo:session): session closed for user root
Jan 13 23:46:11.943487 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jan 13 23:46:11.943739 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 23:46:11.950272 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 23:46:11.984000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Jan 13 23:46:11.986518 kernel: kauditd_printk_skb: 186 callbacks suppressed
Jan 13 23:46:11.986648 kernel: audit: type=1305 audit(1768347971.984:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Jan 13 23:46:11.986720 kernel: audit: type=1300 audit(1768347971.984:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdc067a80 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:11.984000 audit[1946]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdc067a80 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:11.989977 augenrules[1946]: No rules
Jan 13 23:46:11.990165 kernel: audit: type=1327 audit(1768347971.984:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 13 23:46:11.984000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 13 23:46:11.991362 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 23:46:11.991755 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 23:46:11.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.994876 sudo[1921]: pam_unix(sudo:session): session closed for user root
Jan 13 23:46:11.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.997957 kernel: audit: type=1130 audit(1768347971.992:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.998003 kernel: audit: type=1131 audit(1768347971.992:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.998022 kernel: audit: type=1106 audit(1768347971.994:233): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.994000 audit[1921]: USER_END pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:11.994000 audit[1921]: CRED_DISP pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.002634 kernel: audit: type=1104 audit(1768347971.994:234): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.095563 sshd[1920]: Connection closed by 20.161.92.111 port 55722
Jan 13 23:46:12.096133 sshd-session[1916]: pam_unix(sshd:session): session closed for user core
Jan 13 23:46:12.097000 audit[1916]: USER_END pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.100782 systemd[1]: sshd@9-10.0.21.248:22-20.161.92.111:55722.service: Deactivated successfully.
Jan 13 23:46:12.097000 audit[1916]: CRED_DISP pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.102312 systemd[1]: session-9.scope: Deactivated successfully.
Jan 13 23:46:12.104098 kernel: audit: type=1106 audit(1768347972.097:235): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.104142 kernel: audit: type=1104 audit(1768347972.097:236): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.104160 kernel: audit: type=1131 audit(1768347972.100:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.248:22-20.161.92.111:55722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.100000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.248:22-20.161.92.111:55722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.104880 systemd-logind[1650]: Session 9 logged out. Waiting for processes to exit.
Jan 13 23:46:12.105682 systemd-logind[1650]: Removed session 9.
Jan 13 23:46:12.208387 systemd[1]: Started sshd@10-10.0.21.248:22-20.161.92.111:44782.service - OpenSSH per-connection server daemon (20.161.92.111:44782).
Jan 13 23:46:12.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.248:22-20.161.92.111:44782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.736766 sshd[1955]: Accepted publickey for core from 20.161.92.111 port 44782 ssh2: RSA SHA256:jFcSGDFMEazeXj5f81tH2eW+gXMy4FHOTy8E/LTjL+c
Jan 13 23:46:12.736000 audit[1955]: USER_ACCT pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.737000 audit[1955]: CRED_ACQ pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.737000 audit[1955]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb9a0d80 a2=3 a3=0 items=0 ppid=1 pid=1955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:12.737000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 13 23:46:12.738178 sshd-session[1955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 23:46:12.742115 systemd-logind[1650]: New session 10 of user core.
Jan 13 23:46:12.750465 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 13 23:46:12.752000 audit[1955]: USER_START pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.753000 audit[1959]: CRED_ACQ pid=1959 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success'
Jan 13 23:46:12.935683 sudo[1960]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 13 23:46:12.935950 sudo[1960]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 13 23:46:12.935000 audit[1960]: USER_ACCT pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.935000 audit[1960]: CRED_REFR pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:12.935000 audit[1960]: USER_START pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Jan 13 23:46:13.247644 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 13 23:46:13.270612 (dockerd)[1982]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 13 23:46:13.528350 dockerd[1982]: time="2026-01-13T23:46:13.528110120Z" level=info msg="Starting up"
Jan 13 23:46:13.529101 dockerd[1982]: time="2026-01-13T23:46:13.529058680Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Jan 13 23:46:13.539772 dockerd[1982]: time="2026-01-13T23:46:13.539709720Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Jan 13 23:46:13.582461 dockerd[1982]: time="2026-01-13T23:46:13.582403520Z" level=info msg="Loading containers: start."
Jan 13 23:46:13.591100 kernel: Initializing XFRM netlink socket
Jan 13 23:46:13.639000 audit[2033]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.639000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffd7c324e0 a2=0 a3=0 items=0 ppid=1982 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.639000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 13 23:46:13.641000 audit[2035]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.641000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffca5b5660 a2=0 a3=0 items=0 ppid=1982 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.641000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 13 23:46:13.643000 audit[2037]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.643000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffb1d3650 a2=0 a3=0 items=0 ppid=1982 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.643000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 13 23:46:13.646000 audit[2039]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.646000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc934f850 a2=0 a3=0 items=0 ppid=1982 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.646000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 13 23:46:13.647000 audit[2041]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.647000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff0897780 a2=0 a3=0 items=0 ppid=1982 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.647000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 13 23:46:13.649000 audit[2043]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.649000 audit[2043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc9ed8f70 a2=0 a3=0 items=0 ppid=1982 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.649000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 13 23:46:13.651000 audit[2045]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.651000 audit[2045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff08e5c50 a2=0 a3=0 items=0 ppid=1982 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.651000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 13 23:46:13.653000 audit[2047]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2047 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.653000 audit[2047]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc4ce2980 a2=0 a3=0 items=0 ppid=1982 pid=2047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.653000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 13 23:46:13.690000 audit[2050]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.690000 audit[2050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffe1571a40 a2=0 a3=0 items=0 ppid=1982 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.690000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38
Jan 13 23:46:13.692000 audit[2052]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.692000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffffd59ef0 a2=0 a3=0 items=0 ppid=1982 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.692000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 13 23:46:13.694000 audit[2054]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.694000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffce8f4dc0 a2=0 a3=0 items=0 ppid=1982 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.694000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 13 23:46:13.696000 audit[2056]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.696000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffff84cf230 a2=0 a3=0 items=0 ppid=1982 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.696000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 13 23:46:13.698000 audit[2058]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2058 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.698000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffde2fa450 a2=0 a3=0 items=0 ppid=1982 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.698000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 13 23:46:13.733000 audit[2088]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.733000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffe7ecbe90 a2=0 a3=0 items=0 ppid=1982 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.733000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 13 23:46:13.735000 audit[2090]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.735000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe0316400 a2=0 a3=0 items=0 ppid=1982 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.735000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 13 23:46:13.736000 audit[2092]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.736000 audit[2092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffda5e40b0 a2=0 a3=0 items=0 ppid=1982 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.736000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 13 23:46:13.738000 audit[2094]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.738000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe15010f0 a2=0 a3=0 items=0 ppid=1982 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.738000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 13 23:46:13.739000 audit[2096]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.739000 audit[2096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd8ecc990 a2=0 a3=0 items=0 ppid=1982 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.739000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 13 23:46:13.741000 audit[2098]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.741000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdf949b40 a2=0 a3=0 items=0 ppid=1982 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.741000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 13 23:46:13.746000 audit[2100]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.746000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff473abe0 a2=0 a3=0 items=0 ppid=1982 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.746000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 13 23:46:13.748000 audit[2102]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.748000 audit[2102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffd9d812c0 a2=0 a3=0 items=0 ppid=1982 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.748000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 13 23:46:13.751000 audit[2104]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.751000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffd0f50250 a2=0 a3=0 items=0 ppid=1982 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.751000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238
Jan 13 23:46:13.753000 audit[2106]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.753000 audit[2106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc5d451e0 a2=0 a3=0 items=0 ppid=1982 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.753000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 13 23:46:13.755000 audit[2108]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.755000 audit[2108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe7bbd720 a2=0 a3=0 items=0 ppid=1982 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.755000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 13 23:46:13.757000 audit[2110]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.757000 audit[2110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffffb509800 a2=0 a3=0 items=0 ppid=1982 pid=2110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.757000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 13 23:46:13.759000 audit[2112]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.759000 audit[2112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffe0f527b0 a2=0 a3=0 items=0 ppid=1982 pid=2112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.759000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 13 23:46:13.764000 audit[2117]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2117 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.764000 audit[2117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdf2f9d40 a2=0 a3=0 items=0 ppid=1982 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.764000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 13 23:46:13.766000 audit[2119]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.766000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffecde6890 a2=0 a3=0 items=0 ppid=1982 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.766000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 13 23:46:13.767000 audit[2121]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.767000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd31949b0 a2=0 a3=0 items=0 ppid=1982 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.767000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 13 23:46:13.769000 audit[2123]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.769000 audit[2123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff69e24b0 a2=0 a3=0 items=0 ppid=1982 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.769000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 13 23:46:13.771000 audit[2125]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.771000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffdc72d4e0 a2=0 a3=0 items=0 ppid=1982 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.771000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 13 23:46:13.773000 audit[2127]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 13 23:46:13.773000 audit[2127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd8a6a6b0 a2=0 a3=0 items=0 ppid=1982 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.773000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 13 23:46:13.809000 audit[2132]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.809000 audit[2132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffe6333d70 a2=0 a3=0 items=0 ppid=1982 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.809000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445
Jan 13 23:46:13.811000 audit[2134]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.811000 audit[2134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffe89af910 a2=0 a3=0 items=0 ppid=1982 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.811000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E
Jan 13 23:46:13.819000 audit[2142]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2142 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.819000 audit[2142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffcce2f7c0 a2=0 a3=0 items=0 ppid=1982 pid=2142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.819000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054
Jan 13 23:46:13.829000 audit[2148]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2148 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.829000 audit[2148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffd227d900 a2=0 a3=0 items=0 ppid=1982 pid=2148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 13 23:46:13.829000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50
Jan 13 23:46:13.831000 audit[2150]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 13 23:46:13.831000 audit[2150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffe56b1b40 a2=0 a3=0 items=0 ppid=1982 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables"
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:13.831000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 13 23:46:13.833000 audit[2152]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:13.833000 audit[2152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe58a5c80 a2=0 a3=0 items=0 ppid=1982 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:13.833000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 13 23:46:13.835000 audit[2154]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:13.835000 audit[2154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff6e77fa0 a2=0 a3=0 items=0 ppid=1982 pid=2154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:13.835000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 13 23:46:13.837000 audit[2156]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2156 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Jan 13 23:46:13.837000 audit[2156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffee827900 a2=0 a3=0 items=0 ppid=1982 pid=2156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:13.837000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 13 23:46:13.838090 systemd-networkd[1591]: docker0: Link UP Jan 13 23:46:13.843264 dockerd[1982]: time="2026-01-13T23:46:13.843206920Z" level=info msg="Loading containers: done." Jan 13 23:46:13.856009 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2336183030-merged.mount: Deactivated successfully. Jan 13 23:46:13.868623 dockerd[1982]: time="2026-01-13T23:46:13.868528320Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 13 23:46:13.868623 dockerd[1982]: time="2026-01-13T23:46:13.868620880Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 13 23:46:13.868804 dockerd[1982]: time="2026-01-13T23:46:13.868795000Z" level=info msg="Initializing buildkit" Jan 13 23:46:13.902091 dockerd[1982]: time="2026-01-13T23:46:13.901886320Z" level=info msg="Completed buildkit initialization" Jan 13 23:46:13.906426 dockerd[1982]: time="2026-01-13T23:46:13.906393480Z" level=info msg="Daemon has completed initialization" Jan 13 23:46:13.907083 dockerd[1982]: time="2026-01-13T23:46:13.906657920Z" level=info msg="API listen on /run/docker.sock" Jan 13 23:46:13.907843 systemd[1]: Started docker.service - Docker Application Container Engine. 
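The PROCTITLE values in the audit records above are the audited process's full command line, hex-encoded with NUL bytes separating the argv entries. A minimal Python sketch (not part of the log) to decode them:

```python
# Decode an audit PROCTITLE value: hex-encoded argv joined by NUL bytes.
def decode_proctitle(hex_value: str) -> list:
    raw = bytes.fromhex(hex_value)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00")]

# Example: the first PROCTITLE record in the block above.
args = decode_proctitle(
    "2F7573722F62696E2F69707461626C6573002D2D77616974"
    "002D4900464F5257415244002D6A00444F434B45522D55534552"
)
print(" ".join(args))  # /usr/bin/iptables --wait -I FORWARD -j DOCKER-USER
```

Decoding the remaining records the same way shows each rule Docker inserted (DOCKER-USER, MASQUERADE, DOCKER-ISOLATION-STAGE-1/2, and so on) as plain iptables/ip6tables invocations.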
Jan 13 23:46:13.907000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:15.085108 containerd[1673]: time="2026-01-13T23:46:15.085019000Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 13 23:46:15.764480 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3545740735.mount: Deactivated successfully. Jan 13 23:46:16.498042 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 23:46:16.499771 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:16.588280 containerd[1673]: time="2026-01-13T23:46:16.588146320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:16.589759 containerd[1673]: time="2026-01-13T23:46:16.589707160Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=24845792" Jan 13 23:46:16.591165 containerd[1673]: time="2026-01-13T23:46:16.591130360Z" level=info msg="ImageCreate event name:\"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:16.595252 containerd[1673]: time="2026-01-13T23:46:16.595191120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:16.596114 containerd[1673]: time="2026-01-13T23:46:16.596082800Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest 
\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"26438581\" in 1.51095284s" Jan 13 23:46:16.596170 containerd[1673]: time="2026-01-13T23:46:16.596121520Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\"" Jan 13 23:46:16.597027 containerd[1673]: time="2026-01-13T23:46:16.597006560Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 13 23:46:16.639052 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:16.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:16.657395 (kubelet)[2264]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 23:46:16.695690 kubelet[2264]: E0113 23:46:16.695613 2264 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 23:46:16.698225 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 23:46:16.698355 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 23:46:16.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 13 23:46:16.698734 systemd[1]: kubelet.service: Consumed 146ms CPU time, 107.7M memory peak. 
Jan 13 23:46:17.410572 chronyd[1631]: Selected source PHC0 Jan 13 23:46:17.970258 containerd[1673]: time="2026-01-13T23:46:17.970213418Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:17.973981 containerd[1673]: time="2026-01-13T23:46:17.973870979Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=22613932" Jan 13 23:46:17.976074 containerd[1673]: time="2026-01-13T23:46:17.976036743Z" level=info msg="ImageCreate event name:\"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:17.979010 containerd[1673]: time="2026-01-13T23:46:17.978975662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:17.980101 containerd[1673]: time="2026-01-13T23:46:17.980050245Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"24206567\" in 1.383015185s" Jan 13 23:46:17.980165 containerd[1673]: time="2026-01-13T23:46:17.980103025Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\"" Jan 13 23:46:17.980652 containerd[1673]: time="2026-01-13T23:46:17.980609810Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 13 23:46:19.164071 containerd[1673]: 
time="2026-01-13T23:46:19.163424056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:19.164670 containerd[1673]: time="2026-01-13T23:46:19.164608793Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=17608611" Jan 13 23:46:19.165822 containerd[1673]: time="2026-01-13T23:46:19.165781440Z" level=info msg="ImageCreate event name:\"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:19.169728 containerd[1673]: time="2026-01-13T23:46:19.169702859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:19.170995 containerd[1673]: time="2026-01-13T23:46:19.170872862Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"19201246\" in 1.190225959s" Jan 13 23:46:19.170995 containerd[1673]: time="2026-01-13T23:46:19.170910606Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\"" Jan 13 23:46:19.171338 containerd[1673]: time="2026-01-13T23:46:19.171316727Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 13 23:46:20.173963 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4230539072.mount: Deactivated successfully. 
Jan 13 23:46:20.417989 containerd[1673]: time="2026-01-13T23:46:20.417878269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:20.419073 containerd[1673]: time="2026-01-13T23:46:20.419012912Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=17713718" Jan 13 23:46:20.420353 containerd[1673]: time="2026-01-13T23:46:20.420310923Z" level=info msg="ImageCreate event name:\"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:20.422826 containerd[1673]: time="2026-01-13T23:46:20.422785004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:20.423491 containerd[1673]: time="2026-01-13T23:46:20.423450562Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"27557743\" in 1.252101552s" Jan 13 23:46:20.423491 containerd[1673]: time="2026-01-13T23:46:20.423485749Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\"" Jan 13 23:46:20.424076 containerd[1673]: time="2026-01-13T23:46:20.423995567Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 13 23:46:21.071608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1764787275.mount: Deactivated successfully. 
Jan 13 23:46:21.715935 containerd[1673]: time="2026-01-13T23:46:21.715873204Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:21.718055 containerd[1673]: time="2026-01-13T23:46:21.717984385Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=15956282" Jan 13 23:46:21.719468 containerd[1673]: time="2026-01-13T23:46:21.719410133Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:21.722771 containerd[1673]: time="2026-01-13T23:46:21.722728664Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:21.724223 containerd[1673]: time="2026-01-13T23:46:21.724190971Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.30016412s" Jan 13 23:46:21.724263 containerd[1673]: time="2026-01-13T23:46:21.724222291Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jan 13 23:46:21.725293 containerd[1673]: time="2026-01-13T23:46:21.725263242Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 13 23:46:22.280543 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2801844519.mount: Deactivated successfully. 
Jan 13 23:46:22.288385 containerd[1673]: time="2026-01-13T23:46:22.288306369Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 23:46:22.289287 containerd[1673]: time="2026-01-13T23:46:22.289237481Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 13 23:46:22.290713 containerd[1673]: time="2026-01-13T23:46:22.290682948Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 23:46:22.292813 containerd[1673]: time="2026-01-13T23:46:22.292771650Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 23:46:22.293728 containerd[1673]: time="2026-01-13T23:46:22.293693802Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 568.397ms" Jan 13 23:46:22.293773 containerd[1673]: time="2026-01-13T23:46:22.293727922Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 13 23:46:22.294385 containerd[1673]: time="2026-01-13T23:46:22.294343956Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 13 23:46:22.899850 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3320836528.mount: Deactivated 
successfully. Jan 13 23:46:24.538919 containerd[1673]: time="2026-01-13T23:46:24.538846306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:24.540185 containerd[1673]: time="2026-01-13T23:46:24.540131029Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=56456774" Jan 13 23:46:24.541482 containerd[1673]: time="2026-01-13T23:46:24.541425233Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:24.544953 containerd[1673]: time="2026-01-13T23:46:24.544895684Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:24.545939 containerd[1673]: time="2026-01-13T23:46:24.545891887Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.251516211s" Jan 13 23:46:24.545939 containerd[1673]: time="2026-01-13T23:46:24.545927727Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Jan 13 23:46:26.748866 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 23:46:26.750261 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
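Each containerd "Pulled image" line above pairs a byte count (`size`) with a wall-clock duration, so effective pull throughput can be derived from the log alone. A minimal sketch, assuming the exact quoting shown in this log (backslash-escaped quotes inside the `msg` field); the regex is an assumption about the log format, not a containerd API:

```python
import re
from typing import Optional

# Matches the size/duration tail of a containerd "Pulled image" line,
# e.g.: size \"67941650\" in 2.251516211s (quotes escaped inside msg="...").
PULLED = re.compile(r'size \\"(\d+)\\" in ([\d.]+)(ms|s)')

def pull_rate(line: str) -> Optional[float]:
    """Return effective pull throughput in MiB/s, or None if no match."""
    m = PULLED.search(line)
    if not m:
        return None
    size_bytes = int(m.group(1))
    duration = float(m.group(2))
    if m.group(3) == "ms":
        duration /= 1000.0  # normalize milliseconds to seconds
    return size_bytes / duration / (1024 * 1024)
```

Applied to the etcd pull above (67941650 bytes in 2.251516211s), this gives roughly 29 MiB/s, which lines up with the other image pulls in this boot.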
Jan 13 23:46:26.898287 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 13 23:46:26.898375 kernel: audit: type=1130 audit(1768347986.893:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:26.893000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:26.894843 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:26.915725 (kubelet)[2429]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 23:46:26.950779 kubelet[2429]: E0113 23:46:26.950732 2429 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 23:46:26.953373 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 23:46:26.953502 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 23:46:26.954000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 13 23:46:26.955167 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.4M memory peak. Jan 13 23:46:26.958072 kernel: audit: type=1131 audit(1768347986.954:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 13 23:46:28.820528 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:28.819000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:28.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:28.820696 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.4M memory peak. Jan 13 23:46:28.822605 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:28.826119 kernel: audit: type=1130 audit(1768347988.819:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:28.826191 kernel: audit: type=1131 audit(1768347988.819:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:28.848690 systemd[1]: Reload requested from client PID 2445 ('systemctl') (unit session-10.scope)... Jan 13 23:46:28.848707 systemd[1]: Reloading... Jan 13 23:46:28.925115 zram_generator::config[2491]: No configuration found. Jan 13 23:46:29.104128 systemd[1]: Reloading finished in 255 ms. 
Jan 13 23:46:29.116000 audit: BPF prog-id=63 op=LOAD Jan 13 23:46:29.116000 audit: BPF prog-id=64 op=LOAD Jan 13 23:46:29.121914 kernel: audit: type=1334 audit(1768347989.116:294): prog-id=63 op=LOAD Jan 13 23:46:29.121966 kernel: audit: type=1334 audit(1768347989.116:295): prog-id=64 op=LOAD Jan 13 23:46:29.121990 kernel: audit: type=1334 audit(1768347989.116:296): prog-id=55 op=UNLOAD Jan 13 23:46:29.116000 audit: BPF prog-id=55 op=UNLOAD Jan 13 23:46:29.116000 audit: BPF prog-id=56 op=UNLOAD Jan 13 23:46:29.117000 audit: BPF prog-id=65 op=LOAD Jan 13 23:46:29.125800 kernel: audit: type=1334 audit(1768347989.116:297): prog-id=56 op=UNLOAD Jan 13 23:46:29.125835 kernel: audit: type=1334 audit(1768347989.117:298): prog-id=65 op=LOAD Jan 13 23:46:29.125854 kernel: audit: type=1334 audit(1768347989.117:299): prog-id=52 op=UNLOAD Jan 13 23:46:29.117000 audit: BPF prog-id=52 op=UNLOAD Jan 13 23:46:29.117000 audit: BPF prog-id=66 op=LOAD Jan 13 23:46:29.117000 audit: BPF prog-id=67 op=LOAD Jan 13 23:46:29.117000 audit: BPF prog-id=53 op=UNLOAD Jan 13 23:46:29.117000 audit: BPF prog-id=54 op=UNLOAD Jan 13 23:46:29.118000 audit: BPF prog-id=68 op=LOAD Jan 13 23:46:29.118000 audit: BPF prog-id=58 op=UNLOAD Jan 13 23:46:29.120000 audit: BPF prog-id=69 op=LOAD Jan 13 23:46:29.120000 audit: BPF prog-id=43 op=UNLOAD Jan 13 23:46:29.121000 audit: BPF prog-id=70 op=LOAD Jan 13 23:46:29.121000 audit: BPF prog-id=71 op=LOAD Jan 13 23:46:29.121000 audit: BPF prog-id=44 op=UNLOAD Jan 13 23:46:29.121000 audit: BPF prog-id=45 op=UNLOAD Jan 13 23:46:29.125000 audit: BPF prog-id=72 op=LOAD Jan 13 23:46:29.125000 audit: BPF prog-id=57 op=UNLOAD Jan 13 23:46:29.126000 audit: BPF prog-id=73 op=LOAD Jan 13 23:46:29.139000 audit: BPF prog-id=60 op=UNLOAD Jan 13 23:46:29.139000 audit: BPF prog-id=74 op=LOAD Jan 13 23:46:29.139000 audit: BPF prog-id=75 op=LOAD Jan 13 23:46:29.139000 audit: BPF prog-id=61 op=UNLOAD Jan 13 23:46:29.139000 audit: BPF prog-id=62 op=UNLOAD Jan 13 23:46:29.140000 
audit: BPF prog-id=76 op=LOAD Jan 13 23:46:29.140000 audit: BPF prog-id=46 op=UNLOAD Jan 13 23:46:29.140000 audit: BPF prog-id=77 op=LOAD Jan 13 23:46:29.140000 audit: BPF prog-id=78 op=LOAD Jan 13 23:46:29.140000 audit: BPF prog-id=47 op=UNLOAD Jan 13 23:46:29.140000 audit: BPF prog-id=48 op=UNLOAD Jan 13 23:46:29.141000 audit: BPF prog-id=79 op=LOAD Jan 13 23:46:29.141000 audit: BPF prog-id=59 op=UNLOAD Jan 13 23:46:29.141000 audit: BPF prog-id=80 op=LOAD Jan 13 23:46:29.141000 audit: BPF prog-id=49 op=UNLOAD Jan 13 23:46:29.142000 audit: BPF prog-id=81 op=LOAD Jan 13 23:46:29.142000 audit: BPF prog-id=82 op=LOAD Jan 13 23:46:29.142000 audit: BPF prog-id=50 op=UNLOAD Jan 13 23:46:29.142000 audit: BPF prog-id=51 op=UNLOAD Jan 13 23:46:29.161681 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 23:46:29.161758 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 23:46:29.162181 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:29.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 13 23:46:29.162295 systemd[1]: kubelet.service: Consumed 92ms CPU time, 95.4M memory peak. Jan 13 23:46:29.163721 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:29.281754 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:29.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:46:29.295369 (kubelet)[2539]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 23:46:29.331603 kubelet[2539]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 23:46:29.331603 kubelet[2539]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 13 23:46:29.331603 kubelet[2539]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 23:46:29.331922 kubelet[2539]: I0113 23:46:29.331654 2539 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 23:46:30.422718 kubelet[2539]: I0113 23:46:30.422652 2539 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 13 23:46:30.422718 kubelet[2539]: I0113 23:46:30.422690 2539 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 23:46:30.423271 kubelet[2539]: I0113 23:46:30.422976 2539 server.go:954] "Client rotation is on, will bootstrap in background" Jan 13 23:46:30.640408 kubelet[2539]: E0113 23:46:30.640370 2539 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.21.248:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.21.248:6443: connect: connection refused" logger="UnhandledError" Jan 13 23:46:30.641298 kubelet[2539]: I0113 
23:46:30.641282 2539 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 23:46:30.648945 kubelet[2539]: I0113 23:46:30.648902 2539 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 13 23:46:30.651659 kubelet[2539]: I0113 23:46:30.651632 2539 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 23:46:30.651970 kubelet[2539]: I0113 23:46:30.651921 2539 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 23:46:30.652137 kubelet[2539]: I0113 23:46:30.651954 2539 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-660efdb355","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 13 23:46:30.652244 kubelet[2539]: I0113 23:46:30.652233 2539 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 23:46:30.652273 kubelet[2539]: I0113 23:46:30.652246 2539 container_manager_linux.go:304] "Creating device plugin manager" Jan 13 23:46:30.652512 kubelet[2539]: I0113 23:46:30.652482 2539 state_mem.go:36] "Initialized new in-memory state store" Jan 13 23:46:30.657454 kubelet[2539]: I0113 23:46:30.657424 2539 kubelet.go:446] "Attempting to sync node with API server" Jan 13 23:46:30.657454 kubelet[2539]: I0113 23:46:30.657451 2539 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 23:46:30.657515 kubelet[2539]: I0113 23:46:30.657474 2539 kubelet.go:352] "Adding apiserver pod source" Jan 13 23:46:30.657515 kubelet[2539]: I0113 23:46:30.657484 2539 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 23:46:30.660079 kubelet[2539]: W0113 23:46:30.659536 2539 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.21.248:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-660efdb355&limit=500&resourceVersion=0": dial tcp 10.0.21.248:6443: connect: connection refused Jan 13 23:46:30.660079 kubelet[2539]: E0113 23:46:30.659599 2539 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.21.248:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-660efdb355&limit=500&resourceVersion=0\": dial tcp 10.0.21.248:6443: connect: connection refused" logger="UnhandledError" Jan 13 23:46:30.661700 kubelet[2539]: I0113 
23:46:30.661676 2539 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 13 23:46:30.662489 kubelet[2539]: I0113 23:46:30.662467 2539 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 23:46:30.662733 kubelet[2539]: W0113 23:46:30.661693 2539 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.21.248:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.21.248:6443: connect: connection refused Jan 13 23:46:30.662834 kubelet[2539]: E0113 23:46:30.662814 2539 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.21.248:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.248:6443: connect: connection refused" logger="UnhandledError" Jan 13 23:46:30.662883 kubelet[2539]: W0113 23:46:30.662703 2539 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jan 13 23:46:30.663903 kubelet[2539]: I0113 23:46:30.663880 2539 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 13 23:46:30.664030 kubelet[2539]: I0113 23:46:30.664020 2539 server.go:1287] "Started kubelet" Jan 13 23:46:30.665180 kubelet[2539]: I0113 23:46:30.665130 2539 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 23:46:30.667680 kubelet[2539]: I0113 23:46:30.667618 2539 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 23:46:30.672567 kubelet[2539]: E0113 23:46:30.672263 2539 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.21.248:6443/api/v1/namespaces/default/events\": dial tcp 10.0.21.248:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-n-660efdb355.188a6f20decd7d21 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-n-660efdb355,UID:ci-4547-0-0-n-660efdb355,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-660efdb355,},FirstTimestamp:2026-01-13 23:46:30.663986465 +0000 UTC m=+1.365217722,LastTimestamp:2026-01-13 23:46:30.663986465 +0000 UTC m=+1.365217722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-660efdb355,}" Jan 13 23:46:30.673184 kubelet[2539]: I0113 23:46:30.672892 2539 server.go:479] "Adding debug handlers to kubelet server" Jan 13 23:46:30.673469 kubelet[2539]: I0113 23:46:30.673443 2539 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 23:46:30.674078 kubelet[2539]: I0113 23:46:30.673888 2539 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 23:46:30.674314 kubelet[2539]: I0113 23:46:30.674266 2539 dynamic_serving_content.go:135] 
"Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 13 23:46:30.674832 kubelet[2539]: E0113 23:46:30.674801 2539 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-660efdb355\" not found" Jan 13 23:46:30.674886 kubelet[2539]: I0113 23:46:30.674850 2539 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 13 23:46:30.675027 kubelet[2539]: I0113 23:46:30.675007 2539 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 13 23:46:30.675216 kubelet[2539]: I0113 23:46:30.675147 2539 reconciler.go:26] "Reconciler: start to sync state" Jan 13 23:46:30.675761 kubelet[2539]: W0113 23:46:30.675723 2539 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.21.248:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.21.248:6443: connect: connection refused Jan 13 23:46:30.675807 kubelet[2539]: E0113 23:46:30.675775 2539 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.21.248:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.248:6443: connect: connection refused" logger="UnhandledError" Jan 13 23:46:30.676000 audit[2552]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2552 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.676000 audit[2552]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe973bdf0 a2=0 a3=0 items=0 ppid=2539 pid=2552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.676000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 13 23:46:30.678326 kubelet[2539]: E0113 23:46:30.676950 2539 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.248:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-660efdb355?timeout=10s\": dial tcp 10.0.21.248:6443: connect: connection refused" interval="200ms" Jan 13 23:46:30.678800 kubelet[2539]: I0113 23:46:30.678761 2539 factory.go:221] Registration of the containerd container factory successfully Jan 13 23:46:30.678800 kubelet[2539]: I0113 23:46:30.678784 2539 factory.go:221] Registration of the systemd container factory successfully Jan 13 23:46:30.678882 kubelet[2539]: I0113 23:46:30.678871 2539 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 23:46:30.678969 kubelet[2539]: E0113 23:46:30.678934 2539 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 23:46:30.680000 audit[2553]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.680000 audit[2553]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe0a0b290 a2=0 a3=0 items=0 ppid=2539 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.680000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 13 23:46:30.682000 audit[2555]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.682000 audit[2555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd3f02130 a2=0 a3=0 items=0 ppid=2539 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.682000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 13 23:46:30.683000 audit[2557]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.683000 audit[2557]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc0cc6420 a2=0 a3=0 items=0 ppid=2539 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.683000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 13 23:46:30.689098 kubelet[2539]: I0113 23:46:30.689080 2539 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 13 23:46:30.689198 kubelet[2539]: I0113 23:46:30.689187 2539 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 13 23:46:30.689268 kubelet[2539]: I0113 23:46:30.689260 2539 state_mem.go:36] "Initialized new in-memory state store" Jan 13 23:46:30.691000 audit[2561]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2561 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.691000 audit[2561]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffd7a39e50 a2=0 a3=0 items=0 ppid=2539 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.691000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 13 23:46:30.692427 kubelet[2539]: I0113 23:46:30.692399 2539 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 13 23:46:30.692767 kubelet[2539]: I0113 23:46:30.692520 2539 policy_none.go:49] "None policy: Start" Jan 13 23:46:30.692767 kubelet[2539]: I0113 23:46:30.692539 2539 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 13 23:46:30.692767 kubelet[2539]: I0113 23:46:30.692550 2539 state_mem.go:35] "Initializing new in-memory state store" Jan 13 23:46:30.692000 audit[2562]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2562 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:30.692000 audit[2562]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc676a080 a2=0 a3=0 items=0 ppid=2539 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.692000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 13 23:46:30.693461 kubelet[2539]: I0113 23:46:30.693414 2539 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 23:46:30.693461 kubelet[2539]: I0113 23:46:30.693439 2539 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 13 23:46:30.693521 kubelet[2539]: I0113 23:46:30.693462 2539 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 13 23:46:30.693521 kubelet[2539]: I0113 23:46:30.693472 2539 kubelet.go:2382] "Starting kubelet main sync loop" Jan 13 23:46:30.693521 kubelet[2539]: E0113 23:46:30.693507 2539 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 23:46:30.693000 audit[2563]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2563 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.693000 audit[2563]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcd2e74d0 a2=0 a3=0 items=0 ppid=2539 pid=2563 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.693000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 13 23:46:30.693000 audit[2564]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2564 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:30.693000 audit[2564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc540c280 a2=0 a3=0 items=0 ppid=2539 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.693000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 13 23:46:30.694000 audit[2565]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2565 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.694000 audit[2565]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe389ee20 a2=0 a3=0 items=0 ppid=2539 
pid=2565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.694000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 13 23:46:30.695770 kubelet[2539]: W0113 23:46:30.695154 2539 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.21.248:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.21.248:6443: connect: connection refused Jan 13 23:46:30.695770 kubelet[2539]: E0113 23:46:30.695188 2539 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.21.248:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.248:6443: connect: connection refused" logger="UnhandledError" Jan 13 23:46:30.695000 audit[2569]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2569 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:30.695000 audit[2569]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdfde93c0 a2=0 a3=0 items=0 ppid=2539 pid=2569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.695000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 13 23:46:30.696000 audit[2571]: NETFILTER_CFG table=filter:52 family=10 entries=1 op=nft_register_chain pid=2571 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:30.696000 audit[2571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 
a1=ffffc4a5dff0 a2=0 a3=0 items=0 ppid=2539 pid=2571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.696000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 13 23:46:30.696000 audit[2570]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2570 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:30.696000 audit[2570]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffeaf9df50 a2=0 a3=0 items=0 ppid=2539 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:30.696000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 13 23:46:30.700596 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 23:46:30.719180 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 13 23:46:30.743533 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Jan 13 23:46:30.745110 kubelet[2539]: I0113 23:46:30.745087 2539 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 23:46:30.745302 kubelet[2539]: I0113 23:46:30.745283 2539 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 13 23:46:30.745329 kubelet[2539]: I0113 23:46:30.745302 2539 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 23:46:30.745561 kubelet[2539]: I0113 23:46:30.745544 2539 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 23:46:30.747091 kubelet[2539]: E0113 23:46:30.747048 2539 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 13 23:46:30.747162 kubelet[2539]: E0113 23:46:30.747104 2539 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-n-660efdb355\" not found" Jan 13 23:46:30.802805 systemd[1]: Created slice kubepods-burstable-podfb1d772901d2bbe72665adc1c84d0ea9.slice - libcontainer container kubepods-burstable-podfb1d772901d2bbe72665adc1c84d0ea9.slice. Jan 13 23:46:30.821433 kubelet[2539]: E0113 23:46:30.821147 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.823092 systemd[1]: Created slice kubepods-burstable-podc6e6d5c400fbec52d361aeb3ad50bfa2.slice - libcontainer container kubepods-burstable-podc6e6d5c400fbec52d361aeb3ad50bfa2.slice. 
Jan 13 23:46:30.825086 kubelet[2539]: E0113 23:46:30.825048 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.846446 systemd[1]: Created slice kubepods-burstable-podc7a79a89255eb2fa49d6bbfe324fa4cd.slice - libcontainer container kubepods-burstable-podc7a79a89255eb2fa49d6bbfe324fa4cd.slice. Jan 13 23:46:30.848586 kubelet[2539]: I0113 23:46:30.848563 2539 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.848978 kubelet[2539]: E0113 23:46:30.848937 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.849612 kubelet[2539]: E0113 23:46:30.849584 2539 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.248:6443/api/v1/nodes\": dial tcp 10.0.21.248:6443: connect: connection refused" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.878680 kubelet[2539]: E0113 23:46:30.878578 2539 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.248:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-660efdb355?timeout=10s\": dial tcp 10.0.21.248:6443: connect: connection refused" interval="400ms" Jan 13 23:46:30.977033 kubelet[2539]: I0113 23:46:30.976921 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977033 kubelet[2539]: I0113 23:46:30.976964 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977033 kubelet[2539]: I0113 23:46:30.976989 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977033 kubelet[2539]: I0113 23:46:30.977007 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6e6d5c400fbec52d361aeb3ad50bfa2-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-660efdb355\" (UID: \"c6e6d5c400fbec52d361aeb3ad50bfa2\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977381 kubelet[2539]: I0113 23:46:30.977109 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977381 kubelet[2539]: I0113 23:46:30.977186 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " 
pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977381 kubelet[2539]: I0113 23:46:30.977240 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977381 kubelet[2539]: I0113 23:46:30.977294 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:30.977381 kubelet[2539]: I0113 23:46:30.977341 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.053271 kubelet[2539]: I0113 23:46:31.053241 2539 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.053665 kubelet[2539]: E0113 23:46:31.053618 2539 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.248:6443/api/v1/nodes\": dial tcp 10.0.21.248:6443: connect: connection refused" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.123075 containerd[1673]: time="2026-01-13T23:46:31.122990090Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-660efdb355,Uid:fb1d772901d2bbe72665adc1c84d0ea9,Namespace:kube-system,Attempt:0,}" Jan 13 23:46:31.126720 containerd[1673]: time="2026-01-13T23:46:31.126691621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-660efdb355,Uid:c6e6d5c400fbec52d361aeb3ad50bfa2,Namespace:kube-system,Attempt:0,}" Jan 13 23:46:31.149134 containerd[1673]: time="2026-01-13T23:46:31.148669448Z" level=info msg="connecting to shim e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9" address="unix:///run/containerd/s/a5d70e2eea7671738ec9ec5607fd26d7a5f9ad5f54c6e3887d77df30ba51f8de" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:31.150736 containerd[1673]: time="2026-01-13T23:46:31.150537693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-660efdb355,Uid:c7a79a89255eb2fa49d6bbfe324fa4cd,Namespace:kube-system,Attempt:0,}" Jan 13 23:46:31.159523 containerd[1673]: time="2026-01-13T23:46:31.159472800Z" level=info msg="connecting to shim b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f" address="unix:///run/containerd/s/a2662928acb8cf6f7bde3880480561138f7ee67f7cc250e7777a374faa5e1059" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:31.178493 containerd[1673]: time="2026-01-13T23:46:31.178448457Z" level=info msg="connecting to shim 19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a" address="unix:///run/containerd/s/0d29b3bbba79830d7eb832c76491d63acb8185175d2f54ddf7313124c756afe6" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:31.181282 systemd[1]: Started cri-containerd-e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9.scope - libcontainer container e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9. 
Jan 13 23:46:31.184542 systemd[1]: Started cri-containerd-b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f.scope - libcontainer container b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f. Jan 13 23:46:31.193000 audit: BPF prog-id=83 op=LOAD Jan 13 23:46:31.193000 audit: BPF prog-id=84 op=LOAD Jan 13 23:46:31.193000 audit[2598]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.193000 audit: BPF prog-id=84 op=UNLOAD Jan 13 23:46:31.193000 audit[2598]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.193000 audit: BPF prog-id=85 op=LOAD Jan 13 23:46:31.193000 audit[2598]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 13 23:46:31.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.194000 audit: BPF prog-id=86 op=LOAD Jan 13 23:46:31.194000 audit[2598]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.194000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.194000 audit: BPF prog-id=86 op=UNLOAD Jan 13 23:46:31.194000 audit[2598]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.194000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.194000 audit: BPF prog-id=85 op=UNLOAD Jan 13 23:46:31.194000 audit[2598]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.194000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.194000 audit: BPF prog-id=87 op=LOAD Jan 13 23:46:31.194000 audit[2598]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2581 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.194000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534316333383963393233646438616439353839353333626539306466 Jan 13 23:46:31.207272 systemd[1]: Started cri-containerd-19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a.scope - libcontainer container 19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a. 
Jan 13 23:46:31.207000 audit: BPF prog-id=88 op=LOAD Jan 13 23:46:31.208000 audit: BPF prog-id=89 op=LOAD Jan 13 23:46:31.208000 audit[2623]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.210000 audit: BPF prog-id=89 op=UNLOAD Jan 13 23:46:31.210000 audit[2623]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.210000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.211000 audit: BPF prog-id=90 op=LOAD Jan 13 23:46:31.211000 audit[2623]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.211000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.211000 audit: BPF prog-id=91 op=LOAD Jan 13 23:46:31.211000 audit[2623]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.211000 audit: BPF prog-id=91 op=UNLOAD Jan 13 23:46:31.211000 audit[2623]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.211000 audit: BPF prog-id=90 op=UNLOAD Jan 13 23:46:31.211000 audit[2623]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 
23:46:31.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.212000 audit: BPF prog-id=92 op=LOAD Jan 13 23:46:31.212000 audit[2623]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2606 pid=2623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6231653634663563303132356461313834386331306263316363636131 Jan 13 23:46:31.222392 containerd[1673]: time="2026-01-13T23:46:31.222246870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-660efdb355,Uid:fb1d772901d2bbe72665adc1c84d0ea9,Namespace:kube-system,Attempt:0,} returns sandbox id \"e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9\"" Jan 13 23:46:31.222000 audit: BPF prog-id=93 op=LOAD Jan 13 23:46:31.222000 audit: BPF prog-id=94 op=LOAD Jan 13 23:46:31.222000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.222000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.222000 audit: BPF prog-id=94 op=UNLOAD Jan 13 23:46:31.222000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.223000 audit: BPF prog-id=95 op=LOAD Jan 13 23:46:31.223000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.223000 audit: BPF prog-id=96 op=LOAD Jan 13 23:46:31.223000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 13 23:46:31.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.223000 audit: BPF prog-id=96 op=UNLOAD Jan 13 23:46:31.223000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.223000 audit: BPF prog-id=95 op=UNLOAD Jan 13 23:46:31.223000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.224000 audit: BPF prog-id=97 op=LOAD Jan 13 23:46:31.224000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2642 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.224000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139373031323938646431313566393963313132656339303039613666 Jan 13 23:46:31.226834 containerd[1673]: time="2026-01-13T23:46:31.226801483Z" level=info msg="CreateContainer within sandbox \"e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 23:46:31.242049 containerd[1673]: time="2026-01-13T23:46:31.240822606Z" level=info msg="Container de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:46:31.253879 containerd[1673]: time="2026-01-13T23:46:31.253830445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-660efdb355,Uid:c6e6d5c400fbec52d361aeb3ad50bfa2,Namespace:kube-system,Attempt:0,} returns sandbox id \"b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f\"" Jan 13 23:46:31.254874 containerd[1673]: time="2026-01-13T23:46:31.254840248Z" level=info msg="CreateContainer within sandbox \"e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359\"" Jan 13 23:46:31.256740 containerd[1673]: time="2026-01-13T23:46:31.256581653Z" level=info msg="CreateContainer within sandbox \"b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 23:46:31.257125 containerd[1673]: time="2026-01-13T23:46:31.257090895Z" level=info msg="StartContainer for \"de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359\"" Jan 13 23:46:31.258270 
containerd[1673]: time="2026-01-13T23:46:31.258233378Z" level=info msg="connecting to shim de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359" address="unix:///run/containerd/s/a5d70e2eea7671738ec9ec5607fd26d7a5f9ad5f54c6e3887d77df30ba51f8de" protocol=ttrpc version=3 Jan 13 23:46:31.264736 containerd[1673]: time="2026-01-13T23:46:31.264688638Z" level=info msg="Container 204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:46:31.265355 containerd[1673]: time="2026-01-13T23:46:31.265318840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-660efdb355,Uid:c7a79a89255eb2fa49d6bbfe324fa4cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a\"" Jan 13 23:46:31.269101 containerd[1673]: time="2026-01-13T23:46:31.268993491Z" level=info msg="CreateContainer within sandbox \"19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 23:46:31.277644 containerd[1673]: time="2026-01-13T23:46:31.277596757Z" level=info msg="CreateContainer within sandbox \"b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e\"" Jan 13 23:46:31.278037 containerd[1673]: time="2026-01-13T23:46:31.278009398Z" level=info msg="StartContainer for \"204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e\"" Jan 13 23:46:31.278963 containerd[1673]: time="2026-01-13T23:46:31.278939201Z" level=info msg="connecting to shim 204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e" address="unix:///run/containerd/s/a2662928acb8cf6f7bde3880480561138f7ee67f7cc250e7777a374faa5e1059" protocol=ttrpc version=3 Jan 13 23:46:31.279280 systemd[1]: Started 
cri-containerd-de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359.scope - libcontainer container de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359. Jan 13 23:46:31.279573 kubelet[2539]: E0113 23:46:31.279441 2539 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.248:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-660efdb355?timeout=10s\": dial tcp 10.0.21.248:6443: connect: connection refused" interval="800ms" Jan 13 23:46:31.291030 containerd[1673]: time="2026-01-13T23:46:31.290316715Z" level=info msg="Container e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:46:31.292000 audit: BPF prog-id=98 op=LOAD Jan 13 23:46:31.293000 audit: BPF prog-id=99 op=LOAD Jan 13 23:46:31.293000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.293000 audit: BPF prog-id=99 op=UNLOAD Jan 13 23:46:31.293000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.293000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.294000 audit: BPF prog-id=100 op=LOAD Jan 13 23:46:31.294000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.294000 audit: BPF prog-id=101 op=LOAD Jan 13 23:46:31.294000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.294000 audit: BPF prog-id=101 op=UNLOAD Jan 13 23:46:31.294000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 13 23:46:31.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.294000 audit: BPF prog-id=100 op=UNLOAD Jan 13 23:46:31.294000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.294000 audit: BPF prog-id=102 op=LOAD Jan 13 23:46:31.294000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2581 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465316165666430373937663932663066376663666335613635663736 Jan 13 23:46:31.300806 containerd[1673]: time="2026-01-13T23:46:31.300758507Z" level=info msg="CreateContainer within sandbox \"19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e\"" Jan 13 23:46:31.301370 containerd[1673]: time="2026-01-13T23:46:31.301306228Z" level=info msg="StartContainer for \"e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e\"" Jan 13 23:46:31.302448 containerd[1673]: time="2026-01-13T23:46:31.302416632Z" level=info msg="connecting to shim e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e" address="unix:///run/containerd/s/0d29b3bbba79830d7eb832c76491d63acb8185175d2f54ddf7313124c756afe6" protocol=ttrpc version=3 Jan 13 23:46:31.305273 systemd[1]: Started cri-containerd-204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e.scope - libcontainer container 204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e. Jan 13 23:46:31.324437 systemd[1]: Started cri-containerd-e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e.scope - libcontainer container e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e. Jan 13 23:46:31.325000 audit: BPF prog-id=103 op=LOAD Jan 13 23:46:31.326000 audit: BPF prog-id=104 op=LOAD Jan 13 23:46:31.326000 audit[2728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.327000 audit: BPF prog-id=104 op=UNLOAD Jan 13 23:46:31.327000 audit[2728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.327000 audit: BPF prog-id=105 op=LOAD Jan 13 23:46:31.327000 audit[2728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.327000 audit: BPF prog-id=106 op=LOAD Jan 13 23:46:31.327000 audit[2728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.328000 audit: BPF prog-id=106 op=UNLOAD Jan 13 23:46:31.328000 audit[2728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.328000 audit: BPF prog-id=105 op=UNLOAD Jan 13 23:46:31.328000 audit[2728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.328000 audit: BPF prog-id=107 op=LOAD Jan 13 23:46:31.328000 audit[2728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2606 pid=2728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3230343334336539626230643832623738353133613962663866303131 Jan 13 23:46:31.330299 containerd[1673]: time="2026-01-13T23:46:31.329850314Z" level=info msg="StartContainer for 
\"de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359\" returns successfully" Jan 13 23:46:31.338000 audit: BPF prog-id=108 op=LOAD Jan 13 23:46:31.339000 audit: BPF prog-id=109 op=LOAD Jan 13 23:46:31.339000 audit[2746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.339000 audit: BPF prog-id=109 op=UNLOAD Jan 13 23:46:31.339000 audit[2746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.339000 audit: BPF prog-id=110 op=LOAD Jan 13 23:46:31.339000 audit[2746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.339000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.341000 audit: BPF prog-id=111 op=LOAD Jan 13 23:46:31.341000 audit[2746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.341000 audit: BPF prog-id=111 op=UNLOAD Jan 13 23:46:31.341000 audit[2746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.341000 audit: BPF prog-id=110 op=UNLOAD Jan 13 23:46:31.341000 audit[2746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:46:31.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.341000 audit: BPF prog-id=112 op=LOAD Jan 13 23:46:31.341000 audit[2746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2642 pid=2746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:31.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539663035376632366262623363396664663434643134646564323066 Jan 13 23:46:31.366235 containerd[1673]: time="2026-01-13T23:46:31.366191864Z" level=info msg="StartContainer for \"204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e\" returns successfully" Jan 13 23:46:31.386747 containerd[1673]: time="2026-01-13T23:46:31.386708966Z" level=info msg="StartContainer for \"e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e\" returns successfully" Jan 13 23:46:31.456781 kubelet[2539]: I0113 23:46:31.456728 2539 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.458271 kubelet[2539]: E0113 23:46:31.458248 2539 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.248:6443/api/v1/nodes\": dial tcp 10.0.21.248:6443: connect: connection refused" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.703418 kubelet[2539]: E0113 23:46:31.703384 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.705683 kubelet[2539]: E0113 23:46:31.705513 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:31.708782 kubelet[2539]: E0113 23:46:31.708559 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:32.261013 kubelet[2539]: I0113 23:46:32.260955 2539 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:32.710736 kubelet[2539]: E0113 23:46:32.710588 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:32.712616 kubelet[2539]: E0113 23:46:32.712574 2539 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.126524 kubelet[2539]: E0113 23:46:33.126469 2539 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-n-660efdb355\" not found" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.155100 kubelet[2539]: I0113 23:46:33.154207 2539 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.155100 kubelet[2539]: E0113 23:46:33.154249 2539 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4547-0-0-n-660efdb355\": node \"ci-4547-0-0-n-660efdb355\" not found" Jan 13 23:46:33.177749 kubelet[2539]: I0113 23:46:33.177716 2539 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.184184 kubelet[2539]: E0113 23:46:33.184151 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.184184 kubelet[2539]: I0113 23:46:33.184184 2539 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.185689 kubelet[2539]: E0113 23:46:33.185666 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.185722 kubelet[2539]: I0113 23:46:33.185690 2539 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.187473 kubelet[2539]: E0113 23:46:33.187446 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.292689 kubelet[2539]: I0113 23:46:33.292655 2539 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.298387 kubelet[2539]: E0113 23:46:33.298355 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.659526 kubelet[2539]: I0113 23:46:33.659469 2539 apiserver.go:52] "Watching apiserver" Jan 13 23:46:33.675809 kubelet[2539]: I0113 
23:46:33.675730 2539 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 13 23:46:33.710086 kubelet[2539]: I0113 23:46:33.709863 2539 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.712058 kubelet[2539]: E0113 23:46:33.712027 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.740546 kubelet[2539]: I0113 23:46:33.740425 2539 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:33.744323 kubelet[2539]: E0113 23:46:33.744255 2539 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-660efdb355\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:35.238016 systemd[1]: Reload requested from client PID 2816 ('systemctl') (unit session-10.scope)... Jan 13 23:46:35.238032 systemd[1]: Reloading... Jan 13 23:46:35.321101 zram_generator::config[2862]: No configuration found. Jan 13 23:46:35.506422 systemd[1]: Reloading finished in 268 ms. Jan 13 23:46:35.538748 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:35.558836 systemd[1]: kubelet.service: Deactivated successfully. Jan 13 23:46:35.559256 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:35.558000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:35.559526 systemd[1]: kubelet.service: Consumed 1.572s CPU time, 128.1M memory peak. 
Jan 13 23:46:35.560101 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 13 23:46:35.560163 kernel: audit: type=1131 audit(1768347995.558:396): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:35.562051 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 23:46:35.562000 audit: BPF prog-id=113 op=LOAD Jan 13 23:46:35.563571 kernel: audit: type=1334 audit(1768347995.562:397): prog-id=113 op=LOAD Jan 13 23:46:35.563621 kernel: audit: type=1334 audit(1768347995.562:398): prog-id=72 op=UNLOAD Jan 13 23:46:35.562000 audit: BPF prog-id=72 op=UNLOAD Jan 13 23:46:35.564000 audit: BPF prog-id=114 op=LOAD Jan 13 23:46:35.565450 kernel: audit: type=1334 audit(1768347995.564:399): prog-id=114 op=LOAD Jan 13 23:46:35.565492 kernel: audit: type=1334 audit(1768347995.564:400): prog-id=80 op=UNLOAD Jan 13 23:46:35.564000 audit: BPF prog-id=80 op=UNLOAD Jan 13 23:46:35.565000 audit: BPF prog-id=115 op=LOAD Jan 13 23:46:35.565000 audit: BPF prog-id=116 op=LOAD Jan 13 23:46:35.567092 kernel: audit: type=1334 audit(1768347995.565:401): prog-id=115 op=LOAD Jan 13 23:46:35.567128 kernel: audit: type=1334 audit(1768347995.565:402): prog-id=116 op=LOAD Jan 13 23:46:35.565000 audit: BPF prog-id=81 op=UNLOAD Jan 13 23:46:35.568733 kernel: audit: type=1334 audit(1768347995.565:403): prog-id=81 op=UNLOAD Jan 13 23:46:35.568783 kernel: audit: type=1334 audit(1768347995.565:404): prog-id=82 op=UNLOAD Jan 13 23:46:35.565000 audit: BPF prog-id=82 op=UNLOAD Jan 13 23:46:35.569547 kernel: audit: type=1334 audit(1768347995.566:405): prog-id=117 op=LOAD Jan 13 23:46:35.566000 audit: BPF prog-id=117 op=LOAD Jan 13 23:46:35.566000 audit: BPF prog-id=76 op=UNLOAD Jan 13 23:46:35.569000 audit: BPF prog-id=118 op=LOAD Jan 13 23:46:35.583000 audit: BPF prog-id=119 op=LOAD Jan 13 23:46:35.583000 audit: BPF prog-id=77 
op=UNLOAD Jan 13 23:46:35.583000 audit: BPF prog-id=78 op=UNLOAD Jan 13 23:46:35.584000 audit: BPF prog-id=120 op=LOAD Jan 13 23:46:35.584000 audit: BPF prog-id=79 op=UNLOAD Jan 13 23:46:35.584000 audit: BPF prog-id=121 op=LOAD Jan 13 23:46:35.584000 audit: BPF prog-id=122 op=LOAD Jan 13 23:46:35.584000 audit: BPF prog-id=63 op=UNLOAD Jan 13 23:46:35.584000 audit: BPF prog-id=64 op=UNLOAD Jan 13 23:46:35.586000 audit: BPF prog-id=123 op=LOAD Jan 13 23:46:35.586000 audit: BPF prog-id=73 op=UNLOAD Jan 13 23:46:35.586000 audit: BPF prog-id=124 op=LOAD Jan 13 23:46:35.586000 audit: BPF prog-id=125 op=LOAD Jan 13 23:46:35.586000 audit: BPF prog-id=74 op=UNLOAD Jan 13 23:46:35.586000 audit: BPF prog-id=75 op=UNLOAD Jan 13 23:46:35.586000 audit: BPF prog-id=126 op=LOAD Jan 13 23:46:35.586000 audit: BPF prog-id=69 op=UNLOAD Jan 13 23:46:35.587000 audit: BPF prog-id=127 op=LOAD Jan 13 23:46:35.587000 audit: BPF prog-id=128 op=LOAD Jan 13 23:46:35.587000 audit: BPF prog-id=70 op=UNLOAD Jan 13 23:46:35.587000 audit: BPF prog-id=71 op=UNLOAD Jan 13 23:46:35.588000 audit: BPF prog-id=129 op=LOAD Jan 13 23:46:35.588000 audit: BPF prog-id=65 op=UNLOAD Jan 13 23:46:35.588000 audit: BPF prog-id=130 op=LOAD Jan 13 23:46:35.588000 audit: BPF prog-id=131 op=LOAD Jan 13 23:46:35.588000 audit: BPF prog-id=66 op=UNLOAD Jan 13 23:46:35.588000 audit: BPF prog-id=67 op=UNLOAD Jan 13 23:46:35.589000 audit: BPF prog-id=132 op=LOAD Jan 13 23:46:35.589000 audit: BPF prog-id=68 op=UNLOAD Jan 13 23:46:35.731114 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 23:46:35.730000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 13 23:46:35.736695 (kubelet)[2907]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 23:46:35.811121 kubelet[2907]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 23:46:35.811121 kubelet[2907]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 13 23:46:35.811121 kubelet[2907]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 23:46:35.811766 kubelet[2907]: I0113 23:46:35.811104 2907 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 23:46:35.818802 kubelet[2907]: I0113 23:46:35.818757 2907 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 13 23:46:35.818802 kubelet[2907]: I0113 23:46:35.818790 2907 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 23:46:35.819047 kubelet[2907]: I0113 23:46:35.819014 2907 server.go:954] "Client rotation is on, will bootstrap in background" Jan 13 23:46:35.820252 kubelet[2907]: I0113 23:46:35.820223 2907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 13 23:46:35.854196 kubelet[2907]: I0113 23:46:35.853740 2907 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 23:46:35.857416 kubelet[2907]: I0113 23:46:35.857381 2907 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 13 23:46:35.860479 kubelet[2907]: I0113 23:46:35.860420 2907 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 13 23:46:35.860677 kubelet[2907]: I0113 23:46:35.860615 2907 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 23:46:35.861439 kubelet[2907]: I0113 23:46:35.860665 2907 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-660efdb355","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"
none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 13 23:46:35.861439 kubelet[2907]: I0113 23:46:35.860926 2907 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 23:46:35.861439 kubelet[2907]: I0113 23:46:35.860948 2907 container_manager_linux.go:304] "Creating device plugin manager" Jan 13 23:46:35.861439 kubelet[2907]: I0113 23:46:35.861023 2907 state_mem.go:36] "Initialized new in-memory state store" Jan 13 23:46:35.861439 kubelet[2907]: I0113 23:46:35.861197 2907 kubelet.go:446] "Attempting to sync node with API server" Jan 13 23:46:35.861624 kubelet[2907]: I0113 23:46:35.861215 2907 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 23:46:35.861624 kubelet[2907]: I0113 23:46:35.861238 2907 kubelet.go:352] "Adding apiserver pod source" Jan 13 23:46:35.861624 kubelet[2907]: I0113 23:46:35.861247 2907 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 23:46:35.862379 kubelet[2907]: I0113 23:46:35.862356 2907 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 13 23:46:35.862880 kubelet[2907]: I0113 23:46:35.862864 2907 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 23:46:35.863722 kubelet[2907]: I0113 23:46:35.863699 2907 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 13 23:46:35.863820 kubelet[2907]: I0113 23:46:35.863811 2907 server.go:1287] "Started kubelet" Jan 13 23:46:35.864906 kubelet[2907]: I0113 23:46:35.864881 2907 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 23:46:35.868076 kubelet[2907]: I0113 
23:46:35.868006 2907 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 23:46:35.868886 kubelet[2907]: I0113 23:46:35.868851 2907 server.go:479] "Adding debug handlers to kubelet server" Jan 13 23:46:35.869355 kubelet[2907]: I0113 23:46:35.869335 2907 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 13 23:46:35.869612 kubelet[2907]: E0113 23:46:35.869590 2907 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-660efdb355\" not found" Jan 13 23:46:35.869800 kubelet[2907]: I0113 23:46:35.869727 2907 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 23:46:35.869951 kubelet[2907]: I0113 23:46:35.869926 2907 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 23:46:35.870167 kubelet[2907]: I0113 23:46:35.870134 2907 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 13 23:46:35.870944 kubelet[2907]: I0113 23:46:35.870906 2907 reconciler.go:26] "Reconciler: start to sync state" Jan 13 23:46:35.870944 kubelet[2907]: I0113 23:46:35.870943 2907 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 13 23:46:35.875156 kubelet[2907]: I0113 23:46:35.875126 2907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 23:46:35.879145 kubelet[2907]: I0113 23:46:35.877721 2907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 23:46:35.879145 kubelet[2907]: I0113 23:46:35.877744 2907 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 13 23:46:35.879145 kubelet[2907]: I0113 23:46:35.877761 2907 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 13 23:46:35.879145 kubelet[2907]: I0113 23:46:35.877767 2907 kubelet.go:2382] "Starting kubelet main sync loop" Jan 13 23:46:35.879145 kubelet[2907]: E0113 23:46:35.877803 2907 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 23:46:35.884217 kubelet[2907]: I0113 23:46:35.883853 2907 factory.go:221] Registration of the systemd container factory successfully Jan 13 23:46:35.884217 kubelet[2907]: I0113 23:46:35.884019 2907 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 23:46:35.886532 kubelet[2907]: I0113 23:46:35.886490 2907 factory.go:221] Registration of the containerd container factory successfully Jan 13 23:46:35.890875 kubelet[2907]: E0113 23:46:35.890806 2907 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 23:46:35.921811 kubelet[2907]: I0113 23:46:35.921782 2907 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 13 23:46:35.921811 kubelet[2907]: I0113 23:46:35.921805 2907 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 13 23:46:35.921949 kubelet[2907]: I0113 23:46:35.921825 2907 state_mem.go:36] "Initialized new in-memory state store" Jan 13 23:46:35.922000 kubelet[2907]: I0113 23:46:35.921977 2907 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 13 23:46:35.922025 kubelet[2907]: I0113 23:46:35.921997 2907 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 13 23:46:35.922025 kubelet[2907]: I0113 23:46:35.922015 2907 policy_none.go:49] "None policy: Start" Jan 13 23:46:35.922025 kubelet[2907]: I0113 23:46:35.922024 2907 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 13 23:46:35.922094 kubelet[2907]: I0113 23:46:35.922033 2907 state_mem.go:35] "Initializing new in-memory state store" Jan 13 23:46:35.922188 kubelet[2907]: I0113 23:46:35.922172 2907 state_mem.go:75] "Updated machine memory state" Jan 13 23:46:35.926037 kubelet[2907]: I0113 23:46:35.926003 2907 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 23:46:35.926801 kubelet[2907]: I0113 23:46:35.926195 2907 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 13 23:46:35.926801 kubelet[2907]: I0113 23:46:35.926208 2907 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 23:46:35.926801 kubelet[2907]: I0113 23:46:35.926799 2907 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 23:46:35.927572 kubelet[2907]: E0113 23:46:35.927478 2907 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 13 23:46:35.979702 kubelet[2907]: I0113 23:46:35.979612 2907 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:35.980532 kubelet[2907]: I0113 23:46:35.980504 2907 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:35.980644 kubelet[2907]: I0113 23:46:35.980609 2907 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.029517 kubelet[2907]: I0113 23:46:36.029469 2907 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.038831 kubelet[2907]: I0113 23:46:36.037546 2907 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.038831 kubelet[2907]: I0113 23:46:36.037625 2907 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.172992 kubelet[2907]: I0113 23:46:36.172873 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.173154 kubelet[2907]: I0113 23:46:36.173136 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.173316 kubelet[2907]: I0113 23:46:36.173287 2907 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.173397 kubelet[2907]: I0113 23:46:36.173385 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.173746 kubelet[2907]: I0113 23:46:36.173706 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.173860 kubelet[2907]: I0113 23:46:36.173842 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: \"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.174278 kubelet[2907]: I0113 23:46:36.173972 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb1d772901d2bbe72665adc1c84d0ea9-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-660efdb355\" (UID: 
\"fb1d772901d2bbe72665adc1c84d0ea9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.174709 kubelet[2907]: I0113 23:46:36.174427 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6e6d5c400fbec52d361aeb3ad50bfa2-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-660efdb355\" (UID: \"c6e6d5c400fbec52d361aeb3ad50bfa2\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.174709 kubelet[2907]: I0113 23:46:36.174581 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c7a79a89255eb2fa49d6bbfe324fa4cd-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-660efdb355\" (UID: \"c7a79a89255eb2fa49d6bbfe324fa4cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.862175 kubelet[2907]: I0113 23:46:36.862130 2907 apiserver.go:52] "Watching apiserver" Jan 13 23:46:36.871048 kubelet[2907]: I0113 23:46:36.871002 2907 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 13 23:46:36.907034 kubelet[2907]: I0113 23:46:36.906980 2907 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.907344 kubelet[2907]: I0113 23:46:36.907117 2907 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.914024 kubelet[2907]: E0113 23:46:36.913968 2907 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-660efdb355\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.914276 kubelet[2907]: E0113 23:46:36.914239 2907 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-660efdb355\" already 
exists" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" Jan 13 23:46:36.924299 kubelet[2907]: I0113 23:46:36.924250 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-660efdb355" podStartSLOduration=1.924236974 podStartE2EDuration="1.924236974s" podCreationTimestamp="2026-01-13 23:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:46:36.924236134 +0000 UTC m=+1.184551661" watchObservedRunningTime="2026-01-13 23:46:36.924236974 +0000 UTC m=+1.184552541" Jan 13 23:46:36.940872 kubelet[2907]: I0113 23:46:36.940821 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-n-660efdb355" podStartSLOduration=1.940805984 podStartE2EDuration="1.940805984s" podCreationTimestamp="2026-01-13 23:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:46:36.931172995 +0000 UTC m=+1.191488562" watchObservedRunningTime="2026-01-13 23:46:36.940805984 +0000 UTC m=+1.201121551" Jan 13 23:46:36.950507 kubelet[2907]: I0113 23:46:36.950458 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-n-660efdb355" podStartSLOduration=1.9504464929999998 podStartE2EDuration="1.950446493s" podCreationTimestamp="2026-01-13 23:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:46:36.940940464 +0000 UTC m=+1.201256031" watchObservedRunningTime="2026-01-13 23:46:36.950446493 +0000 UTC m=+1.210762020" Jan 13 23:46:38.655990 update_engine[1652]: I20260113 23:46:38.655107 1652 update_attempter.cc:509] Updating boot flags... 
Jan 13 23:46:40.787896 kubelet[2907]: I0113 23:46:40.787826 2907 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 13 23:46:40.788722 containerd[1673]: time="2026-01-13T23:46:40.788614079Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 13 23:46:40.789013 kubelet[2907]: I0113 23:46:40.788847 2907 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 13 23:46:41.786789 systemd[1]: Created slice kubepods-besteffort-pod93f5e148_37cc_4f75_ad33_7771a4bff319.slice - libcontainer container kubepods-besteffort-pod93f5e148_37cc_4f75_ad33_7771a4bff319.slice. Jan 13 23:46:41.808218 kubelet[2907]: I0113 23:46:41.808101 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93f5e148-37cc-4f75-ad33-7771a4bff319-lib-modules\") pod \"kube-proxy-zsrrz\" (UID: \"93f5e148-37cc-4f75-ad33-7771a4bff319\") " pod="kube-system/kube-proxy-zsrrz" Jan 13 23:46:41.808218 kubelet[2907]: I0113 23:46:41.808158 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/93f5e148-37cc-4f75-ad33-7771a4bff319-kube-proxy\") pod \"kube-proxy-zsrrz\" (UID: \"93f5e148-37cc-4f75-ad33-7771a4bff319\") " pod="kube-system/kube-proxy-zsrrz" Jan 13 23:46:41.808218 kubelet[2907]: I0113 23:46:41.808178 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/93f5e148-37cc-4f75-ad33-7771a4bff319-xtables-lock\") pod \"kube-proxy-zsrrz\" (UID: \"93f5e148-37cc-4f75-ad33-7771a4bff319\") " pod="kube-system/kube-proxy-zsrrz" Jan 13 23:46:41.808218 kubelet[2907]: I0113 23:46:41.808196 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-fd4rx\" (UniqueName: \"kubernetes.io/projected/93f5e148-37cc-4f75-ad33-7771a4bff319-kube-api-access-fd4rx\") pod \"kube-proxy-zsrrz\" (UID: \"93f5e148-37cc-4f75-ad33-7771a4bff319\") " pod="kube-system/kube-proxy-zsrrz" Jan 13 23:46:41.906851 systemd[1]: Created slice kubepods-besteffort-pod3afc597f_2fdd_4746_9d35_ec1a17412d3f.slice - libcontainer container kubepods-besteffort-pod3afc597f_2fdd_4746_9d35_ec1a17412d3f.slice. Jan 13 23:46:41.909652 kubelet[2907]: I0113 23:46:41.909621 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpml\" (UniqueName: \"kubernetes.io/projected/3afc597f-2fdd-4746-9d35-ec1a17412d3f-kube-api-access-4cpml\") pod \"tigera-operator-7dcd859c48-mfnf4\" (UID: \"3afc597f-2fdd-4746-9d35-ec1a17412d3f\") " pod="tigera-operator/tigera-operator-7dcd859c48-mfnf4" Jan 13 23:46:41.909837 kubelet[2907]: I0113 23:46:41.909820 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3afc597f-2fdd-4746-9d35-ec1a17412d3f-var-lib-calico\") pod \"tigera-operator-7dcd859c48-mfnf4\" (UID: \"3afc597f-2fdd-4746-9d35-ec1a17412d3f\") " pod="tigera-operator/tigera-operator-7dcd859c48-mfnf4" Jan 13 23:46:42.104160 containerd[1673]: time="2026-01-13T23:46:42.104038369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zsrrz,Uid:93f5e148-37cc-4f75-ad33-7771a4bff319,Namespace:kube-system,Attempt:0,}" Jan 13 23:46:42.123543 containerd[1673]: time="2026-01-13T23:46:42.123504268Z" level=info msg="connecting to shim a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670" address="unix:///run/containerd/s/39f98b07db4833669528a495782dfcdf980663453d916ebed9fd0d943beea130" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:42.158343 systemd[1]: Started cri-containerd-a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670.scope - libcontainer 
container a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670. Jan 13 23:46:42.166000 audit: BPF prog-id=133 op=LOAD Jan 13 23:46:42.168283 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 13 23:46:42.168334 kernel: audit: type=1334 audit(1768348002.166:438): prog-id=133 op=LOAD Jan 13 23:46:42.168000 audit: BPF prog-id=134 op=LOAD Jan 13 23:46:42.170757 kernel: audit: type=1334 audit(1768348002.168:439): prog-id=134 op=LOAD Jan 13 23:46:42.170846 kernel: audit: type=1300 audit(1768348002.168:439): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.168000 audit[2991]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.177138 kernel: audit: type=1327 audit(1768348002.168:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.177188 kernel: audit: type=1334 audit(1768348002.168:440): prog-id=134 op=UNLOAD Jan 13 23:46:42.168000 audit: BPF prog-id=134 op=UNLOAD Jan 13 23:46:42.168000 audit[2991]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.181453 kernel: audit: type=1300 audit(1768348002.168:440): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.184542 kernel: audit: type=1327 audit(1768348002.168:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.184704 kernel: audit: type=1334 audit(1768348002.168:441): prog-id=135 op=LOAD Jan 13 23:46:42.168000 audit: BPF prog-id=135 op=LOAD Jan 13 23:46:42.168000 audit[2991]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.188482 kernel: audit: type=1300 audit(1768348002.168:441): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.191953 kernel: audit: type=1327 audit(1768348002.168:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.169000 audit: BPF prog-id=136 op=LOAD Jan 13 23:46:42.169000 audit[2991]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.169000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.172000 audit: BPF prog-id=136 op=UNLOAD Jan 13 23:46:42.172000 audit[2991]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.172000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.172000 audit: BPF prog-id=135 op=UNLOAD Jan 13 23:46:42.172000 audit[2991]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.172000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.172000 audit: BPF prog-id=137 op=LOAD Jan 13 23:46:42.172000 audit[2991]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2980 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.172000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137323031386665636565383333306332613465306130376338386336 Jan 13 23:46:42.202868 containerd[1673]: time="2026-01-13T23:46:42.202748907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zsrrz,Uid:93f5e148-37cc-4f75-ad33-7771a4bff319,Namespace:kube-system,Attempt:0,} returns sandbox id \"a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670\"" Jan 13 23:46:42.206331 containerd[1673]: 
time="2026-01-13T23:46:42.206293998Z" level=info msg="CreateContainer within sandbox \"a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 13 23:46:42.210946 containerd[1673]: time="2026-01-13T23:46:42.210910732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mfnf4,Uid:3afc597f-2fdd-4746-9d35-ec1a17412d3f,Namespace:tigera-operator,Attempt:0,}" Jan 13 23:46:42.217085 containerd[1673]: time="2026-01-13T23:46:42.216451029Z" level=info msg="Container 30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:46:42.228596 containerd[1673]: time="2026-01-13T23:46:42.228518465Z" level=info msg="CreateContainer within sandbox \"a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb\"" Jan 13 23:46:42.229072 containerd[1673]: time="2026-01-13T23:46:42.229025827Z" level=info msg="StartContainer for \"30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb\"" Jan 13 23:46:42.231339 containerd[1673]: time="2026-01-13T23:46:42.231309954Z" level=info msg="connecting to shim 30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb" address="unix:///run/containerd/s/39f98b07db4833669528a495782dfcdf980663453d916ebed9fd0d943beea130" protocol=ttrpc version=3 Jan 13 23:46:42.245100 containerd[1673]: time="2026-01-13T23:46:42.244882714Z" level=info msg="connecting to shim a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29" address="unix:///run/containerd/s/1107ef2b7f3507e75436b23ad64fc933b92ebe26f3b598af600d9a69877dc0a2" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:42.255304 systemd[1]: Started cri-containerd-30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb.scope - libcontainer container 
30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb. Jan 13 23:46:42.272297 systemd[1]: Started cri-containerd-a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29.scope - libcontainer container a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29. Jan 13 23:46:42.281000 audit: BPF prog-id=138 op=LOAD Jan 13 23:46:42.281000 audit: BPF prog-id=139 op=LOAD Jan 13 23:46:42.281000 audit[3048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.281000 audit: BPF prog-id=139 op=UNLOAD Jan 13 23:46:42.281000 audit[3048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.281000 audit: BPF prog-id=140 op=LOAD Jan 13 23:46:42.281000 audit[3048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.281000 audit: BPF prog-id=141 op=LOAD Jan 13 23:46:42.281000 audit[3048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.282000 audit: BPF prog-id=141 op=UNLOAD Jan 13 23:46:42.282000 audit[3048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.282000 audit: BPF prog-id=140 op=UNLOAD Jan 13 23:46:42.282000 audit[3048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.282000 audit: BPF prog-id=142 op=LOAD Jan 13 23:46:42.282000 audit[3048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3037 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131323032323533353264666431306235663237363530613636646333 Jan 13 23:46:42.305809 containerd[1673]: time="2026-01-13T23:46:42.305748338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mfnf4,Uid:3afc597f-2fdd-4746-9d35-ec1a17412d3f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29\"" Jan 13 23:46:42.307359 containerd[1673]: time="2026-01-13T23:46:42.307285783Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 13 23:46:42.307000 audit: BPF prog-id=143 op=LOAD Jan 13 23:46:42.307000 audit[3016]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2980 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.307000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330653737333462353935343535356666616262356632356432313234 Jan 13 23:46:42.307000 audit: BPF prog-id=144 op=LOAD Jan 13 23:46:42.307000 audit[3016]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2980 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330653737333462353935343535356666616262356632356432313234 Jan 13 23:46:42.307000 audit: BPF prog-id=144 op=UNLOAD Jan 13 23:46:42.307000 audit[3016]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330653737333462353935343535356666616262356632356432313234 Jan 13 23:46:42.307000 audit: BPF prog-id=143 op=UNLOAD Jan 13 23:46:42.307000 audit[3016]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2980 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 13 23:46:42.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330653737333462353935343535356666616262356632356432313234 Jan 13 23:46:42.307000 audit: BPF prog-id=145 op=LOAD Jan 13 23:46:42.307000 audit[3016]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2980 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330653737333462353935343535356666616262356632356432313234 Jan 13 23:46:42.328532 containerd[1673]: time="2026-01-13T23:46:42.328492487Z" level=info msg="StartContainer for \"30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb\" returns successfully" Jan 13 23:46:42.487000 audit[3125]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.487000 audit[3125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffeedc3760 a2=0 a3=1 items=0 ppid=3061 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.487000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 13 23:46:42.489000 audit[3127]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3127 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.489000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdbaa5ce0 a2=0 a3=1 items=0 ppid=3061 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.489000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 13 23:46:42.489000 audit[3126]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.489000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcc3a0080 a2=0 a3=1 items=0 ppid=3061 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.489000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 13 23:46:42.491000 audit[3128]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.491000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe372b8e0 a2=0 a3=1 items=0 ppid=3061 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.491000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 13 23:46:42.493000 audit[3129]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain 
pid=3129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.493000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe495efb0 a2=0 a3=1 items=0 ppid=3061 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.493000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 13 23:46:42.495000 audit[3131]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.495000 audit[3131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe7ae1790 a2=0 a3=1 items=0 ppid=3061 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.495000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 13 23:46:42.589000 audit[3132]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.589000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff7c44c30 a2=0 a3=1 items=0 ppid=3061 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.589000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 13 23:46:42.592000 audit[3134]: NETFILTER_CFG table=filter:61 family=2 entries=1 
op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.592000 audit[3134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd29e3a70 a2=0 a3=1 items=0 ppid=3061 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.592000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 13 23:46:42.596000 audit[3137]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.596000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffefd2ebb0 a2=0 a3=1 items=0 ppid=3061 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.596000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 13 23:46:42.597000 audit[3138]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.597000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd5d27c00 a2=0 a3=1 items=0 ppid=3061 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.597000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 13 23:46:42.599000 audit[3140]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.599000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd7c708c0 a2=0 a3=1 items=0 ppid=3061 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.599000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 13 23:46:42.600000 audit[3141]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.600000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe9d386a0 a2=0 a3=1 items=0 ppid=3061 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.600000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 13 23:46:42.603000 audit[3143]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.603000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffcaf2f6e0 a2=0 a3=1 items=0 ppid=3061 
pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.603000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 13 23:46:42.606000 audit[3146]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.606000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc66d05a0 a2=0 a3=1 items=0 ppid=3061 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.606000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 13 23:46:42.608000 audit[3147]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc20a3ce0 a2=0 a3=1 items=0 ppid=3061 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.608000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 13 
23:46:42.610000 audit[3149]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.610000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd5503ef0 a2=0 a3=1 items=0 ppid=3061 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.610000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 13 23:46:42.611000 audit[3150]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.611000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc160d120 a2=0 a3=1 items=0 ppid=3061 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.611000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 13 23:46:42.614000 audit[3152]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.614000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffd802ff0 a2=0 a3=1 items=0 ppid=3061 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.614000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 13 23:46:42.617000 audit[3155]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.617000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffce027c60 a2=0 a3=1 items=0 ppid=3061 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.617000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 13 23:46:42.620000 audit[3158]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.620000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd7639ff0 a2=0 a3=1 items=0 ppid=3061 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.620000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 13 23:46:42.621000 audit[3159]: NETFILTER_CFG table=nat:74 family=2 entries=1 
op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.621000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd2b65360 a2=0 a3=1 items=0 ppid=3061 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.621000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 13 23:46:42.624000 audit[3161]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.624000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe74462e0 a2=0 a3=1 items=0 ppid=3061 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.624000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 13 23:46:42.627000 audit[3164]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3164 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.627000 audit[3164]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe36ede90 a2=0 a3=1 items=0 ppid=3061 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.627000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 13 23:46:42.628000 audit[3165]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.628000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff83a1fa0 a2=0 a3=1 items=0 ppid=3061 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.628000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 13 23:46:42.631000 audit[3167]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 13 23:46:42.631000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffc9856300 a2=0 a3=1 items=0 ppid=3061 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.631000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 13 23:46:42.656000 audit[3174]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3174 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:42.656000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe6c711b0 a2=0 a3=1 items=0 ppid=3061 pid=3174 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.656000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:42.672000 audit[3174]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3174 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:42.672000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffe6c711b0 a2=0 a3=1 items=0 ppid=3061 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.672000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:42.673000 audit[3179]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.673000 audit[3179]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffdbd05a90 a2=0 a3=1 items=0 ppid=3061 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.673000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 13 23:46:42.676000 audit[3181]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.676000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffd123e670 
a2=0 a3=1 items=0 ppid=3061 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.676000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 13 23:46:42.680000 audit[3184]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.680000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffe1e7b0e0 a2=0 a3=1 items=0 ppid=3061 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.680000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 13 23:46:42.681000 audit[3185]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.681000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd35e8180 a2=0 a3=1 items=0 ppid=3061 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.681000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 13 23:46:42.683000 audit[3187]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.683000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffd5a1790 a2=0 a3=1 items=0 ppid=3061 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.683000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 13 23:46:42.684000 audit[3188]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.684000 audit[3188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd7faa6c0 a2=0 a3=1 items=0 ppid=3061 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.684000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 13 23:46:42.687000 audit[3190]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.687000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffeeb66850 a2=0 a3=1 items=0 ppid=3061 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.687000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 13 23:46:42.690000 audit[3193]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.690000 audit[3193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffc2e05fd0 a2=0 a3=1 items=0 ppid=3061 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.690000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 13 23:46:42.692000 audit[3194]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.692000 audit[3194]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5e9be90 a2=0 a3=1 items=0 ppid=3061 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.692000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 13 23:46:42.694000 audit[3196]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule 
pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.694000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff9f4e380 a2=0 a3=1 items=0 ppid=3061 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.694000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 13 23:46:42.695000 audit[3197]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.695000 audit[3197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe6093490 a2=0 a3=1 items=0 ppid=3061 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.695000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 13 23:46:42.697000 audit[3199]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.697000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff9c8cef0 a2=0 a3=1 items=0 ppid=3061 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.697000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 13 23:46:42.701000 audit[3202]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.701000 audit[3202]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff0a57a90 a2=0 a3=1 items=0 ppid=3061 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.701000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 13 23:46:42.704000 audit[3205]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.704000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc31edf50 a2=0 a3=1 items=0 ppid=3061 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.704000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 13 23:46:42.705000 audit[3206]: NETFILTER_CFG table=nat:95 family=10 entries=1 
op=nft_register_chain pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.705000 audit[3206]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe561dc00 a2=0 a3=1 items=0 ppid=3061 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.705000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 13 23:46:42.707000 audit[3208]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.707000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe5555d50 a2=0 a3=1 items=0 ppid=3061 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.707000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 13 23:46:42.711000 audit[3211]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.711000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff29ea570 a2=0 a3=1 items=0 ppid=3061 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.711000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 13 23:46:42.712000 audit[3212]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.712000 audit[3212]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffeca7950 a2=0 a3=1 items=0 ppid=3061 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.712000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 13 23:46:42.714000 audit[3214]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.714000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe7ab6d60 a2=0 a3=1 items=0 ppid=3061 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.714000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 13 23:46:42.716000 audit[3215]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.716000 audit[3215]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff9ee6a10 a2=0 a3=1 items=0 ppid=3061 
pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.716000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 13 23:46:42.718000 audit[3217]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.718000 audit[3217]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd926a150 a2=0 a3=1 items=0 ppid=3061 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.718000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 13 23:46:42.722000 audit[3220]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 13 23:46:42.722000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd8a82ca0 a2=0 a3=1 items=0 ppid=3061 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.722000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 13 23:46:42.725000 audit[3222]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 13 23:46:42.725000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 
a1=ffffc33f9720 a2=0 a3=1 items=0 ppid=3061 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.725000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:42.726000 audit[3222]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 13 23:46:42.726000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffc33f9720 a2=0 a3=1 items=0 ppid=3061 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:42.726000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:42.924418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2770490528.mount: Deactivated successfully. Jan 13 23:46:42.934693 kubelet[2907]: I0113 23:46:42.934630 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zsrrz" podStartSLOduration=1.934611436 podStartE2EDuration="1.934611436s" podCreationTimestamp="2026-01-13 23:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:46:42.934291675 +0000 UTC m=+7.194607242" watchObservedRunningTime="2026-01-13 23:46:42.934611436 +0000 UTC m=+7.194926963" Jan 13 23:46:43.976034 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount589209978.mount: Deactivated successfully. 
Jan 13 23:46:44.256073 containerd[1673]: time="2026-01-13T23:46:44.255698463Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:44.256816 containerd[1673]: time="2026-01-13T23:46:44.256551786Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 13 23:46:44.257688 containerd[1673]: time="2026-01-13T23:46:44.257648509Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:44.260140 containerd[1673]: time="2026-01-13T23:46:44.260109117Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:46:44.260722 containerd[1673]: time="2026-01-13T23:46:44.260700999Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.953378776s" Jan 13 23:46:44.260818 containerd[1673]: time="2026-01-13T23:46:44.260800599Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 13 23:46:44.263756 containerd[1673]: time="2026-01-13T23:46:44.263720408Z" level=info msg="CreateContainer within sandbox \"a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 13 23:46:44.273284 containerd[1673]: time="2026-01-13T23:46:44.273239436Z" level=info msg="Container 
78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:46:44.277554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1101037274.mount: Deactivated successfully. Jan 13 23:46:44.280961 containerd[1673]: time="2026-01-13T23:46:44.280913740Z" level=info msg="CreateContainer within sandbox \"a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b\"" Jan 13 23:46:44.281500 containerd[1673]: time="2026-01-13T23:46:44.281455181Z" level=info msg="StartContainer for \"78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b\"" Jan 13 23:46:44.282623 containerd[1673]: time="2026-01-13T23:46:44.282588825Z" level=info msg="connecting to shim 78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b" address="unix:///run/containerd/s/1107ef2b7f3507e75436b23ad64fc933b92ebe26f3b598af600d9a69877dc0a2" protocol=ttrpc version=3 Jan 13 23:46:44.304448 systemd[1]: Started cri-containerd-78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b.scope - libcontainer container 78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b. 
Jan 13 23:46:44.313000 audit: BPF prog-id=146 op=LOAD Jan 13 23:46:44.313000 audit: BPF prog-id=147 op=LOAD Jan 13 23:46:44.313000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.313000 audit: BPF prog-id=147 op=UNLOAD Jan 13 23:46:44.313000 audit[3231]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.313000 audit: BPF prog-id=148 op=LOAD Jan 13 23:46:44.313000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.313000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.313000 audit: BPF prog-id=149 op=LOAD Jan 13 23:46:44.313000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.314000 audit: BPF prog-id=149 op=UNLOAD Jan 13 23:46:44.314000 audit[3231]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.314000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.314000 audit: BPF prog-id=148 op=UNLOAD Jan 13 23:46:44.314000 audit[3231]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:46:44.314000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.314000 audit: BPF prog-id=150 op=LOAD Jan 13 23:46:44.314000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3037 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:44.314000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738666139343665316435336466333939333838643036383633653639 Jan 13 23:46:44.335577 containerd[1673]: time="2026-01-13T23:46:44.335538184Z" level=info msg="StartContainer for \"78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b\" returns successfully" Jan 13 23:46:44.935945 kubelet[2907]: I0113 23:46:44.935834 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-mfnf4" podStartSLOduration=1.9811317370000001 podStartE2EDuration="3.935816036s" podCreationTimestamp="2026-01-13 23:46:41 +0000 UTC" firstStartedPulling="2026-01-13 23:46:42.306890342 +0000 UTC m=+6.567205869" lastFinishedPulling="2026-01-13 23:46:44.261574601 +0000 UTC m=+8.521890168" observedRunningTime="2026-01-13 23:46:44.934964794 +0000 UTC m=+9.195280361" watchObservedRunningTime="2026-01-13 23:46:44.935816036 +0000 UTC m=+9.196131603" Jan 13 23:46:49.447000 audit[1960]: USER_END pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 13 23:46:49.448241 sudo[1960]: pam_unix(sudo:session): session closed for user root Jan 13 23:46:49.451425 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 13 23:46:49.451496 kernel: audit: type=1106 audit(1768348009.447:518): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 13 23:46:49.447000 audit[1960]: CRED_DISP pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 13 23:46:49.454172 kernel: audit: type=1104 audit(1768348009.447:519): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 13 23:46:49.546093 sshd[1959]: Connection closed by 20.161.92.111 port 44782 Jan 13 23:46:49.547879 sshd-session[1955]: pam_unix(sshd:session): session closed for user core Jan 13 23:46:49.548000 audit[1955]: USER_END pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 13 23:46:49.548000 audit[1955]: CRED_DISP pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 13 23:46:49.553385 systemd[1]: sshd@10-10.0.21.248:22-20.161.92.111:44782.service: Deactivated successfully. Jan 13 23:46:49.555247 systemd[1]: session-10.scope: Deactivated successfully. Jan 13 23:46:49.555601 systemd[1]: session-10.scope: Consumed 5.833s CPU time, 220.3M memory peak. 
Jan 13 23:46:49.555765 kernel: audit: type=1106 audit(1768348009.548:520): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 13 23:46:49.555805 kernel: audit: type=1104 audit(1768348009.548:521): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 13 23:46:49.556082 kernel: audit: type=1131 audit(1768348009.552:522): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.248:22-20.161.92.111:44782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:49.552000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.248:22-20.161.92.111:44782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 13 23:46:49.558664 systemd-logind[1650]: Session 10 logged out. Waiting for processes to exit. Jan 13 23:46:49.560125 systemd-logind[1650]: Removed session 10. 
Jan 13 23:46:51.306000 audit[3323]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.306000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff5a15c50 a2=0 a3=1 items=0 ppid=3061 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.313079 kernel: audit: type=1325 audit(1768348011.306:523): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.313216 kernel: audit: type=1300 audit(1768348011.306:523): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff5a15c50 a2=0 a3=1 items=0 ppid=3061 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.306000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:51.315718 kernel: audit: type=1327 audit(1768348011.306:523): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:51.318000 audit[3323]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.318000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff5a15c50 a2=0 a3=1 items=0 ppid=3061 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.325356 
kernel: audit: type=1325 audit(1768348011.318:524): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.325471 kernel: audit: type=1300 audit(1768348011.318:524): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff5a15c50 a2=0 a3=1 items=0 ppid=3061 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.318000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:51.330000 audit[3325]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.330000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd0a82680 a2=0 a3=1 items=0 ppid=3061 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.330000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:51.339000 audit[3325]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:51.339000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd0a82680 a2=0 a3=1 items=0 ppid=3061 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:51.339000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.462000 audit[3327]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.465500 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 13 23:46:55.465561 kernel: audit: type=1325 audit(1768348015.462:527): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.462000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff515e390 a2=0 a3=1 items=0 ppid=3061 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.470482 kernel: audit: type=1300 audit(1768348015.462:527): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff515e390 a2=0 a3=1 items=0 ppid=3061 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.470546 kernel: audit: type=1327 audit(1768348015.462:527): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.462000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.474000 audit[3327]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.474000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff515e390 a2=0 a3=1 items=0 ppid=3061 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.482251 kernel: audit: type=1325 audit(1768348015.474:528): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.482351 kernel: audit: type=1300 audit(1768348015.474:528): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff515e390 a2=0 a3=1 items=0 ppid=3061 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.474000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.483950 kernel: audit: type=1327 audit(1768348015.474:528): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.499000 audit[3330]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.503100 kernel: audit: type=1325 audit(1768348015.499:529): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.503203 kernel: audit: type=1300 audit(1768348015.499:529): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffcc166080 a2=0 a3=1 items=0 ppid=3061 pid=3330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.499000 audit[3330]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffcc166080 a2=0 a3=1 items=0 ppid=3061 pid=3330 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.499000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.508270 kernel: audit: type=1327 audit(1768348015.499:529): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.509000 audit[3330]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:55.509000 audit[3330]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcc166080 a2=0 a3=1 items=0 ppid=3061 pid=3330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:55.509000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:55.512101 kernel: audit: type=1325 audit(1768348015.509:530): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:56.528000 audit[3332]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:56.528000 audit[3332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd83685e0 a2=0 a3=1 items=0 ppid=3061 pid=3332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:56.528000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:56.542000 audit[3332]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:56.542000 audit[3332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd83685e0 a2=0 a3=1 items=0 ppid=3061 pid=3332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:56.542000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:58.312000 audit[3335]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:58.312000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd2e71fb0 a2=0 a3=1 items=0 ppid=3061 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.312000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:58.317000 audit[3335]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:58.317000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd2e71fb0 a2=0 a3=1 items=0 ppid=3061 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 13 23:46:58.317000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:58.343301 systemd[1]: Created slice kubepods-besteffort-podb524f6ed_f03e_406a_80d1_4cde1e342848.slice - libcontainer container kubepods-besteffort-podb524f6ed_f03e_406a_80d1_4cde1e342848.slice. Jan 13 23:46:58.418030 kubelet[2907]: I0113 23:46:58.417893 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndjn\" (UniqueName: \"kubernetes.io/projected/b524f6ed-f03e-406a-80d1-4cde1e342848-kube-api-access-wndjn\") pod \"calico-typha-7d78848979-bb5h7\" (UID: \"b524f6ed-f03e-406a-80d1-4cde1e342848\") " pod="calico-system/calico-typha-7d78848979-bb5h7" Jan 13 23:46:58.418030 kubelet[2907]: I0113 23:46:58.418038 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b524f6ed-f03e-406a-80d1-4cde1e342848-typha-certs\") pod \"calico-typha-7d78848979-bb5h7\" (UID: \"b524f6ed-f03e-406a-80d1-4cde1e342848\") " pod="calico-system/calico-typha-7d78848979-bb5h7" Jan 13 23:46:58.418735 kubelet[2907]: I0113 23:46:58.418109 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b524f6ed-f03e-406a-80d1-4cde1e342848-tigera-ca-bundle\") pod \"calico-typha-7d78848979-bb5h7\" (UID: \"b524f6ed-f03e-406a-80d1-4cde1e342848\") " pod="calico-system/calico-typha-7d78848979-bb5h7" Jan 13 23:46:58.580267 systemd[1]: Created slice kubepods-besteffort-pod7f0c8458_83c8_4b60_a848_cd1d2a69ea21.slice - libcontainer container kubepods-besteffort-pod7f0c8458_83c8_4b60_a848_cd1d2a69ea21.slice. 
Jan 13 23:46:58.619444 kubelet[2907]: I0113 23:46:58.619336 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-flexvol-driver-host\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619444 kubelet[2907]: I0113 23:46:58.619404 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-tigera-ca-bundle\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619444 kubelet[2907]: I0113 23:46:58.619440 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-cni-log-dir\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619828 kubelet[2907]: I0113 23:46:58.619466 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-cni-net-dir\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619828 kubelet[2907]: I0113 23:46:58.619493 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-var-run-calico\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619828 kubelet[2907]: I0113 23:46:58.619519 
2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnwk\" (UniqueName: \"kubernetes.io/projected/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-kube-api-access-wwnwk\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619828 kubelet[2907]: I0113 23:46:58.619549 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-cni-bin-dir\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.619828 kubelet[2907]: I0113 23:46:58.619574 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-policysync\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.620002 kubelet[2907]: I0113 23:46:58.619607 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-xtables-lock\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.620002 kubelet[2907]: I0113 23:46:58.619714 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-node-certs\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.620002 kubelet[2907]: I0113 23:46:58.619764 2907 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-lib-modules\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.620002 kubelet[2907]: I0113 23:46:58.619831 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7f0c8458-83c8-4b60-a848-cd1d2a69ea21-var-lib-calico\") pod \"calico-node-vxg9g\" (UID: \"7f0c8458-83c8-4b60-a848-cd1d2a69ea21\") " pod="calico-system/calico-node-vxg9g" Jan 13 23:46:58.647287 containerd[1673]: time="2026-01-13T23:46:58.647040179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d78848979-bb5h7,Uid:b524f6ed-f03e-406a-80d1-4cde1e342848,Namespace:calico-system,Attempt:0,}" Jan 13 23:46:58.669766 containerd[1673]: time="2026-01-13T23:46:58.669629527Z" level=info msg="connecting to shim 196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89" address="unix:///run/containerd/s/3b46469534a840d234e5b939ed39b1c6dd48ef95069929ef7288f47883f73d78" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:58.691261 systemd[1]: Started cri-containerd-196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89.scope - libcontainer container 196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89. 
Jan 13 23:46:58.700000 audit: BPF prog-id=151 op=LOAD Jan 13 23:46:58.701000 audit: BPF prog-id=152 op=LOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=152 op=UNLOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=153 op=LOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=154 op=LOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=154 op=UNLOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=153 op=UNLOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.701000 audit: BPF prog-id=155 op=LOAD Jan 13 23:46:58.701000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.701000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139366630396136366665633063633831303937653465643664396465 Jan 13 23:46:58.723107 kubelet[2907]: E0113 23:46:58.722536 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.723107 kubelet[2907]: W0113 23:46:58.722559 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.723107 kubelet[2907]: E0113 23:46:58.722579 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.725787 kubelet[2907]: E0113 23:46:58.725759 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.725900 kubelet[2907]: W0113 23:46:58.725885 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.725989 kubelet[2907]: E0113 23:46:58.725977 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.729007 containerd[1673]: time="2026-01-13T23:46:58.728975226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d78848979-bb5h7,Uid:b524f6ed-f03e-406a-80d1-4cde1e342848,Namespace:calico-system,Attempt:0,} returns sandbox id \"196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89\"" Jan 13 23:46:58.732275 containerd[1673]: time="2026-01-13T23:46:58.732244516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 13 23:46:58.736115 kubelet[2907]: E0113 23:46:58.736019 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.736213 kubelet[2907]: W0113 23:46:58.736119 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.736213 kubelet[2907]: E0113 23:46:58.736142 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.774841 kubelet[2907]: E0113 23:46:58.774451 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:46:58.803741 kubelet[2907]: E0113 23:46:58.803557 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.803741 kubelet[2907]: W0113 23:46:58.803603 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.803741 kubelet[2907]: E0113 23:46:58.803643 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.803983 kubelet[2907]: E0113 23:46:58.803969 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.804106 kubelet[2907]: W0113 23:46:58.804022 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.804178 kubelet[2907]: E0113 23:46:58.804165 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.804384 kubelet[2907]: E0113 23:46:58.804370 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.804462 kubelet[2907]: W0113 23:46:58.804449 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.804522 kubelet[2907]: E0113 23:46:58.804511 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.804836 kubelet[2907]: E0113 23:46:58.804723 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.804836 kubelet[2907]: W0113 23:46:58.804737 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.804836 kubelet[2907]: E0113 23:46:58.804748 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.804989 kubelet[2907]: E0113 23:46:58.804977 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.805049 kubelet[2907]: W0113 23:46:58.805037 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.805148 kubelet[2907]: E0113 23:46:58.805136 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.805380 kubelet[2907]: E0113 23:46:58.805366 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.805449 kubelet[2907]: W0113 23:46:58.805437 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.805502 kubelet[2907]: E0113 23:46:58.805493 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.805689 kubelet[2907]: E0113 23:46:58.805677 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.805847 kubelet[2907]: W0113 23:46:58.805749 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.805847 kubelet[2907]: E0113 23:46:58.805766 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.805974 kubelet[2907]: E0113 23:46:58.805961 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.806026 kubelet[2907]: W0113 23:46:58.806015 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.806103 kubelet[2907]: E0113 23:46:58.806092 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.806394 kubelet[2907]: E0113 23:46:58.806299 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.806394 kubelet[2907]: W0113 23:46:58.806311 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.806394 kubelet[2907]: E0113 23:46:58.806321 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.806552 kubelet[2907]: E0113 23:46:58.806540 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.806607 kubelet[2907]: W0113 23:46:58.806597 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.806664 kubelet[2907]: E0113 23:46:58.806653 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.806966 kubelet[2907]: E0113 23:46:58.806869 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.806966 kubelet[2907]: W0113 23:46:58.806881 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.806966 kubelet[2907]: E0113 23:46:58.806891 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.807153 kubelet[2907]: E0113 23:46:58.807141 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.807237 kubelet[2907]: W0113 23:46:58.807225 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.807386 kubelet[2907]: E0113 23:46:58.807283 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.807583 kubelet[2907]: E0113 23:46:58.807570 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.807680 kubelet[2907]: W0113 23:46:58.807667 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.807810 kubelet[2907]: E0113 23:46:58.807755 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.808010 kubelet[2907]: E0113 23:46:58.807997 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.808159 kubelet[2907]: W0113 23:46:58.808057 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.808159 kubelet[2907]: E0113 23:46:58.808103 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.808405 kubelet[2907]: E0113 23:46:58.808365 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.808405 kubelet[2907]: W0113 23:46:58.808377 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.808586 kubelet[2907]: E0113 23:46:58.808486 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.808810 kubelet[2907]: E0113 23:46:58.808765 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.808810 kubelet[2907]: W0113 23:46:58.808777 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.808810 kubelet[2907]: E0113 23:46:58.808787 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.809292 kubelet[2907]: E0113 23:46:58.809192 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.809292 kubelet[2907]: W0113 23:46:58.809205 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.809292 kubelet[2907]: E0113 23:46:58.809216 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.809459 kubelet[2907]: E0113 23:46:58.809447 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.809514 kubelet[2907]: W0113 23:46:58.809503 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.809566 kubelet[2907]: E0113 23:46:58.809556 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.809750 kubelet[2907]: E0113 23:46:58.809738 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.809823 kubelet[2907]: W0113 23:46:58.809811 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.809870 kubelet[2907]: E0113 23:46:58.809861 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.810188 kubelet[2907]: E0113 23:46:58.810045 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.810188 kubelet[2907]: W0113 23:46:58.810055 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.810188 kubelet[2907]: E0113 23:46:58.810118 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.821589 kubelet[2907]: E0113 23:46:58.821572 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.821686 kubelet[2907]: W0113 23:46:58.821672 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.821741 kubelet[2907]: E0113 23:46:58.821731 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.821814 kubelet[2907]: I0113 23:46:58.821801 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/89f76643-e37a-4094-9d85-ab46009d2c90-varrun\") pod \"csi-node-driver-p6cc5\" (UID: \"89f76643-e37a-4094-9d85-ab46009d2c90\") " pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:46:58.822041 kubelet[2907]: E0113 23:46:58.822026 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.822138 kubelet[2907]: W0113 23:46:58.822124 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.822205 kubelet[2907]: E0113 23:46:58.822195 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.822282 kubelet[2907]: I0113 23:46:58.822269 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89f76643-e37a-4094-9d85-ab46009d2c90-kubelet-dir\") pod \"csi-node-driver-p6cc5\" (UID: \"89f76643-e37a-4094-9d85-ab46009d2c90\") " pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:46:58.822441 kubelet[2907]: E0113 23:46:58.822397 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.822441 kubelet[2907]: W0113 23:46:58.822418 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.822441 kubelet[2907]: E0113 23:46:58.822438 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.822628 kubelet[2907]: E0113 23:46:58.822605 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.822628 kubelet[2907]: W0113 23:46:58.822616 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.822681 kubelet[2907]: E0113 23:46:58.822637 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.822812 kubelet[2907]: E0113 23:46:58.822790 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.822812 kubelet[2907]: W0113 23:46:58.822804 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.822862 kubelet[2907]: E0113 23:46:58.822817 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.822972 kubelet[2907]: E0113 23:46:58.822961 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823000 kubelet[2907]: W0113 23:46:58.822972 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823000 kubelet[2907]: E0113 23:46:58.822985 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.823129 kubelet[2907]: E0113 23:46:58.823118 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823160 kubelet[2907]: W0113 23:46:58.823128 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823160 kubelet[2907]: E0113 23:46:58.823136 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.823202 kubelet[2907]: I0113 23:46:58.823158 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhnfp\" (UniqueName: \"kubernetes.io/projected/89f76643-e37a-4094-9d85-ab46009d2c90-kube-api-access-vhnfp\") pod \"csi-node-driver-p6cc5\" (UID: \"89f76643-e37a-4094-9d85-ab46009d2c90\") " pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:46:58.823316 kubelet[2907]: E0113 23:46:58.823304 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823340 kubelet[2907]: W0113 23:46:58.823315 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823340 kubelet[2907]: E0113 23:46:58.823329 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.823380 kubelet[2907]: I0113 23:46:58.823343 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89f76643-e37a-4094-9d85-ab46009d2c90-socket-dir\") pod \"csi-node-driver-p6cc5\" (UID: \"89f76643-e37a-4094-9d85-ab46009d2c90\") " pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:46:58.823491 kubelet[2907]: E0113 23:46:58.823477 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823491 kubelet[2907]: W0113 23:46:58.823489 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823549 kubelet[2907]: E0113 23:46:58.823503 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.823549 kubelet[2907]: I0113 23:46:58.823516 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89f76643-e37a-4094-9d85-ab46009d2c90-registration-dir\") pod \"csi-node-driver-p6cc5\" (UID: \"89f76643-e37a-4094-9d85-ab46009d2c90\") " pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:46:58.823673 kubelet[2907]: E0113 23:46:58.823658 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823673 kubelet[2907]: W0113 23:46:58.823671 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823746 kubelet[2907]: E0113 23:46:58.823684 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.823819 kubelet[2907]: E0113 23:46:58.823806 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.823819 kubelet[2907]: W0113 23:46:58.823817 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.823867 kubelet[2907]: E0113 23:46:58.823830 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.823977 kubelet[2907]: E0113 23:46:58.823966 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.824001 kubelet[2907]: W0113 23:46:58.823977 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.824001 kubelet[2907]: E0113 23:46:58.823989 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.824127 kubelet[2907]: E0113 23:46:58.824116 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.824127 kubelet[2907]: W0113 23:46:58.824125 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.824183 kubelet[2907]: E0113 23:46:58.824137 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.824270 kubelet[2907]: E0113 23:46:58.824259 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.824270 kubelet[2907]: W0113 23:46:58.824269 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.824316 kubelet[2907]: E0113 23:46:58.824277 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.824407 kubelet[2907]: E0113 23:46:58.824397 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.824435 kubelet[2907]: W0113 23:46:58.824409 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.824435 kubelet[2907]: E0113 23:46:58.824417 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.884583 containerd[1673]: time="2026-01-13T23:46:58.884469215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vxg9g,Uid:7f0c8458-83c8-4b60-a848-cd1d2a69ea21,Namespace:calico-system,Attempt:0,}" Jan 13 23:46:58.912978 containerd[1673]: time="2026-01-13T23:46:58.912913701Z" level=info msg="connecting to shim 5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef" address="unix:///run/containerd/s/554e82987101ec8a68aa9eb996f7ca519d35c2c8864cacefe8a7205fff592571" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:46:58.924704 kubelet[2907]: E0113 23:46:58.924675 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.924704 kubelet[2907]: W0113 23:46:58.924694 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.924704 kubelet[2907]: E0113 23:46:58.924712 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.924908 kubelet[2907]: E0113 23:46:58.924888 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.924908 kubelet[2907]: W0113 23:46:58.924896 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.924950 kubelet[2907]: E0113 23:46:58.924910 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.925234 kubelet[2907]: E0113 23:46:58.925081 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.925234 kubelet[2907]: W0113 23:46:58.925093 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.925234 kubelet[2907]: E0113 23:46:58.925108 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.925347 kubelet[2907]: E0113 23:46:58.925256 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.925347 kubelet[2907]: W0113 23:46:58.925271 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.925347 kubelet[2907]: E0113 23:46:58.925281 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.925531 kubelet[2907]: E0113 23:46:58.925457 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.925531 kubelet[2907]: W0113 23:46:58.925468 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.925531 kubelet[2907]: E0113 23:46:58.925481 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.925667 kubelet[2907]: E0113 23:46:58.925649 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.925667 kubelet[2907]: W0113 23:46:58.925662 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.925721 kubelet[2907]: E0113 23:46:58.925675 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.925818 kubelet[2907]: E0113 23:46:58.925806 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.925842 kubelet[2907]: W0113 23:46:58.925818 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.925842 kubelet[2907]: E0113 23:46:58.925834 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.926024 kubelet[2907]: E0113 23:46:58.926010 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.926024 kubelet[2907]: W0113 23:46:58.926022 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.926105 kubelet[2907]: E0113 23:46:58.926046 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.926195 kubelet[2907]: E0113 23:46:58.926178 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.926195 kubelet[2907]: W0113 23:46:58.926191 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.926248 kubelet[2907]: E0113 23:46:58.926214 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.926383 kubelet[2907]: E0113 23:46:58.926371 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.926383 kubelet[2907]: W0113 23:46:58.926383 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.926439 kubelet[2907]: E0113 23:46:58.926427 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.926649 kubelet[2907]: E0113 23:46:58.926587 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.926649 kubelet[2907]: W0113 23:46:58.926598 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.926649 kubelet[2907]: E0113 23:46:58.926638 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.926792 kubelet[2907]: E0113 23:46:58.926778 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.926792 kubelet[2907]: W0113 23:46:58.926789 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.926875 kubelet[2907]: E0113 23:46:58.926863 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.927002 kubelet[2907]: E0113 23:46:58.926988 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.927002 kubelet[2907]: W0113 23:46:58.927001 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.927057 kubelet[2907]: E0113 23:46:58.927013 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.927661 kubelet[2907]: E0113 23:46:58.927402 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.927661 kubelet[2907]: W0113 23:46:58.927414 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.927661 kubelet[2907]: E0113 23:46:58.927428 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.927661 kubelet[2907]: E0113 23:46:58.927625 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.927661 kubelet[2907]: W0113 23:46:58.927632 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.927913 kubelet[2907]: E0113 23:46:58.927889 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.928160 kubelet[2907]: E0113 23:46:58.928144 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.928160 kubelet[2907]: W0113 23:46:58.928158 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.928288 kubelet[2907]: E0113 23:46:58.928195 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.928763 kubelet[2907]: E0113 23:46:58.928606 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.928763 kubelet[2907]: W0113 23:46:58.928630 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.928763 kubelet[2907]: E0113 23:46:58.928757 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.928910 kubelet[2907]: E0113 23:46:58.928892 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.928910 kubelet[2907]: W0113 23:46:58.928906 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.928956 kubelet[2907]: E0113 23:46:58.928936 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.929141 kubelet[2907]: E0113 23:46:58.929126 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.929141 kubelet[2907]: W0113 23:46:58.929138 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.929200 kubelet[2907]: E0113 23:46:58.929154 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.929377 kubelet[2907]: E0113 23:46:58.929364 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.929377 kubelet[2907]: W0113 23:46:58.929376 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.929430 kubelet[2907]: E0113 23:46:58.929392 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.929587 kubelet[2907]: E0113 23:46:58.929572 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.929587 kubelet[2907]: W0113 23:46:58.929584 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.929638 kubelet[2907]: E0113 23:46:58.929600 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.929776 kubelet[2907]: E0113 23:46:58.929762 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.929776 kubelet[2907]: W0113 23:46:58.929774 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.929827 kubelet[2907]: E0113 23:46:58.929789 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.929978 kubelet[2907]: E0113 23:46:58.929965 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.929978 kubelet[2907]: W0113 23:46:58.929976 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.930037 kubelet[2907]: E0113 23:46:58.929989 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.930154 kubelet[2907]: E0113 23:46:58.930140 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.930178 kubelet[2907]: W0113 23:46:58.930152 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.930178 kubelet[2907]: E0113 23:46:58.930161 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:46:58.930348 kubelet[2907]: E0113 23:46:58.930335 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.930385 kubelet[2907]: W0113 23:46:58.930349 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.930385 kubelet[2907]: E0113 23:46:58.930358 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.932273 systemd[1]: Started cri-containerd-5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef.scope - libcontainer container 5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef. Jan 13 23:46:58.940000 audit: BPF prog-id=156 op=LOAD Jan 13 23:46:58.941000 audit: BPF prog-id=157 op=LOAD Jan 13 23:46:58.941000 audit[3458]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.941000 audit: BPF prog-id=157 op=UNLOAD Jan 13 23:46:58.941000 audit[3458]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.943440 kubelet[2907]: E0113 23:46:58.942799 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:46:58.943440 kubelet[2907]: W0113 23:46:58.942855 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:46:58.943440 kubelet[2907]: E0113 23:46:58.942873 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:46:58.941000 audit: BPF prog-id=158 op=LOAD Jan 13 23:46:58.941000 audit[3458]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.942000 audit: BPF prog-id=159 op=LOAD Jan 13 23:46:58.942000 audit[3458]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.943000 audit: BPF prog-id=159 op=UNLOAD Jan 13 23:46:58.943000 audit[3458]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.943000 audit: BPF prog-id=158 op=UNLOAD Jan 13 23:46:58.943000 audit[3458]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.943000 audit: BPF prog-id=160 op=LOAD Jan 13 23:46:58.943000 audit[3458]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3446 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:58.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565383363316566313230633834343539373330633236336165643262 Jan 13 23:46:58.957812 containerd[1673]: time="2026-01-13T23:46:58.957765517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vxg9g,Uid:7f0c8458-83c8-4b60-a848-cd1d2a69ea21,Namespace:calico-system,Attempt:0,} returns sandbox id \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\"" Jan 13 23:46:59.329000 audit[3511]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3511 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:59.329000 audit[3511]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc7f78e40 a2=0 a3=1 items=0 ppid=3061 pid=3511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:59.329000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:59.336000 audit[3511]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3511 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:46:59.336000 audit[3511]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc7f78e40 a2=0 a3=1 items=0 ppid=3061 pid=3511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:46:59.336000 audit: 
PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:46:59.880661 kubelet[2907]: E0113 23:46:59.880208 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:00.237765 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3216580450.mount: Deactivated successfully. Jan 13 23:47:00.949741 containerd[1673]: time="2026-01-13T23:47:00.949603808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:00.951535 containerd[1673]: time="2026-01-13T23:47:00.951483694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 13 23:47:00.953387 containerd[1673]: time="2026-01-13T23:47:00.953359860Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:00.955471 containerd[1673]: time="2026-01-13T23:47:00.955422106Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:00.956107 containerd[1673]: time="2026-01-13T23:47:00.956076068Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size 
\"33090541\" in 2.223789672s" Jan 13 23:47:00.956171 containerd[1673]: time="2026-01-13T23:47:00.956112108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 13 23:47:00.958243 containerd[1673]: time="2026-01-13T23:47:00.958214834Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 13 23:47:00.965245 containerd[1673]: time="2026-01-13T23:47:00.965213255Z" level=info msg="CreateContainer within sandbox \"196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 13 23:47:00.973090 containerd[1673]: time="2026-01-13T23:47:00.972450237Z" level=info msg="Container 067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:47:00.980842 containerd[1673]: time="2026-01-13T23:47:00.980806942Z" level=info msg="CreateContainer within sandbox \"196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045\"" Jan 13 23:47:00.981378 containerd[1673]: time="2026-01-13T23:47:00.981349624Z" level=info msg="StartContainer for \"067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045\"" Jan 13 23:47:00.982817 containerd[1673]: time="2026-01-13T23:47:00.982790388Z" level=info msg="connecting to shim 067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045" address="unix:///run/containerd/s/3b46469534a840d234e5b939ed39b1c6dd48ef95069929ef7288f47883f73d78" protocol=ttrpc version=3 Jan 13 23:47:01.006267 systemd[1]: Started cri-containerd-067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045.scope - libcontainer container 067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045. 
Jan 13 23:47:01.019722 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 13 23:47:01.019816 kernel: audit: type=1334 audit(1768348021.017:553): prog-id=161 op=LOAD Jan 13 23:47:01.017000 audit: BPF prog-id=161 op=LOAD Jan 13 23:47:01.017000 audit: BPF prog-id=162 op=LOAD Jan 13 23:47:01.020806 kernel: audit: type=1334 audit(1768348021.017:554): prog-id=162 op=LOAD Jan 13 23:47:01.017000 audit[3522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.024495 kernel: audit: type=1300 audit(1768348021.017:554): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.024567 kernel: audit: type=1327 audit(1768348021.017:554): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.017000 audit: BPF prog-id=162 op=UNLOAD Jan 13 23:47:01.028860 kernel: audit: type=1334 audit(1768348021.017:555): prog-id=162 op=UNLOAD Jan 13 23:47:01.017000 audit[3522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3522 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.032330 kernel: audit: type=1300 audit(1768348021.017:555): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.035727 kernel: audit: type=1327 audit(1768348021.017:555): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.017000 audit: BPF prog-id=163 op=LOAD Jan 13 23:47:01.036665 kernel: audit: type=1334 audit(1768348021.017:556): prog-id=163 op=LOAD Jan 13 23:47:01.036720 kernel: audit: type=1300 audit(1768348021.017:556): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.017000 audit[3522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:01.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.043056 kernel: audit: type=1327 audit(1768348021.017:556): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.018000 audit: BPF prog-id=164 op=LOAD Jan 13 23:47:01.018000 audit[3522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.018000 audit: BPF prog-id=164 op=UNLOAD Jan 13 23:47:01.018000 audit[3522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.018000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.018000 audit: BPF prog-id=163 op=UNLOAD Jan 13 23:47:01.018000 audit[3522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.018000 audit: BPF prog-id=165 op=LOAD Jan 13 23:47:01.018000 audit[3522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3347 pid=3522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036376261643631326561376561353437653763633035633936643933 Jan 13 23:47:01.066506 containerd[1673]: time="2026-01-13T23:47:01.066470921Z" level=info msg="StartContainer for \"067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045\" returns successfully" Jan 13 23:47:01.878808 kubelet[2907]: E0113 23:47:01.878490 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:01.974446 kubelet[2907]: I0113 23:47:01.974332 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7d78848979-bb5h7" podStartSLOduration=1.748301463 podStartE2EDuration="3.974296461s" podCreationTimestamp="2026-01-13 23:46:58 +0000 UTC" firstStartedPulling="2026-01-13 23:46:58.731805235 +0000 UTC m=+22.992120762" lastFinishedPulling="2026-01-13 23:47:00.957800193 +0000 UTC m=+25.218115760" observedRunningTime="2026-01-13 23:47:01.974178861 +0000 UTC m=+26.234494428" watchObservedRunningTime="2026-01-13 23:47:01.974296461 +0000 UTC m=+26.234612028" Jan 13 23:47:02.031847 kubelet[2907]: E0113 23:47:02.031764 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.031847 kubelet[2907]: W0113 23:47:02.031793 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.031847 kubelet[2907]: E0113 23:47:02.031815 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.032261 kubelet[2907]: E0113 23:47:02.031978 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032261 kubelet[2907]: W0113 23:47:02.031986 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032261 kubelet[2907]: E0113 23:47:02.032026 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.032261 kubelet[2907]: E0113 23:47:02.032196 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032261 kubelet[2907]: W0113 23:47:02.032204 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032261 kubelet[2907]: E0113 23:47:02.032212 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.032457 kubelet[2907]: E0113 23:47:02.032359 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032457 kubelet[2907]: W0113 23:47:02.032367 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032457 kubelet[2907]: E0113 23:47:02.032374 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.032536 kubelet[2907]: E0113 23:47:02.032518 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032536 kubelet[2907]: W0113 23:47:02.032530 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032640 kubelet[2907]: E0113 23:47:02.032539 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.032764 kubelet[2907]: E0113 23:47:02.032715 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032764 kubelet[2907]: W0113 23:47:02.032735 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032764 kubelet[2907]: E0113 23:47:02.032756 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.032953 kubelet[2907]: E0113 23:47:02.032926 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.032953 kubelet[2907]: W0113 23:47:02.032941 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.032998 kubelet[2907]: E0113 23:47:02.032956 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.033144 kubelet[2907]: E0113 23:47:02.033130 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.033183 kubelet[2907]: W0113 23:47:02.033148 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.033183 kubelet[2907]: E0113 23:47:02.033159 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.033355 kubelet[2907]: E0113 23:47:02.033338 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.033355 kubelet[2907]: W0113 23:47:02.033353 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.033405 kubelet[2907]: E0113 23:47:02.033376 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.033536 kubelet[2907]: E0113 23:47:02.033523 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.033564 kubelet[2907]: W0113 23:47:02.033547 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.033564 kubelet[2907]: E0113 23:47:02.033558 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.033776 kubelet[2907]: E0113 23:47:02.033757 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.033811 kubelet[2907]: W0113 23:47:02.033787 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.033811 kubelet[2907]: E0113 23:47:02.033798 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.033965 kubelet[2907]: E0113 23:47:02.033951 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.033965 kubelet[2907]: W0113 23:47:02.033963 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.034017 kubelet[2907]: E0113 23:47:02.033972 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.034199 kubelet[2907]: E0113 23:47:02.034185 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.034223 kubelet[2907]: W0113 23:47:02.034198 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.034223 kubelet[2907]: E0113 23:47:02.034208 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.034368 kubelet[2907]: E0113 23:47:02.034357 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.034397 kubelet[2907]: W0113 23:47:02.034368 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.034397 kubelet[2907]: E0113 23:47:02.034378 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.034524 kubelet[2907]: E0113 23:47:02.034511 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.034524 kubelet[2907]: W0113 23:47:02.034522 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.034574 kubelet[2907]: E0113 23:47:02.034529 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.048002 kubelet[2907]: E0113 23:47:02.047963 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.048002 kubelet[2907]: W0113 23:47:02.047985 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.048002 kubelet[2907]: E0113 23:47:02.047998 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.048279 kubelet[2907]: E0113 23:47:02.048244 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.048279 kubelet[2907]: W0113 23:47:02.048260 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.048279 kubelet[2907]: E0113 23:47:02.048276 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.048666 kubelet[2907]: E0113 23:47:02.048602 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.048666 kubelet[2907]: W0113 23:47:02.048646 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.048733 kubelet[2907]: E0113 23:47:02.048673 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.048869 kubelet[2907]: E0113 23:47:02.048855 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.048869 kubelet[2907]: W0113 23:47:02.048867 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.048921 kubelet[2907]: E0113 23:47:02.048880 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.049040 kubelet[2907]: E0113 23:47:02.049028 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.049040 kubelet[2907]: W0113 23:47:02.049038 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.049097 kubelet[2907]: E0113 23:47:02.049052 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.049256 kubelet[2907]: E0113 23:47:02.049244 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.049284 kubelet[2907]: W0113 23:47:02.049256 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.049284 kubelet[2907]: E0113 23:47:02.049272 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.049489 kubelet[2907]: E0113 23:47:02.049477 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.049489 kubelet[2907]: W0113 23:47:02.049488 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.049546 kubelet[2907]: E0113 23:47:02.049503 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.049735 kubelet[2907]: E0113 23:47:02.049722 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.049735 kubelet[2907]: W0113 23:47:02.049733 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.049803 kubelet[2907]: E0113 23:47:02.049784 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.049899 kubelet[2907]: E0113 23:47:02.049887 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.049927 kubelet[2907]: W0113 23:47:02.049899 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.049956 kubelet[2907]: E0113 23:47:02.049944 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.050076 kubelet[2907]: E0113 23:47:02.050057 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.050106 kubelet[2907]: W0113 23:47:02.050076 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.050106 kubelet[2907]: E0113 23:47:02.050094 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.050269 kubelet[2907]: E0113 23:47:02.050256 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.050269 kubelet[2907]: W0113 23:47:02.050267 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.050322 kubelet[2907]: E0113 23:47:02.050285 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.050460 kubelet[2907]: E0113 23:47:02.050434 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.050460 kubelet[2907]: W0113 23:47:02.050445 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.050460 kubelet[2907]: E0113 23:47:02.050456 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.050633 kubelet[2907]: E0113 23:47:02.050617 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.050664 kubelet[2907]: W0113 23:47:02.050632 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.050664 kubelet[2907]: E0113 23:47:02.050648 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.050797 kubelet[2907]: E0113 23:47:02.050786 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.050821 kubelet[2907]: W0113 23:47:02.050796 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.050821 kubelet[2907]: E0113 23:47:02.050809 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.050995 kubelet[2907]: E0113 23:47:02.050984 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.051029 kubelet[2907]: W0113 23:47:02.050996 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.051029 kubelet[2907]: E0113 23:47:02.051010 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.051257 kubelet[2907]: E0113 23:47:02.051241 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.051257 kubelet[2907]: W0113 23:47:02.051255 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.051313 kubelet[2907]: E0113 23:47:02.051267 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.051456 kubelet[2907]: E0113 23:47:02.051441 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.051456 kubelet[2907]: W0113 23:47:02.051455 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.051512 kubelet[2907]: E0113 23:47:02.051472 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 23:47:02.051653 kubelet[2907]: E0113 23:47:02.051637 2907 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 23:47:02.051653 kubelet[2907]: W0113 23:47:02.051652 2907 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 23:47:02.051704 kubelet[2907]: E0113 23:47:02.051663 2907 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 23:47:02.638980 containerd[1673]: time="2026-01-13T23:47:02.638878667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:02.640521 containerd[1673]: time="2026-01-13T23:47:02.640457831Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:02.641674 containerd[1673]: time="2026-01-13T23:47:02.641633995Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:02.644405 containerd[1673]: time="2026-01-13T23:47:02.644363643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:02.645223 containerd[1673]: time="2026-01-13T23:47:02.645183726Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.686932892s" Jan 13 23:47:02.645254 containerd[1673]: time="2026-01-13T23:47:02.645223526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 13 23:47:02.647429 containerd[1673]: time="2026-01-13T23:47:02.647222292Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 23:47:02.656960 containerd[1673]: time="2026-01-13T23:47:02.656924401Z" level=info msg="Container eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:47:02.668843 containerd[1673]: time="2026-01-13T23:47:02.668786117Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641\"" Jan 13 23:47:02.669296 containerd[1673]: time="2026-01-13T23:47:02.669263958Z" level=info msg="StartContainer for \"eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641\"" Jan 13 23:47:02.670866 containerd[1673]: time="2026-01-13T23:47:02.670837723Z" level=info msg="connecting to shim eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641" address="unix:///run/containerd/s/554e82987101ec8a68aa9eb996f7ca519d35c2c8864cacefe8a7205fff592571" protocol=ttrpc version=3 Jan 13 23:47:02.686282 systemd[1]: Started cri-containerd-eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641.scope - libcontainer container eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641. 
Jan 13 23:47:02.760000 audit: BPF prog-id=166 op=LOAD Jan 13 23:47:02.760000 audit[3603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3446 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:02.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396462386362383930386464316538393831336463616530336334 Jan 13 23:47:02.760000 audit: BPF prog-id=167 op=LOAD Jan 13 23:47:02.760000 audit[3603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3446 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:02.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396462386362383930386464316538393831336463616530336334 Jan 13 23:47:02.760000 audit: BPF prog-id=167 op=UNLOAD Jan 13 23:47:02.760000 audit[3603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:02.760000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396462386362383930386464316538393831336463616530336334 Jan 13 23:47:02.760000 audit: BPF prog-id=166 op=UNLOAD Jan 13 23:47:02.760000 audit[3603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:02.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396462386362383930386464316538393831336463616530336334 Jan 13 23:47:02.760000 audit: BPF prog-id=168 op=LOAD Jan 13 23:47:02.760000 audit[3603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3446 pid=3603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:02.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396462386362383930386464316538393831336463616530336334 Jan 13 23:47:02.781158 containerd[1673]: time="2026-01-13T23:47:02.781110056Z" level=info msg="StartContainer for \"eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641\" returns successfully" Jan 13 23:47:02.796012 systemd[1]: cri-containerd-eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641.scope: Deactivated successfully. 
Jan 13 23:47:02.798470 containerd[1673]: time="2026-01-13T23:47:02.798432228Z" level=info msg="received container exit event container_id:\"eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641\" id:\"eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641\" pid:3617 exited_at:{seconds:1768348022 nanos:798020787}" Jan 13 23:47:02.802000 audit: BPF prog-id=168 op=UNLOAD Jan 13 23:47:02.817601 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641-rootfs.mount: Deactivated successfully. Jan 13 23:47:02.966969 kubelet[2907]: I0113 23:47:02.966862 2907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 23:47:03.879089 kubelet[2907]: E0113 23:47:03.878757 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:05.878912 kubelet[2907]: E0113 23:47:05.878802 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:05.946949 kubelet[2907]: I0113 23:47:05.946550 2907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 23:47:05.971000 audit[3656]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=3656 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:05.971000 audit[3656]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc8e705b0 a2=0 a3=1 items=0 ppid=3061 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:05.971000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:05.988000 audit[3656]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=3656 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:05.988000 audit[3656]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc8e705b0 a2=0 a3=1 items=0 ppid=3061 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:05.988000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:06.977645 containerd[1673]: time="2026-01-13T23:47:06.977601001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 13 23:47:07.878656 kubelet[2907]: E0113 23:47:07.878538 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:09.878760 kubelet[2907]: E0113 23:47:09.878711 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:10.379621 containerd[1673]: time="2026-01-13T23:47:10.379578589Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:10.380353 containerd[1673]: time="2026-01-13T23:47:10.380312231Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 13 23:47:10.381642 containerd[1673]: time="2026-01-13T23:47:10.381598995Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:10.385077 containerd[1673]: time="2026-01-13T23:47:10.385010645Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:10.385727 containerd[1673]: time="2026-01-13T23:47:10.385655127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.408001205s" Jan 13 23:47:10.385727 containerd[1673]: time="2026-01-13T23:47:10.385691887Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 13 23:47:10.388774 containerd[1673]: time="2026-01-13T23:47:10.388742697Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 13 23:47:10.397969 containerd[1673]: time="2026-01-13T23:47:10.397880764Z" level=info msg="Container 8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263: CDI devices from CRI 
Config.CDIDevices: []" Jan 13 23:47:10.411016 containerd[1673]: time="2026-01-13T23:47:10.410898403Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263\"" Jan 13 23:47:10.411416 containerd[1673]: time="2026-01-13T23:47:10.411385485Z" level=info msg="StartContainer for \"8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263\"" Jan 13 23:47:10.412845 containerd[1673]: time="2026-01-13T23:47:10.412813809Z" level=info msg="connecting to shim 8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263" address="unix:///run/containerd/s/554e82987101ec8a68aa9eb996f7ca519d35c2c8864cacefe8a7205fff592571" protocol=ttrpc version=3 Jan 13 23:47:10.445498 systemd[1]: Started cri-containerd-8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263.scope - libcontainer container 8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263. 
Jan 13 23:47:10.507000 audit: BPF prog-id=169 op=LOAD Jan 13 23:47:10.509570 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 13 23:47:10.509632 kernel: audit: type=1334 audit(1768348030.507:569): prog-id=169 op=LOAD Jan 13 23:47:10.507000 audit[3665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.513381 kernel: audit: type=1300 audit(1768348030.507:569): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.513456 kernel: audit: type=1327 audit(1768348030.507:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.507000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.516466 kernel: audit: type=1334 audit(1768348030.507:570): prog-id=170 op=LOAD Jan 13 23:47:10.507000 audit: BPF prog-id=170 op=LOAD Jan 13 23:47:10.507000 audit[3665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.520199 kernel: audit: type=1300 audit(1768348030.507:570): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.520267 kernel: audit: type=1327 audit(1768348030.507:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.507000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.508000 audit: BPF prog-id=170 op=UNLOAD Jan 13 23:47:10.524077 kernel: audit: type=1334 audit(1768348030.508:571): prog-id=170 op=UNLOAD Jan 13 23:47:10.508000 audit[3665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.527890 kernel: audit: type=1300 audit(1768348030.508:571): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.528007 kernel: audit: type=1327 audit(1768348030.508:571): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.508000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.508000 audit: BPF prog-id=169 op=UNLOAD Jan 13 23:47:10.532081 kernel: audit: type=1334 audit(1768348030.508:572): prog-id=169 op=UNLOAD Jan 13 23:47:10.508000 audit[3665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.508000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.508000 audit: BPF prog-id=171 op=LOAD Jan 13 23:47:10.508000 audit[3665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=3446 pid=3665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:10.508000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861656461383532663062356230343239653061303231346332393662 Jan 13 23:47:10.548475 containerd[1673]: time="2026-01-13T23:47:10.548431419Z" level=info msg="StartContainer for \"8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263\" returns successfully" Jan 13 23:47:11.809557 containerd[1673]: time="2026-01-13T23:47:11.809424104Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 23:47:11.812319 systemd[1]: cri-containerd-8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263.scope: Deactivated successfully. Jan 13 23:47:11.812936 systemd[1]: cri-containerd-8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263.scope: Consumed 460ms CPU time, 186.1M memory peak, 165.9M written to disk. Jan 13 23:47:11.814412 containerd[1673]: time="2026-01-13T23:47:11.814317999Z" level=info msg="received container exit event container_id:\"8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263\" id:\"8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263\" pid:3678 exited_at:{seconds:1768348031 nanos:813958918}" Jan 13 23:47:11.815000 audit: BPF prog-id=171 op=UNLOAD Jan 13 23:47:11.825514 kubelet[2907]: I0113 23:47:11.824710 2907 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 13 23:47:11.841520 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263-rootfs.mount: Deactivated successfully. 
Jan 13 23:47:11.869472 systemd[1]: Created slice kubepods-besteffort-podd7f5b072_2b7e_422f_9af7_8a879bbd601a.slice - libcontainer container kubepods-besteffort-podd7f5b072_2b7e_422f_9af7_8a879bbd601a.slice. Jan 13 23:47:11.880771 systemd[1]: Created slice kubepods-burstable-podbd94acc5_d85c_4f68_95ac_32e9cd9a577e.slice - libcontainer container kubepods-burstable-podbd94acc5_d85c_4f68_95ac_32e9cd9a577e.slice. Jan 13 23:47:12.427751 kubelet[2907]: I0113 23:47:11.944125 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7t8p\" (UniqueName: \"kubernetes.io/projected/bd94acc5-d85c-4f68-95ac-32e9cd9a577e-kube-api-access-z7t8p\") pod \"coredns-668d6bf9bc-wmjgt\" (UID: \"bd94acc5-d85c-4f68-95ac-32e9cd9a577e\") " pod="kube-system/coredns-668d6bf9bc-wmjgt" Jan 13 23:47:12.427751 kubelet[2907]: I0113 23:47:11.944162 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sknl\" (UniqueName: \"kubernetes.io/projected/c4d38e9b-73ce-46dd-9acb-61df83d528d1-kube-api-access-8sknl\") pod \"calico-kube-controllers-7b4b69d7d6-mtxnp\" (UID: \"c4d38e9b-73ce-46dd-9acb-61df83d528d1\") " pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" Jan 13 23:47:12.427751 kubelet[2907]: I0113 23:47:11.944212 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd94acc5-d85c-4f68-95ac-32e9cd9a577e-config-volume\") pod \"coredns-668d6bf9bc-wmjgt\" (UID: \"bd94acc5-d85c-4f68-95ac-32e9cd9a577e\") " pod="kube-system/coredns-668d6bf9bc-wmjgt" Jan 13 23:47:12.427751 kubelet[2907]: I0113 23:47:11.944233 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/04948dc7-2d3d-4260-8809-f8eb4aa6cc17-calico-apiserver-certs\") pod \"calico-apiserver-68644f6664-r8m4t\" 
(UID: \"04948dc7-2d3d-4260-8809-f8eb4aa6cc17\") " pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" Jan 13 23:47:12.427751 kubelet[2907]: I0113 23:47:11.944261 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddfa231-61f2-4ab3-bd34-82f93616c2de-config\") pod \"goldmane-666569f655-xghpj\" (UID: \"7ddfa231-61f2-4ab3-bd34-82f93616c2de\") " pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:11.889229 systemd[1]: Created slice kubepods-burstable-pod48b88556_647c_4972_b2c3_222a83018169.slice - libcontainer container kubepods-burstable-pod48b88556_647c_4972_b2c3_222a83018169.slice. Jan 13 23:47:12.427952 kubelet[2907]: I0113 23:47:11.944276 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ddfa231-61f2-4ab3-bd34-82f93616c2de-goldmane-ca-bundle\") pod \"goldmane-666569f655-xghpj\" (UID: \"7ddfa231-61f2-4ab3-bd34-82f93616c2de\") " pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:12.427952 kubelet[2907]: I0113 23:47:11.944292 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7lw\" (UniqueName: \"kubernetes.io/projected/7ddfa231-61f2-4ab3-bd34-82f93616c2de-kube-api-access-dt7lw\") pod \"goldmane-666569f655-xghpj\" (UID: \"7ddfa231-61f2-4ab3-bd34-82f93616c2de\") " pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:12.427952 kubelet[2907]: I0113 23:47:11.944309 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d38e9b-73ce-46dd-9acb-61df83d528d1-tigera-ca-bundle\") pod \"calico-kube-controllers-7b4b69d7d6-mtxnp\" (UID: \"c4d38e9b-73ce-46dd-9acb-61df83d528d1\") " pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" Jan 13 23:47:12.427952 
kubelet[2907]: I0113 23:47:11.944328 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghb74\" (UniqueName: \"kubernetes.io/projected/04948dc7-2d3d-4260-8809-f8eb4aa6cc17-kube-api-access-ghb74\") pod \"calico-apiserver-68644f6664-r8m4t\" (UID: \"04948dc7-2d3d-4260-8809-f8eb4aa6cc17\") " pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" Jan 13 23:47:12.427952 kubelet[2907]: I0113 23:47:11.944365 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfsl\" (UniqueName: \"kubernetes.io/projected/48b88556-647c-4972-b2c3-222a83018169-kube-api-access-nwfsl\") pod \"coredns-668d6bf9bc-mwjb8\" (UID: \"48b88556-647c-4972-b2c3-222a83018169\") " pod="kube-system/coredns-668d6bf9bc-mwjb8" Jan 13 23:47:11.899598 systemd[1]: Created slice kubepods-besteffort-pod04948dc7_2d3d_4260_8809_f8eb4aa6cc17.slice - libcontainer container kubepods-besteffort-pod04948dc7_2d3d_4260_8809_f8eb4aa6cc17.slice. 
Jan 13 23:47:12.428119 kubelet[2907]: I0113 23:47:11.944442 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48b88556-647c-4972-b2c3-222a83018169-config-volume\") pod \"coredns-668d6bf9bc-mwjb8\" (UID: \"48b88556-647c-4972-b2c3-222a83018169\") " pod="kube-system/coredns-668d6bf9bc-mwjb8" Jan 13 23:47:12.428119 kubelet[2907]: I0113 23:47:11.944474 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7ddfa231-61f2-4ab3-bd34-82f93616c2de-goldmane-key-pair\") pod \"goldmane-666569f655-xghpj\" (UID: \"7ddfa231-61f2-4ab3-bd34-82f93616c2de\") " pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:12.428119 kubelet[2907]: I0113 23:47:11.944493 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnlgw\" (UniqueName: \"kubernetes.io/projected/c47b5326-d31f-4680-9c2d-bdd28d584c69-kube-api-access-hnlgw\") pod \"calico-apiserver-68644f6664-bwfss\" (UID: \"c47b5326-d31f-4680-9c2d-bdd28d584c69\") " pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" Jan 13 23:47:12.428119 kubelet[2907]: I0113 23:47:11.944513 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c47b5326-d31f-4680-9c2d-bdd28d584c69-calico-apiserver-certs\") pod \"calico-apiserver-68644f6664-bwfss\" (UID: \"c47b5326-d31f-4680-9c2d-bdd28d584c69\") " pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" Jan 13 23:47:12.428119 kubelet[2907]: I0113 23:47:11.944530 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-ca-bundle\") pod \"whisker-7c7b464ffc-6fcxs\" (UID: 
\"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " pod="calico-system/whisker-7c7b464ffc-6fcxs" Jan 13 23:47:11.908643 systemd[1]: Created slice kubepods-besteffort-podc4d38e9b_73ce_46dd_9acb_61df83d528d1.slice - libcontainer container kubepods-besteffort-podc4d38e9b_73ce_46dd_9acb_61df83d528d1.slice. Jan 13 23:47:12.428263 kubelet[2907]: I0113 23:47:11.944545 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-backend-key-pair\") pod \"whisker-7c7b464ffc-6fcxs\" (UID: \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " pod="calico-system/whisker-7c7b464ffc-6fcxs" Jan 13 23:47:12.428263 kubelet[2907]: I0113 23:47:11.944565 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jsq7\" (UniqueName: \"kubernetes.io/projected/d7f5b072-2b7e-422f-9af7-8a879bbd601a-kube-api-access-6jsq7\") pod \"whisker-7c7b464ffc-6fcxs\" (UID: \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " pod="calico-system/whisker-7c7b464ffc-6fcxs" Jan 13 23:47:11.914766 systemd[1]: Created slice kubepods-besteffort-podc47b5326_d31f_4680_9c2d_bdd28d584c69.slice - libcontainer container kubepods-besteffort-podc47b5326_d31f_4680_9c2d_bdd28d584c69.slice. Jan 13 23:47:11.920882 systemd[1]: Created slice kubepods-besteffort-pod7ddfa231_61f2_4ab3_bd34_82f93616c2de.slice - libcontainer container kubepods-besteffort-pod7ddfa231_61f2_4ab3_bd34_82f93616c2de.slice. Jan 13 23:47:11.926370 systemd[1]: Created slice kubepods-besteffort-pod89f76643_e37a_4094_9d85_ab46009d2c90.slice - libcontainer container kubepods-besteffort-pod89f76643_e37a_4094_9d85_ab46009d2c90.slice. 
Jan 13 23:47:12.540669 containerd[1673]: time="2026-01-13T23:47:12.540561911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p6cc5,Uid:89f76643-e37a-4094-9d85-ab46009d2c90,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:12.555104 containerd[1673]: time="2026-01-13T23:47:12.554240832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-bwfss,Uid:c47b5326-d31f-4680-9c2d-bdd28d584c69,Namespace:calico-apiserver,Attempt:0,}" Jan 13 23:47:12.841759 containerd[1673]: time="2026-01-13T23:47:12.841718300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c7b464ffc-6fcxs,Uid:d7f5b072-2b7e-422f-9af7-8a879bbd601a,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:12.842153 containerd[1673]: time="2026-01-13T23:47:12.842126141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wmjgt,Uid:bd94acc5-d85c-4f68-95ac-32e9cd9a577e,Namespace:kube-system,Attempt:0,}" Jan 13 23:47:12.843807 containerd[1673]: time="2026-01-13T23:47:12.843735106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-xghpj,Uid:7ddfa231-61f2-4ab3-bd34-82f93616c2de,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:12.846288 containerd[1673]: time="2026-01-13T23:47:12.846249314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwjb8,Uid:48b88556-647c-4972-b2c3-222a83018169,Namespace:kube-system,Attempt:0,}" Jan 13 23:47:12.851128 containerd[1673]: time="2026-01-13T23:47:12.851097448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b4b69d7d6-mtxnp,Uid:c4d38e9b-73ce-46dd-9acb-61df83d528d1,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:12.855111 containerd[1673]: time="2026-01-13T23:47:12.855057380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-r8m4t,Uid:04948dc7-2d3d-4260-8809-f8eb4aa6cc17,Namespace:calico-apiserver,Attempt:0,}" Jan 13 23:47:13.405538 
containerd[1673]: time="2026-01-13T23:47:13.405478561Z" level=error msg="Failed to destroy network for sandbox \"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.408934 containerd[1673]: time="2026-01-13T23:47:13.408885612Z" level=error msg="Failed to destroy network for sandbox \"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.410351 containerd[1673]: time="2026-01-13T23:47:13.410304736Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwjb8,Uid:48b88556-647c-4972-b2c3-222a83018169,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.411107 kubelet[2907]: E0113 23:47:13.410740 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.411107 kubelet[2907]: E0113 23:47:13.410822 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mwjb8" Jan 13 23:47:13.411107 kubelet[2907]: E0113 23:47:13.410840 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mwjb8" Jan 13 23:47:13.411474 kubelet[2907]: E0113 23:47:13.410900 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-mwjb8_kube-system(48b88556-647c-4972-b2c3-222a83018169)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-mwjb8_kube-system(48b88556-647c-4972-b2c3-222a83018169)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b3121146676f4f8188ba364a0227b402936b94bacaddf9e401d784f9405d018\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-mwjb8" podUID="48b88556-647c-4972-b2c3-222a83018169" Jan 13 23:47:13.414922 containerd[1673]: time="2026-01-13T23:47:13.414848750Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p6cc5,Uid:89f76643-e37a-4094-9d85-ab46009d2c90,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.415119 kubelet[2907]: E0113 23:47:13.415058 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.415175 kubelet[2907]: E0113 23:47:13.415131 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:47:13.415175 kubelet[2907]: E0113 23:47:13.415151 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p6cc5" Jan 13 23:47:13.415218 kubelet[2907]: E0113 23:47:13.415193 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"3b0c7492cb62d406dd160e8fe4c0be9520f44832b1015f7b52d15c2a57162654\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:13.422040 containerd[1673]: time="2026-01-13T23:47:13.421985291Z" level=error msg="Failed to destroy network for sandbox \"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.423427 containerd[1673]: time="2026-01-13T23:47:13.423370215Z" level=error msg="Failed to destroy network for sandbox \"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.426140 containerd[1673]: time="2026-01-13T23:47:13.426049624Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-r8m4t,Uid:04948dc7-2d3d-4260-8809-f8eb4aa6cc17,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.426409 kubelet[2907]: E0113 23:47:13.426327 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.426488 kubelet[2907]: E0113 23:47:13.426425 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" Jan 13 23:47:13.426488 kubelet[2907]: E0113 23:47:13.426446 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" Jan 13 23:47:13.426624 kubelet[2907]: E0113 23:47:13.426587 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"34c6d3de0a87494f0a0e8a0edf5ec8835d2e267cd4854867d352fb8acdf358c5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:13.428846 containerd[1673]: time="2026-01-13T23:47:13.428688431Z" level=error msg="Failed to destroy network for sandbox \"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.431213 containerd[1673]: time="2026-01-13T23:47:13.431050279Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b4b69d7d6-mtxnp,Uid:c4d38e9b-73ce-46dd-9acb-61df83d528d1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.431708 containerd[1673]: time="2026-01-13T23:47:13.431495840Z" level=error msg="Failed to destroy network for sandbox \"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.431902 kubelet[2907]: E0113 23:47:13.431870 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.432044 kubelet[2907]: E0113 23:47:13.432020 2907 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" Jan 13 23:47:13.432700 kubelet[2907]: E0113 23:47:13.432400 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" Jan 13 23:47:13.432700 kubelet[2907]: E0113 23:47:13.432470 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc8528a04151aa0b1334520bce4d35f27531b1d85f7dd9843d1db1c1fb23fc1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:47:13.435701 containerd[1673]: time="2026-01-13T23:47:13.435611972Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-7c7b464ffc-6fcxs,Uid:d7f5b072-2b7e-422f-9af7-8a879bbd601a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.435993 kubelet[2907]: E0113 23:47:13.435889 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.435993 kubelet[2907]: E0113 23:47:13.435964 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c7b464ffc-6fcxs" Jan 13 23:47:13.435993 kubelet[2907]: E0113 23:47:13.435987 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c7b464ffc-6fcxs" Jan 13 23:47:13.436197 kubelet[2907]: E0113 23:47:13.436020 2907 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7c7b464ffc-6fcxs_calico-system(d7f5b072-2b7e-422f-9af7-8a879bbd601a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7c7b464ffc-6fcxs_calico-system(d7f5b072-2b7e-422f-9af7-8a879bbd601a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bad8cc9ca4a05d20769a4de987803557cff5caa6fb303172cea67aedafe9d65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7c7b464ffc-6fcxs" podUID="d7f5b072-2b7e-422f-9af7-8a879bbd601a" Jan 13 23:47:13.438275 containerd[1673]: time="2026-01-13T23:47:13.438227740Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-xghpj,Uid:7ddfa231-61f2-4ab3-bd34-82f93616c2de,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.438769 kubelet[2907]: E0113 23:47:13.438598 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.438769 kubelet[2907]: E0113 23:47:13.438647 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:13.438769 kubelet[2907]: E0113 23:47:13.438663 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-xghpj" Jan 13 23:47:13.438937 kubelet[2907]: E0113 23:47:13.438727 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a26724b39dc74931c822950b9780d3c9a0bbe225d3a178a78c81348067ee9a2d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:47:13.439320 containerd[1673]: time="2026-01-13T23:47:13.439279663Z" level=error msg="Failed to destroy network for sandbox \"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.442114 containerd[1673]: time="2026-01-13T23:47:13.442038672Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-bwfss,Uid:c47b5326-d31f-4680-9c2d-bdd28d584c69,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.442436 containerd[1673]: time="2026-01-13T23:47:13.442269232Z" level=error msg="Failed to destroy network for sandbox \"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.442578 kubelet[2907]: E0113 23:47:13.442534 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.442621 kubelet[2907]: E0113 23:47:13.442580 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" Jan 13 23:47:13.442621 kubelet[2907]: E0113 23:47:13.442597 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" Jan 13 23:47:13.442666 kubelet[2907]: E0113 23:47:13.442640 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e53ac9f3178eaabd3f7a2fd0f444be84ccfe210ecce7320bec6b24bee92da62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:13.444962 containerd[1673]: time="2026-01-13T23:47:13.444866320Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wmjgt,Uid:bd94acc5-d85c-4f68-95ac-32e9cd9a577e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.445084 kubelet[2907]: E0113 23:47:13.445032 2907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 23:47:13.445128 kubelet[2907]: E0113 23:47:13.445108 2907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wmjgt" Jan 13 23:47:13.445154 kubelet[2907]: E0113 23:47:13.445126 2907 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wmjgt" Jan 13 23:47:13.445199 kubelet[2907]: E0113 23:47:13.445177 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wmjgt_kube-system(bd94acc5-d85c-4f68-95ac-32e9cd9a577e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wmjgt_kube-system(bd94acc5-d85c-4f68-95ac-32e9cd9a577e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7dc0af93ac33ed8b2cc5a1b25cbda4801c71f1b3f6d1745e2fcd374c48695234\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wmjgt" 
podUID="bd94acc5-d85c-4f68-95ac-32e9cd9a577e" Jan 13 23:47:13.994214 containerd[1673]: time="2026-01-13T23:47:13.994133858Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 13 23:47:14.278714 systemd[1]: run-netns-cni\x2d0985ed3a\x2ddc9c\x2d090b\x2dd4dc\x2d48f0bde21f4b.mount: Deactivated successfully. Jan 13 23:47:14.278816 systemd[1]: run-netns-cni\x2def7ab226\x2d341c\x2dc1e4\x2d46ed\x2db8dbd2400fbb.mount: Deactivated successfully. Jan 13 23:47:14.278873 systemd[1]: run-netns-cni\x2d227b09e7\x2dfe2c\x2ddc72\x2d257a\x2df6b30428021c.mount: Deactivated successfully. Jan 13 23:47:14.278918 systemd[1]: run-netns-cni\x2d43db49d4\x2de0aa\x2d6e4a\x2d1e0c\x2d5d18736b0ff1.mount: Deactivated successfully. Jan 13 23:47:14.278959 systemd[1]: run-netns-cni\x2d81598292\x2da3b7\x2d4bd2\x2d37cf\x2d085d189af911.mount: Deactivated successfully. Jan 13 23:47:14.279000 systemd[1]: run-netns-cni\x2d4091753e\x2dd029\x2d19af\x2d9ddf\x2de0083d6e2d00.mount: Deactivated successfully. Jan 13 23:47:14.279040 systemd[1]: run-netns-cni\x2d33c5a57e\x2dfae3\x2dcca9\x2d042d\x2d15feb8be3995.mount: Deactivated successfully. Jan 13 23:47:14.279098 systemd[1]: run-netns-cni\x2d7c4fa800\x2d7012\x2dbfbf\x2d91bb\x2df1735d52224e.mount: Deactivated successfully. Jan 13 23:47:20.966545 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2358703203.mount: Deactivated successfully. 
Jan 13 23:47:20.988080 containerd[1673]: time="2026-01-13T23:47:20.987442285Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:20.988585 containerd[1673]: time="2026-01-13T23:47:20.988545008Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 13 23:47:20.989894 containerd[1673]: time="2026-01-13T23:47:20.989867972Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:20.992397 containerd[1673]: time="2026-01-13T23:47:20.992357579Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 23:47:20.993055 containerd[1673]: time="2026-01-13T23:47:20.993025381Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.998818363s" Jan 13 23:47:20.993168 containerd[1673]: time="2026-01-13T23:47:20.993153422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 13 23:47:21.003558 containerd[1673]: time="2026-01-13T23:47:21.003514373Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 23:47:21.018184 containerd[1673]: time="2026-01-13T23:47:21.018082577Z" level=info msg="Container 
f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:47:21.027450 containerd[1673]: time="2026-01-13T23:47:21.027402565Z" level=info msg="CreateContainer within sandbox \"5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08\"" Jan 13 23:47:21.028003 containerd[1673]: time="2026-01-13T23:47:21.027925487Z" level=info msg="StartContainer for \"f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08\"" Jan 13 23:47:21.029860 containerd[1673]: time="2026-01-13T23:47:21.029812452Z" level=info msg="connecting to shim f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08" address="unix:///run/containerd/s/554e82987101ec8a68aa9eb996f7ca519d35c2c8864cacefe8a7205fff592571" protocol=ttrpc version=3 Jan 13 23:47:21.050252 systemd[1]: Started cri-containerd-f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08.scope - libcontainer container f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08. 
Jan 13 23:47:21.120000 audit: BPF prog-id=172 op=LOAD Jan 13 23:47:21.122145 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 13 23:47:21.122199 kernel: audit: type=1334 audit(1768348041.120:575): prog-id=172 op=LOAD Jan 13 23:47:21.120000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.125913 kernel: audit: type=1300 audit(1768348041.120:575): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.125955 kernel: audit: type=1327 audit(1768348041.120:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.120000 audit: BPF prog-id=173 op=LOAD Jan 13 23:47:21.120000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.133213 kernel: audit: type=1334 
audit(1768348041.120:576): prog-id=173 op=LOAD Jan 13 23:47:21.133291 kernel: audit: type=1300 audit(1768348041.120:576): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.133313 kernel: audit: type=1327 audit(1768348041.120:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.120000 audit: BPF prog-id=173 op=UNLOAD Jan 13 23:47:21.137420 kernel: audit: type=1334 audit(1768348041.120:577): prog-id=173 op=UNLOAD Jan 13 23:47:21.137458 kernel: audit: type=1300 audit(1768348041.120:577): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.120000 audit[4013]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.120000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.143445 kernel: audit: type=1327 audit(1768348041.120:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.143551 kernel: audit: type=1334 audit(1768348041.120:578): prog-id=172 op=UNLOAD Jan 13 23:47:21.120000 audit: BPF prog-id=172 op=UNLOAD Jan 13 23:47:21.120000 audit[4013]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.120000 audit: BPF prog-id=174 op=LOAD Jan 13 23:47:21.120000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3446 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:21.120000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633396134396434356332393339353666396562316262613132643335 Jan 13 23:47:21.161567 containerd[1673]: time="2026-01-13T23:47:21.161465770Z" level=info msg="StartContainer for \"f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08\" returns successfully" Jan 13 23:47:21.296201 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 23:47:21.296315 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 13 23:47:21.511951 kubelet[2907]: I0113 23:47:21.511912 2907 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-ca-bundle\") pod \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\" (UID: \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " Jan 13 23:47:21.512666 kubelet[2907]: I0113 23:47:21.512308 2907 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d7f5b072-2b7e-422f-9af7-8a879bbd601a" (UID: "d7f5b072-2b7e-422f-9af7-8a879bbd601a"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 13 23:47:21.512666 kubelet[2907]: I0113 23:47:21.512526 2907 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-backend-key-pair\") pod \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\" (UID: \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " Jan 13 23:47:21.512666 kubelet[2907]: I0113 23:47:21.512564 2907 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jsq7\" (UniqueName: \"kubernetes.io/projected/d7f5b072-2b7e-422f-9af7-8a879bbd601a-kube-api-access-6jsq7\") pod \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\" (UID: \"d7f5b072-2b7e-422f-9af7-8a879bbd601a\") " Jan 13 23:47:21.512985 kubelet[2907]: I0113 23:47:21.512896 2907 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-ca-bundle\") on node \"ci-4547-0-0-n-660efdb355\" DevicePath \"\"" Jan 13 23:47:21.515456 kubelet[2907]: I0113 23:47:21.515411 2907 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f5b072-2b7e-422f-9af7-8a879bbd601a-kube-api-access-6jsq7" (OuterVolumeSpecName: "kube-api-access-6jsq7") pod "d7f5b072-2b7e-422f-9af7-8a879bbd601a" (UID: "d7f5b072-2b7e-422f-9af7-8a879bbd601a"). InnerVolumeSpecName "kube-api-access-6jsq7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 13 23:47:21.516535 kubelet[2907]: I0113 23:47:21.516413 2907 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d7f5b072-2b7e-422f-9af7-8a879bbd601a" (UID: "d7f5b072-2b7e-422f-9af7-8a879bbd601a"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 13 23:47:21.614363 kubelet[2907]: I0113 23:47:21.613926 2907 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d7f5b072-2b7e-422f-9af7-8a879bbd601a-whisker-backend-key-pair\") on node \"ci-4547-0-0-n-660efdb355\" DevicePath \"\"" Jan 13 23:47:21.614363 kubelet[2907]: I0113 23:47:21.614139 2907 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jsq7\" (UniqueName: \"kubernetes.io/projected/d7f5b072-2b7e-422f-9af7-8a879bbd601a-kube-api-access-6jsq7\") on node \"ci-4547-0-0-n-660efdb355\" DevicePath \"\"" Jan 13 23:47:21.885542 systemd[1]: Removed slice kubepods-besteffort-podd7f5b072_2b7e_422f_9af7_8a879bbd601a.slice - libcontainer container kubepods-besteffort-podd7f5b072_2b7e_422f_9af7_8a879bbd601a.slice. Jan 13 23:47:21.967405 systemd[1]: var-lib-kubelet-pods-d7f5b072\x2d2b7e\x2d422f\x2d9af7\x2d8a879bbd601a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 13 23:47:21.967494 systemd[1]: var-lib-kubelet-pods-d7f5b072\x2d2b7e\x2d422f\x2d9af7\x2d8a879bbd601a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6jsq7.mount: Deactivated successfully. 
Jan 13 23:47:22.040461 kubelet[2907]: I0113 23:47:22.040237 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vxg9g" podStartSLOduration=2.005345078 podStartE2EDuration="24.040219502s" podCreationTimestamp="2026-01-13 23:46:58 +0000 UTC" firstStartedPulling="2026-01-13 23:46:58.95892932 +0000 UTC m=+23.219244887" lastFinishedPulling="2026-01-13 23:47:20.993803744 +0000 UTC m=+45.254119311" observedRunningTime="2026-01-13 23:47:22.028981188 +0000 UTC m=+46.289296835" watchObservedRunningTime="2026-01-13 23:47:22.040219502 +0000 UTC m=+46.300535069" Jan 13 23:47:22.085208 systemd[1]: Created slice kubepods-besteffort-pod181c88a3_5f50_4a12_ac78_292f89f9a583.slice - libcontainer container kubepods-besteffort-pod181c88a3_5f50_4a12_ac78_292f89f9a583.slice. Jan 13 23:47:22.219032 kubelet[2907]: I0113 23:47:22.218925 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/181c88a3-5f50-4a12-ac78-292f89f9a583-whisker-backend-key-pair\") pod \"whisker-5748d45b5-b6b8l\" (UID: \"181c88a3-5f50-4a12-ac78-292f89f9a583\") " pod="calico-system/whisker-5748d45b5-b6b8l" Jan 13 23:47:22.219032 kubelet[2907]: I0113 23:47:22.218978 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/181c88a3-5f50-4a12-ac78-292f89f9a583-whisker-ca-bundle\") pod \"whisker-5748d45b5-b6b8l\" (UID: \"181c88a3-5f50-4a12-ac78-292f89f9a583\") " pod="calico-system/whisker-5748d45b5-b6b8l" Jan 13 23:47:22.219032 kubelet[2907]: I0113 23:47:22.219002 2907 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msc9v\" (UniqueName: \"kubernetes.io/projected/181c88a3-5f50-4a12-ac78-292f89f9a583-kube-api-access-msc9v\") pod \"whisker-5748d45b5-b6b8l\" (UID: 
\"181c88a3-5f50-4a12-ac78-292f89f9a583\") " pod="calico-system/whisker-5748d45b5-b6b8l" Jan 13 23:47:22.392605 containerd[1673]: time="2026-01-13T23:47:22.392537885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5748d45b5-b6b8l,Uid:181c88a3-5f50-4a12-ac78-292f89f9a583,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:22.524254 systemd-networkd[1591]: cali14741dc0575: Link UP Jan 13 23:47:22.524446 systemd-networkd[1591]: cali14741dc0575: Gained carrier Jan 13 23:47:22.539318 containerd[1673]: 2026-01-13 23:47:22.415 [INFO][4080] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 23:47:22.539318 containerd[1673]: 2026-01-13 23:47:22.435 [INFO][4080] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0 whisker-5748d45b5- calico-system 181c88a3-5f50-4a12-ac78-292f89f9a583 916 0 2026-01-13 23:47:22 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5748d45b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 whisker-5748d45b5-b6b8l eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali14741dc0575 [] [] }} ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-" Jan 13 23:47:22.539318 containerd[1673]: 2026-01-13 23:47:22.435 [INFO][4080] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.539318 containerd[1673]: 2026-01-13 23:47:22.481 [INFO][4093] 
ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" HandleID="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Workload="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.481 [INFO][4093] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" HandleID="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Workload="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2130), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"whisker-5748d45b5-b6b8l", "timestamp":"2026-01-13 23:47:22.481622114 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.481 [INFO][4093] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.481 [INFO][4093] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.481 [INFO][4093] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.490 [INFO][4093] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.495 [INFO][4093] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.499 [INFO][4093] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.501 [INFO][4093] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539543 containerd[1673]: 2026-01-13 23:47:22.503 [INFO][4093] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.503 [INFO][4093] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.505 [INFO][4093] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269 Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.508 [INFO][4093] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.513 [INFO][4093] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.37.65/26] block=192.168.37.64/26 handle="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.513 [INFO][4093] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.65/26] handle="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.513 [INFO][4093] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 13 23:47:22.539750 containerd[1673]: 2026-01-13 23:47:22.513 [INFO][4093] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.65/26] IPv6=[] ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" HandleID="k8s-pod-network.64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Workload="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.539918 containerd[1673]: 2026-01-13 23:47:22.515 [INFO][4080] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0", GenerateName:"whisker-5748d45b5-", Namespace:"calico-system", SelfLink:"", UID:"181c88a3-5f50-4a12-ac78-292f89f9a583", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 47, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5748d45b5", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"whisker-5748d45b5-b6b8l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.37.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali14741dc0575", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:22.539918 containerd[1673]: 2026-01-13 23:47:22.516 [INFO][4080] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.65/32] ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.539989 containerd[1673]: 2026-01-13 23:47:22.516 [INFO][4080] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14741dc0575 ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.539989 containerd[1673]: 2026-01-13 23:47:22.524 [INFO][4080] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.540030 containerd[1673]: 2026-01-13 23:47:22.526 [INFO][4080] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0", GenerateName:"whisker-5748d45b5-", Namespace:"calico-system", SelfLink:"", UID:"181c88a3-5f50-4a12-ac78-292f89f9a583", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 47, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5748d45b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269", Pod:"whisker-5748d45b5-b6b8l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.37.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali14741dc0575", MAC:"be:ad:26:12:97:c6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:22.540095 containerd[1673]: 2026-01-13 23:47:22.537 [INFO][4080] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" 
Namespace="calico-system" Pod="whisker-5748d45b5-b6b8l" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-whisker--5748d45b5--b6b8l-eth0" Jan 13 23:47:22.561256 containerd[1673]: time="2026-01-13T23:47:22.561208674Z" level=info msg="connecting to shim 64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269" address="unix:///run/containerd/s/269aa584bc3286bc223163e580c2f696d9ae3b53ed66f3ce656d6436ec0c1faf" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:22.604216 systemd[1]: Started cri-containerd-64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269.scope - libcontainer container 64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269. Jan 13 23:47:22.616000 audit: BPF prog-id=175 op=LOAD Jan 13 23:47:22.618000 audit: BPF prog-id=176 op=LOAD Jan 13 23:47:22.618000 audit[4152]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.618000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.618000 audit: BPF prog-id=176 op=UNLOAD Jan 13 23:47:22.618000 audit[4152]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.618000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.620000 audit: BPF prog-id=177 op=LOAD Jan 13 23:47:22.620000 audit[4152]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.620000 audit: BPF prog-id=178 op=LOAD Jan 13 23:47:22.620000 audit[4152]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.620000 audit: BPF prog-id=178 op=UNLOAD Jan 13 23:47:22.620000 audit[4152]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 13 23:47:22.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.620000 audit: BPF prog-id=177 op=UNLOAD Jan 13 23:47:22.620000 audit[4152]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.620000 audit: BPF prog-id=179 op=LOAD Jan 13 23:47:22.620000 audit[4152]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4116 pid=4152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613432626461386439653332363532626138396534393162316431 Jan 13 23:47:22.666555 containerd[1673]: time="2026-01-13T23:47:22.666312392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5748d45b5-b6b8l,Uid:181c88a3-5f50-4a12-ac78-292f89f9a583,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269\"" Jan 13 23:47:22.669364 containerd[1673]: time="2026-01-13T23:47:22.669325361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 13 23:47:22.782000 audit: BPF prog-id=180 op=LOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdcf7bf88 a2=98 a3=ffffdcf7bf78 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.782000 audit: BPF prog-id=180 op=UNLOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdcf7bf58 a3=0 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.782000 audit: BPF prog-id=181 op=LOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdcf7be38 a2=74 a3=95 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.782000 audit: BPF prog-id=181 op=UNLOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.782000 audit: BPF prog-id=182 op=LOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdcf7be68 a2=40 a3=ffffdcf7be98 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.782000 audit: BPF prog-id=182 op=UNLOAD Jan 13 23:47:22.782000 audit[4286]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdcf7be98 items=0 ppid=4144 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.782000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 13 23:47:22.784000 audit: BPF prog-id=183 op=LOAD Jan 13 23:47:22.784000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffe81fbf8 a2=98 a3=fffffe81fbe8 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.784000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.784000 audit: BPF prog-id=183 op=UNLOAD Jan 13 23:47:22.784000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffffe81fbc8 a3=0 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.784000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.785000 audit: BPF prog-id=184 op=LOAD Jan 13 23:47:22.785000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe81f888 a2=74 a3=95 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.785000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.785000 audit: BPF prog-id=184 op=UNLOAD Jan 13 23:47:22.785000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 
a1=57156c a2=74 a3=95 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.785000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.785000 audit: BPF prog-id=185 op=LOAD Jan 13 23:47:22.785000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe81f8e8 a2=94 a3=2 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.785000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.785000 audit: BPF prog-id=185 op=UNLOAD Jan 13 23:47:22.785000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.785000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.886000 audit: BPF prog-id=186 op=LOAD Jan 13 23:47:22.886000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe81f8a8 a2=40 a3=fffffe81f8d8 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.886000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.886000 audit: BPF prog-id=186 op=UNLOAD Jan 13 23:47:22.886000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffffe81f8d8 items=0 ppid=4144 pid=4287 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.886000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=187 op=LOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe81f8b8 a2=94 a3=4 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=187 op=UNLOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=188 op=LOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffffe81f6f8 a2=94 a3=5 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=188 op=UNLOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=189 op=LOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe81f928 a2=94 a3=6 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.896000 audit: BPF prog-id=189 op=UNLOAD Jan 13 23:47:22.896000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.896000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.897000 audit: BPF prog-id=190 op=LOAD Jan 13 23:47:22.897000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe81f0f8 a2=94 a3=83 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.897000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.897000 audit: BPF prog-id=191 op=LOAD Jan 13 23:47:22.897000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffffe81eeb8 a2=94 a3=2 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.897000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.897000 audit: BPF prog-id=191 op=UNLOAD Jan 13 23:47:22.897000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.897000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.897000 audit: BPF prog-id=190 op=UNLOAD Jan 13 23:47:22.897000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=24f12620 a3=24f05b00 items=0 ppid=4144 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.897000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 13 23:47:22.906000 audit: BPF prog-id=192 op=LOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc8c7fc18 a2=98 a3=ffffc8c7fc08 items=0 ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.906000 audit: BPF prog-id=192 op=UNLOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc8c7fbe8 a3=0 items=0 
ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.906000 audit: BPF prog-id=193 op=LOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc8c7fac8 a2=74 a3=95 items=0 ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.906000 audit: BPF prog-id=193 op=UNLOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.906000 audit: BPF prog-id=194 op=LOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 
syscall=280 success=yes exit=3 a0=5 a1=ffffc8c7faf8 a2=40 a3=ffffc8c7fb28 items=0 ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.906000 audit: BPF prog-id=194 op=UNLOAD Jan 13 23:47:22.906000 audit[4290]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffc8c7fb28 items=0 ppid=4144 pid=4290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.906000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 13 23:47:22.968398 systemd-networkd[1591]: vxlan.calico: Link UP Jan 13 23:47:22.968409 systemd-networkd[1591]: vxlan.calico: Gained carrier Jan 13 23:47:22.974000 audit: BPF prog-id=195 op=LOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0903ab8 a2=98 a3=ffffd0903aa8 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=195 op=UNLOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0903a88 a3=0 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=196 op=LOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0903798 a2=74 a3=95 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=196 op=UNLOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=197 op=LOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd09037f8 a2=94 a3=2 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=197 op=UNLOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=198 op=LOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0903678 a2=40 a3=ffffd09036a8 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=198 op=UNLOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffd09036a8 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.974000 audit: BPF prog-id=199 op=LOAD Jan 13 23:47:22.974000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd09037c8 a2=94 a3=b7 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.974000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.975000 audit: BPF prog-id=199 op=UNLOAD Jan 13 23:47:22.975000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.975000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.975000 audit: BPF prog-id=200 op=LOAD Jan 13 23:47:22.975000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0902e78 a2=94 a3=2 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.975000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.975000 audit: BPF prog-id=200 op=UNLOAD Jan 13 23:47:22.975000 audit[4316]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.975000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.975000 audit: BPF prog-id=201 op=LOAD Jan 13 23:47:22.975000 audit[4316]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0903008 a2=94 a3=30 items=0 ppid=4144 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.975000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 13 23:47:22.982000 audit: BPF prog-id=202 op=LOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffce714e88 a2=98 a3=ffffce714e78 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:22.982000 audit: BPF prog-id=202 op=UNLOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffce714e58 a3=0 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:22.982000 audit: BPF prog-id=203 op=LOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffce714b18 a2=74 a3=95 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:22.982000 audit: BPF prog-id=203 op=UNLOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:22.982000 audit: BPF prog-id=204 op=LOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffce714b78 a2=94 a3=2 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:22.982000 audit: BPF prog-id=204 op=UNLOAD Jan 13 23:47:22.982000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:22.982000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.002407 containerd[1673]: time="2026-01-13T23:47:23.002360726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:23.003828 containerd[1673]: time="2026-01-13T23:47:23.003790610Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 13 23:47:23.004388 containerd[1673]: time="2026-01-13T23:47:23.003858170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:23.004671 kubelet[2907]: E0113 23:47:23.004105 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:47:23.004671 kubelet[2907]: E0113 23:47:23.004167 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:47:23.004948 kubelet[2907]: E0113 23:47:23.004388 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b094e94e970498e92a984780514b4e2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:23.006329 containerd[1673]: time="2026-01-13T23:47:23.006299178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 13 23:47:23.092000 audit: BPF prog-id=205 op=LOAD Jan 13 
23:47:23.092000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffce714b38 a2=40 a3=ffffce714b68 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.092000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.092000 audit: BPF prog-id=205 op=UNLOAD Jan 13 23:47:23.092000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffce714b68 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.092000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.103000 audit: BPF prog-id=206 op=LOAD Jan 13 23:47:23.103000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffce714b48 a2=94 a3=4 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.103000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.103000 audit: BPF prog-id=206 op=UNLOAD Jan 13 23:47:23.103000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4144 pid=4322 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.103000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.103000 audit: BPF prog-id=207 op=LOAD Jan 13 23:47:23.103000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffce714988 a2=94 a3=5 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.103000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=207 op=UNLOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=208 op=LOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffce714bb8 a2=94 a3=6 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=208 op=UNLOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=209 op=LOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffce714388 a2=94 a3=83 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=210 op=LOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffce714148 a2=94 a3=2 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=210 op=UNLOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.104000 audit: BPF prog-id=209 op=UNLOAD Jan 13 23:47:23.104000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1c4ce620 a3=1c4c1b00 items=0 ppid=4144 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.104000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 13 23:47:23.115000 audit: BPF prog-id=201 op=UNLOAD Jan 13 23:47:23.115000 audit[4144]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=40004b3840 a2=0 a3=0 items=0 ppid=4129 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.115000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 13 23:47:23.162000 audit[4348]: NETFILTER_CFG table=mangle:121 family=2 entries=16 
op=nft_register_chain pid=4348 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:23.162000 audit[4348]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffff7a08490 a2=0 a3=ffff94f56fa8 items=0 ppid=4144 pid=4348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.162000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:23.166000 audit[4353]: NETFILTER_CFG table=nat:122 family=2 entries=15 op=nft_register_chain pid=4353 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:23.166000 audit[4353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=fffffe079b10 a2=0 a3=ffff8c9f3fa8 items=0 ppid=4144 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.166000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:23.181000 audit[4347]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4347 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:23.181000 audit[4347]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=fffffc3418a0 a2=0 a3=ffff92f6cfa8 items=0 ppid=4144 pid=4347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.181000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:23.182000 audit[4349]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4349 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:23.182000 audit[4349]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=fffff8701370 a2=0 a3=ffff9eb30fa8 items=0 ppid=4144 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:23.182000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:23.334542 containerd[1673]: time="2026-01-13T23:47:23.334459848Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:23.335894 containerd[1673]: time="2026-01-13T23:47:23.335829892Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 13 23:47:23.335894 containerd[1673]: time="2026-01-13T23:47:23.335831492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:23.336148 kubelet[2907]: E0113 23:47:23.336099 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 
23:47:23.336234 kubelet[2907]: E0113 23:47:23.336152 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:47:23.336320 kubelet[2907]: E0113 23:47:23.336263 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,
SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:23.337621 kubelet[2907]: E0113 23:47:23.337567 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:47:23.881780 kubelet[2907]: I0113 23:47:23.881711 2907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f5b072-2b7e-422f-9af7-8a879bbd601a" path="/var/lib/kubelet/pods/d7f5b072-2b7e-422f-9af7-8a879bbd601a/volumes" Jan 13 23:47:24.019378 kubelet[2907]: E0113 23:47:24.019324 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:47:24.036207 systemd-networkd[1591]: cali14741dc0575: Gained IPv6LL Jan 13 23:47:24.045000 audit[4363]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4363 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:24.045000 audit[4363]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd5542740 a2=0 a3=1 items=0 ppid=3061 pid=4363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:24.045000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:24.060000 audit[4363]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4363 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:24.060000 audit[4363]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd5542740 a2=0 a3=1 items=0 ppid=3061 pid=4363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:24.060000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:24.996317 systemd-networkd[1591]: vxlan.calico: Gained IPv6LL Jan 13 23:47:25.879982 containerd[1673]: time="2026-01-13T23:47:25.879926530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b4b69d7d6-mtxnp,Uid:c4d38e9b-73ce-46dd-9acb-61df83d528d1,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:25.880540 containerd[1673]: time="2026-01-13T23:47:25.880463892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wmjgt,Uid:bd94acc5-d85c-4f68-95ac-32e9cd9a577e,Namespace:kube-system,Attempt:0,}" Jan 13 23:47:25.881454 containerd[1673]: time="2026-01-13T23:47:25.881410255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwjb8,Uid:48b88556-647c-4972-b2c3-222a83018169,Namespace:kube-system,Attempt:0,}" Jan 13 23:47:25.882005 containerd[1673]: time="2026-01-13T23:47:25.881964937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-xghpj,Uid:7ddfa231-61f2-4ab3-bd34-82f93616c2de,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:25.882041 containerd[1673]: time="2026-01-13T23:47:25.882024737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-r8m4t,Uid:04948dc7-2d3d-4260-8809-f8eb4aa6cc17,Namespace:calico-apiserver,Attempt:0,}" Jan 13 23:47:26.061328 systemd-networkd[1591]: calid87d99b48c8: Link UP Jan 13 23:47:26.061892 systemd-networkd[1591]: calid87d99b48c8: Gained carrier Jan 13 23:47:26.076254 containerd[1673]: 2026-01-13 23:47:25.963 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0 coredns-668d6bf9bc- kube-system bd94acc5-d85c-4f68-95ac-32e9cd9a577e 849 0 2026-01-13 23:46:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 coredns-668d6bf9bc-wmjgt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid87d99b48c8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-" Jan 13 23:47:26.076254 containerd[1673]: 2026-01-13 23:47:25.963 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076254 containerd[1673]: 2026-01-13 23:47:26.008 [INFO][4440] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" HandleID="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.008 [INFO][4440] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" HandleID="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137740), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"coredns-668d6bf9bc-wmjgt", "timestamp":"2026-01-13 23:47:26.008356838 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.008 [INFO][4440] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.008 [INFO][4440] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.008 [INFO][4440] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.018 [INFO][4440] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.030 [INFO][4440] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.037 [INFO][4440] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.040 [INFO][4440] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076464 containerd[1673]: 2026-01-13 23:47:26.043 [INFO][4440] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.043 [INFO][4440] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.045 [INFO][4440] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.049 [INFO][4440] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4440] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.37.66/26] block=192.168.37.64/26 handle="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4440] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.66/26] handle="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4440] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 13 23:47:26.076672 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4440] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.66/26] IPv6=[] ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" HandleID="k8s-pod-network.4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076809 containerd[1673]: 2026-01-13 23:47:26.058 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"bd94acc5-d85c-4f68-95ac-32e9cd9a577e", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"coredns-668d6bf9bc-wmjgt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.37.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calid87d99b48c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.076809 containerd[1673]: 2026-01-13 23:47:26.058 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.66/32] ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076809 containerd[1673]: 2026-01-13 23:47:26.058 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid87d99b48c8 ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076809 containerd[1673]: 2026-01-13 23:47:26.062 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.076809 containerd[1673]: 2026-01-13 23:47:26.063 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" 
WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"bd94acc5-d85c-4f68-95ac-32e9cd9a577e", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c", Pod:"coredns-668d6bf9bc-wmjgt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.37.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid87d99b48c8", MAC:"02:02:51:6d:87:b8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.076809 
containerd[1673]: 2026-01-13 23:47:26.073 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wmjgt" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--wmjgt-eth0" Jan 13 23:47:26.091000 audit[4490]: NETFILTER_CFG table=filter:127 family=2 entries=42 op=nft_register_chain pid=4490 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:26.091000 audit[4490]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=ffffde4434e0 a2=0 a3=ffffb6fc0fa8 items=0 ppid=4144 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.091000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:26.102197 containerd[1673]: time="2026-01-13T23:47:26.102142921Z" level=info msg="connecting to shim 4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c" address="unix:///run/containerd/s/7be814d07c55fe53b7e6da5a29bb7fc1dc46b4e41f1ad8aeebdfffa173f2cb39" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:26.129445 systemd[1]: Started cri-containerd-4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c.scope - libcontainer container 4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c. 
Jan 13 23:47:26.139000 audit: BPF prog-id=211 op=LOAD Jan 13 23:47:26.141749 kernel: kauditd_printk_skb: 234 callbacks suppressed Jan 13 23:47:26.141832 kernel: audit: type=1334 audit(1768348046.139:657): prog-id=211 op=LOAD Jan 13 23:47:26.141000 audit: BPF prog-id=212 op=LOAD Jan 13 23:47:26.143441 kernel: audit: type=1334 audit(1768348046.141:658): prog-id=212 op=LOAD Jan 13 23:47:26.141000 audit[4511]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.147212 kernel: audit: type=1300 audit(1768348046.141:658): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.150990 kernel: audit: type=1327 audit(1768348046.141:658): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.141000 audit: BPF prog-id=212 op=UNLOAD Jan 13 23:47:26.152080 kernel: audit: type=1334 audit(1768348046.141:659): prog-id=212 op=UNLOAD Jan 13 23:47:26.141000 audit[4511]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4511 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.155689 kernel: audit: type=1300 audit(1768348046.141:659): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.159654 kernel: audit: type=1327 audit(1768348046.141:659): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.159737 kernel: audit: type=1334 audit(1768348046.141:660): prog-id=213 op=LOAD Jan 13 23:47:26.141000 audit: BPF prog-id=213 op=LOAD Jan 13 23:47:26.141000 audit[4511]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.163722 kernel: audit: type=1300 audit(1768348046.141:660): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.167052 kernel: audit: type=1327 audit(1768348046.141:660): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.142000 audit: BPF prog-id=214 op=LOAD Jan 13 23:47:26.142000 audit[4511]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.142000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.142000 audit: BPF prog-id=214 op=UNLOAD Jan 13 23:47:26.142000 audit[4511]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.142000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.142000 audit: BPF prog-id=213 op=UNLOAD Jan 13 23:47:26.142000 audit[4511]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.142000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.142000 audit: BPF prog-id=215 op=LOAD Jan 13 23:47:26.142000 audit[4511]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4499 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.142000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3466626337333539343662303430653561343836346339306232633465 Jan 13 23:47:26.177127 systemd-networkd[1591]: cali7d6615dac1f: Link UP Jan 13 23:47:26.177614 systemd-networkd[1591]: cali7d6615dac1f: Gained carrier Jan 13 23:47:26.191169 containerd[1673]: time="2026-01-13T23:47:26.190774269Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-wmjgt,Uid:bd94acc5-d85c-4f68-95ac-32e9cd9a577e,Namespace:kube-system,Attempt:0,} returns sandbox id \"4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c\"" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:25.973 [INFO][4388] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0 coredns-668d6bf9bc- kube-system 48b88556-647c-4972-b2c3-222a83018169 847 0 2026-01-13 23:46:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 coredns-668d6bf9bc-mwjb8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7d6615dac1f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:25.973 [INFO][4388] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.009 [INFO][4453] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" HandleID="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.009 [INFO][4453] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" HandleID="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012e690), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"coredns-668d6bf9bc-mwjb8", "timestamp":"2026-01-13 23:47:26.009750442 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.009 [INFO][4453] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4453] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.056 [INFO][4453] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.119 [INFO][4453] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.131 [INFO][4453] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.137 [INFO][4453] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.140 [INFO][4453] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.143 [INFO][4453] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.145 [INFO][4453] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.152 [INFO][4453] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753 Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.159 [INFO][4453] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.168 [INFO][4453] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.37.67/26] block=192.168.37.64/26 handle="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.169 [INFO][4453] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.67/26] handle="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.169 [INFO][4453] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 13 23:47:26.194275 containerd[1673]: 2026-01-13 23:47:26.169 [INFO][4453] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.67/26] IPv6=[] ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" HandleID="k8s-pod-network.85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Workload="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.173 [INFO][4388] cni-plugin/k8s.go 418: Populated endpoint ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"48b88556-647c-4972-b2c3-222a83018169", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"coredns-668d6bf9bc-mwjb8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.37.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7d6615dac1f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.174 [INFO][4388] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.67/32] ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.174 [INFO][4388] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7d6615dac1f ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.177 [INFO][4388] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.178 [INFO][4388] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"48b88556-647c-4972-b2c3-222a83018169", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753", Pod:"coredns-668d6bf9bc-mwjb8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.37.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7d6615dac1f", 
MAC:"ae:15:36:5d:f1:f6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.195452 containerd[1673]: 2026-01-13 23:47:26.192 [INFO][4388] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwjb8" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-coredns--668d6bf9bc--mwjb8-eth0" Jan 13 23:47:26.195452 containerd[1673]: time="2026-01-13T23:47:26.195342402Z" level=info msg="CreateContainer within sandbox \"4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 23:47:26.210000 audit[4546]: NETFILTER_CFG table=filter:128 family=2 entries=36 op=nft_register_chain pid=4546 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:26.210000 audit[4546]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19156 a0=3 a1=ffffe2ecaa60 a2=0 a3=ffffad6c6fa8 items=0 ppid=4144 pid=4546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.210000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:26.217951 containerd[1673]: time="2026-01-13T23:47:26.217894870Z" level=info 
msg="Container 2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:47:26.230106 containerd[1673]: time="2026-01-13T23:47:26.229998707Z" level=info msg="CreateContainer within sandbox \"4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95\"" Jan 13 23:47:26.230982 containerd[1673]: time="2026-01-13T23:47:26.230683509Z" level=info msg="StartContainer for \"2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95\"" Jan 13 23:47:26.231904 containerd[1673]: time="2026-01-13T23:47:26.231863673Z" level=info msg="connecting to shim 2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95" address="unix:///run/containerd/s/7be814d07c55fe53b7e6da5a29bb7fc1dc46b4e41f1ad8aeebdfffa173f2cb39" protocol=ttrpc version=3 Jan 13 23:47:26.233281 containerd[1673]: time="2026-01-13T23:47:26.233237557Z" level=info msg="connecting to shim 85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753" address="unix:///run/containerd/s/c0b885bf5689312a191289ee4a2c53a3b16fa5d72838b3d4a8c0bc5e3bbfaace" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:26.264341 systemd[1]: Started cri-containerd-2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95.scope - libcontainer container 2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95. Jan 13 23:47:26.265587 systemd[1]: Started cri-containerd-85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753.scope - libcontainer container 85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753. 
Jan 13 23:47:26.271735 systemd-networkd[1591]: cali2fbc98a06bc: Link UP Jan 13 23:47:26.272472 systemd-networkd[1591]: cali2fbc98a06bc: Gained carrier Jan 13 23:47:26.284000 audit: BPF prog-id=216 op=LOAD Jan 13 23:47:26.286000 audit: BPF prog-id=217 op=LOAD Jan 13 23:47:26.286000 audit[4557]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.287000 audit: BPF prog-id=217 op=UNLOAD Jan 13 23:47:26.287000 audit[4557]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.287000 audit: BPF prog-id=218 op=LOAD Jan 13 23:47:26.287000 audit[4557]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.287000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.287000 audit: BPF prog-id=219 op=LOAD Jan 13 23:47:26.287000 audit[4557]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.287000 audit: BPF prog-id=219 op=UNLOAD Jan 13 23:47:26.287000 audit[4557]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.287000 audit: BPF prog-id=218 op=UNLOAD Jan 13 23:47:26.287000 audit[4557]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:26.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.288000 audit: BPF prog-id=220 op=LOAD Jan 13 23:47:26.288000 audit: BPF prog-id=221 op=LOAD Jan 13 23:47:26.288000 audit[4557]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4499 pid=4557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.288000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266373763393532343664396434356436373734356535613261346662 Jan 13 23:47:26.289000 audit: BPF prog-id=222 op=LOAD Jan 13 23:47:26.289000 audit[4573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.289000 audit: BPF prog-id=222 op=UNLOAD Jan 13 23:47:26.289000 audit[4573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:25.988 [INFO][4402] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0 calico-apiserver-68644f6664- calico-apiserver 04948dc7-2d3d-4260-8809-f8eb4aa6cc17 845 0 2026-01-13 23:46:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:68644f6664 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 calico-apiserver-68644f6664-r8m4t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2fbc98a06bc [] [] }} ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:25.989 [INFO][4402] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.028 [INFO][4463] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" HandleID="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.028 [INFO][4463] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" HandleID="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b000), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-660efdb355", "pod":"calico-apiserver-68644f6664-r8m4t", "timestamp":"2026-01-13 23:47:26.028161578 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.028 [INFO][4463] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.169 [INFO][4463] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.169 [INFO][4463] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.220 [INFO][4463] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.232 [INFO][4463] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.238 [INFO][4463] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.242 [INFO][4463] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.245 [INFO][4463] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.245 [INFO][4463] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.248 [INFO][4463] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47 Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.253 [INFO][4463] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4463] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.37.68/26] block=192.168.37.64/26 handle="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4463] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.68/26] handle="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4463] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 13 23:47:26.291303 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4463] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.68/26] IPv6=[] ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" HandleID="k8s-pod-network.15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.290000 audit: BPF prog-id=223 op=LOAD Jan 13 23:47:26.290000 audit[4573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.290000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.290000 audit: BPF prog-id=224 op=LOAD Jan 13 23:47:26.290000 audit[4573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.290000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.290000 audit: BPF prog-id=224 op=UNLOAD Jan 13 23:47:26.290000 audit[4573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.290000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.290000 audit: BPF prog-id=223 op=UNLOAD Jan 13 23:47:26.290000 audit[4573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.290000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.290000 audit: BPF prog-id=225 op=LOAD Jan 13 23:47:26.290000 audit[4573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4556 pid=4573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.290000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835653665306433613062373766323833383139643465316532663630 Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.262 [INFO][4402] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0", GenerateName:"calico-apiserver-68644f6664-", Namespace:"calico-apiserver", SelfLink:"", UID:"04948dc7-2d3d-4260-8809-f8eb4aa6cc17", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68644f6664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"calico-apiserver-68644f6664-r8m4t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.37.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2fbc98a06bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.263 [INFO][4402] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.68/32] ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.263 [INFO][4402] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2fbc98a06bc ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.273 [INFO][4402] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.275 [INFO][4402] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0", GenerateName:"calico-apiserver-68644f6664-", Namespace:"calico-apiserver", SelfLink:"", UID:"04948dc7-2d3d-4260-8809-f8eb4aa6cc17", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68644f6664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47", Pod:"calico-apiserver-68644f6664-r8m4t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.37.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2fbc98a06bc", MAC:"66:fe:bd:e4:7c:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.293525 containerd[1673]: 2026-01-13 23:47:26.288 [INFO][4402] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-r8m4t" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--r8m4t-eth0" Jan 13 23:47:26.306000 audit[4620]: NETFILTER_CFG table=filter:129 family=2 entries=58 
op=nft_register_chain pid=4620 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:26.306000 audit[4620]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=30584 a0=3 a1=fffffaceefa0 a2=0 a3=ffff9dc8bfa8 items=0 ppid=4144 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.306000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:26.318452 containerd[1673]: time="2026-01-13T23:47:26.318398134Z" level=info msg="StartContainer for \"2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95\" returns successfully" Jan 13 23:47:26.333549 containerd[1673]: time="2026-01-13T23:47:26.333442499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwjb8,Uid:48b88556-647c-4972-b2c3-222a83018169,Namespace:kube-system,Attempt:0,} returns sandbox id \"85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753\"" Jan 13 23:47:26.338129 containerd[1673]: time="2026-01-13T23:47:26.338054473Z" level=info msg="CreateContainer within sandbox \"85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 23:47:26.342299 containerd[1673]: time="2026-01-13T23:47:26.341912405Z" level=info msg="connecting to shim 15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47" address="unix:///run/containerd/s/639c0d85411b0d56981aff7e92c2402262ae8618017e899eaa3195c4a77d0be0" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:26.350260 containerd[1673]: time="2026-01-13T23:47:26.350207910Z" level=info msg="Container d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:47:26.358378 
containerd[1673]: time="2026-01-13T23:47:26.357796773Z" level=info msg="CreateContainer within sandbox \"85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888\"" Jan 13 23:47:26.359338 containerd[1673]: time="2026-01-13T23:47:26.359288497Z" level=info msg="StartContainer for \"d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888\"" Jan 13 23:47:26.361879 containerd[1673]: time="2026-01-13T23:47:26.361748105Z" level=info msg="connecting to shim d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888" address="unix:///run/containerd/s/c0b885bf5689312a191289ee4a2c53a3b16fa5d72838b3d4a8c0bc5e3bbfaace" protocol=ttrpc version=3 Jan 13 23:47:26.384286 systemd[1]: Started cri-containerd-15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47.scope - libcontainer container 15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47. Jan 13 23:47:26.388438 systemd-networkd[1591]: cali5a0585309b5: Link UP Jan 13 23:47:26.388498 systemd[1]: Started cri-containerd-d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888.scope - libcontainer container d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888. 
Jan 13 23:47:26.390395 systemd-networkd[1591]: cali5a0585309b5: Gained carrier Jan 13 23:47:26.411000 audit: BPF prog-id=226 op=LOAD Jan 13 23:47:26.411000 audit: BPF prog-id=227 op=LOAD Jan 13 23:47:26.411000 audit[4652]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.411000 audit: BPF prog-id=227 op=UNLOAD Jan 13 23:47:26.411000 audit[4652]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.412000 audit: BPF prog-id=228 op=LOAD Jan 13 23:47:26.412000 audit[4652]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.412000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.412000 audit: BPF prog-id=229 op=LOAD Jan 13 23:47:26.412000 audit[4652]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.412000 audit: BPF prog-id=229 op=UNLOAD Jan 13 23:47:26.412000 audit[4652]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.412000 audit: BPF prog-id=228 op=UNLOAD Jan 13 23:47:26.412000 audit[4652]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:26.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.412000 audit: BPF prog-id=230 op=LOAD Jan 13 23:47:26.412000 audit[4652]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4642 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135656165323432663561353734366433666538313630303432613634 Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:25.964 [INFO][4366] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0 calico-kube-controllers-7b4b69d7d6- calico-system c4d38e9b-73ce-46dd-9acb-61df83d528d1 848 0 2026-01-13 23:46:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b4b69d7d6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 calico-kube-controllers-7b4b69d7d6-mtxnp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5a0585309b5 [] [] }} ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" 
WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:25.965 [INFO][4366] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.032 [INFO][4447] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" HandleID="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.032 [INFO][4447] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" HandleID="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004cea0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"calico-kube-controllers-7b4b69d7d6-mtxnp", "timestamp":"2026-01-13 23:47:26.032040989 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.032 [INFO][4447] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4447] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.260 [INFO][4447] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.320 [INFO][4447] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.333 [INFO][4447] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.343 [INFO][4447] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.345 [INFO][4447] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.351 [INFO][4447] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.351 [INFO][4447] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.353 [INFO][4447] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.361 [INFO][4447] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" 
host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.377 [INFO][4447] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.37.69/26] block=192.168.37.64/26 handle="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.377 [INFO][4447] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.69/26] handle="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.377 [INFO][4447] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 13 23:47:26.414933 containerd[1673]: 2026-01-13 23:47:26.381 [INFO][4447] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.69/26] IPv6=[] ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" HandleID="k8s-pod-network.f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.384 [INFO][4366] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0", GenerateName:"calico-kube-controllers-7b4b69d7d6-", Namespace:"calico-system", SelfLink:"", UID:"c4d38e9b-73ce-46dd-9acb-61df83d528d1", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 58, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b4b69d7d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"calico-kube-controllers-7b4b69d7d6-mtxnp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.37.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5a0585309b5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.384 [INFO][4366] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.69/32] ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.384 [INFO][4366] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a0585309b5 ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.390 [INFO][4366] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.393 [INFO][4366] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0", GenerateName:"calico-kube-controllers-7b4b69d7d6-", Namespace:"calico-system", SelfLink:"", UID:"c4d38e9b-73ce-46dd-9acb-61df83d528d1", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b4b69d7d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce", Pod:"calico-kube-controllers-7b4b69d7d6-mtxnp", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.37.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5a0585309b5", MAC:"6e:b2:e2:55:99:9a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.415529 containerd[1673]: 2026-01-13 23:47:26.411 [INFO][4366] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" Namespace="calico-system" Pod="calico-kube-controllers-7b4b69d7d6-mtxnp" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--kube--controllers--7b4b69d7d6--mtxnp-eth0" Jan 13 23:47:26.420000 audit: BPF prog-id=231 op=LOAD Jan 13 23:47:26.422000 audit: BPF prog-id=232 op=LOAD Jan 13 23:47:26.422000 audit[4659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.422000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.422000 audit: BPF prog-id=232 op=UNLOAD Jan 13 23:47:26.422000 audit[4659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.422000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.422000 audit: BPF prog-id=233 op=LOAD Jan 13 23:47:26.422000 audit[4659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.422000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.423000 audit: BPF prog-id=234 op=LOAD Jan 13 23:47:26.423000 audit[4659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.423000 audit: BPF prog-id=234 op=UNLOAD Jan 13 23:47:26.423000 audit[4659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 13 23:47:26.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.423000 audit: BPF prog-id=233 op=UNLOAD Jan 13 23:47:26.423000 audit[4659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.423000 audit: BPF prog-id=235 op=LOAD Jan 13 23:47:26.423000 audit[4659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4556 pid=4659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430396661653265643236666263323938643139663239373363336666 Jan 13 23:47:26.445000 audit[4707]: NETFILTER_CFG table=filter:130 family=2 entries=48 op=nft_register_chain pid=4707 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:26.445000 audit[4707]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23140 a0=3 a1=fffffab24ef0 a2=0 a3=ffff97dd8fa8 items=0 
ppid=4144 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.445000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:26.460766 containerd[1673]: time="2026-01-13T23:47:26.460716443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-r8m4t,Uid:04948dc7-2d3d-4260-8809-f8eb4aa6cc17,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47\"" Jan 13 23:47:26.463285 containerd[1673]: time="2026-01-13T23:47:26.463054490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:47:26.472028 containerd[1673]: time="2026-01-13T23:47:26.471909397Z" level=info msg="StartContainer for \"d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888\" returns successfully" Jan 13 23:47:26.472028 containerd[1673]: time="2026-01-13T23:47:26.471962957Z" level=info msg="connecting to shim f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce" address="unix:///run/containerd/s/66c401924567151fb5bacdd97ad251d064ae3d7193f9a699e7bf7b03f5d83f66" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:26.492152 systemd-networkd[1591]: cali2a9f045e02a: Link UP Jan 13 23:47:26.493274 systemd-networkd[1591]: cali2a9f045e02a: Gained carrier Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:25.985 [INFO][4399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0 goldmane-666569f655- calico-system 7ddfa231-61f2-4ab3-bd34-82f93616c2de 851 0 2026-01-13 23:46:55 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane 
pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 goldmane-666569f655-xghpj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2a9f045e02a [] [] }} ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:25.985 [INFO][4399] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.037 [INFO][4461] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" HandleID="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Workload="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.038 [INFO][4461] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" HandleID="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Workload="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"goldmane-666569f655-xghpj", "timestamp":"2026-01-13 23:47:26.037973687 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.038 [INFO][4461] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.377 [INFO][4461] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.378 [INFO][4461] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.423 [INFO][4461] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.436 [INFO][4461] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.445 [INFO][4461] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.449 [INFO][4461] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.454 [INFO][4461] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.454 [INFO][4461] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.458 [INFO][4461] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003 Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.466 [INFO][4461] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.485 [INFO][4461] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.37.70/26] block=192.168.37.64/26 handle="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.485 [INFO][4461] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.70/26] handle="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.485 [INFO][4461] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 13 23:47:26.515889 containerd[1673]: 2026-01-13 23:47:26.485 [INFO][4461] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.70/26] IPv6=[] ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" HandleID="k8s-pod-network.c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Workload="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.489 [INFO][4399] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"7ddfa231-61f2-4ab3-bd34-82f93616c2de", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"goldmane-666569f655-xghpj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.37.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2a9f045e02a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.489 [INFO][4399] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.70/32] ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.489 [INFO][4399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2a9f045e02a ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.493 [INFO][4399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.494 [INFO][4399] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0", GenerateName:"goldmane-666569f655-", 
Namespace:"calico-system", SelfLink:"", UID:"7ddfa231-61f2-4ab3-bd34-82f93616c2de", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003", Pod:"goldmane-666569f655-xghpj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.37.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2a9f045e02a", MAC:"c6:49:26:93:56:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:26.516438 containerd[1673]: 2026-01-13 23:47:26.511 [INFO][4399] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" Namespace="calico-system" Pod="goldmane-666569f655-xghpj" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-goldmane--666569f655--xghpj-eth0" Jan 13 23:47:26.521292 systemd[1]: Started cri-containerd-f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce.scope - libcontainer container f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce. 
Jan 13 23:47:26.531000 audit: BPF prog-id=236 op=LOAD Jan 13 23:47:26.532000 audit: BPF prog-id=237 op=LOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=237 op=UNLOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=238 op=LOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=239 op=LOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=239 op=UNLOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=238 op=UNLOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.532000 audit: BPF prog-id=240 op=LOAD Jan 13 23:47:26.532000 audit[4743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4731 pid=4743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6636323564623631393639303037373430633133636336313733353762 Jan 13 23:47:26.549381 containerd[1673]: time="2026-01-13T23:47:26.549286431Z" level=info msg="connecting to shim c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003" address="unix:///run/containerd/s/a2a3be51a9596473585165c75fcc58926ff495124671ce84727ca85b0da50237" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:26.569189 containerd[1673]: time="2026-01-13T23:47:26.569035170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b4b69d7d6-mtxnp,Uid:c4d38e9b-73ce-46dd-9acb-61df83d528d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce\"" Jan 13 23:47:26.588344 systemd[1]: Started cri-containerd-c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003.scope - libcontainer container c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003. 
Jan 13 23:47:26.599000 audit: BPF prog-id=241 op=LOAD Jan 13 23:47:26.600000 audit: BPF prog-id=242 op=LOAD Jan 13 23:47:26.600000 audit[4798]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.600000 audit: BPF prog-id=242 op=UNLOAD Jan 13 23:47:26.600000 audit[4798]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.600000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.600000 audit: BPF prog-id=243 op=LOAD Jan 13 23:47:26.600000 audit[4798]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.600000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.601000 audit: BPF prog-id=244 op=LOAD Jan 13 23:47:26.601000 audit[4798]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.601000 audit: BPF prog-id=244 op=UNLOAD Jan 13 23:47:26.601000 audit[4798]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.601000 audit: BPF prog-id=243 op=UNLOAD Jan 13 23:47:26.601000 audit[4798]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:26.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.601000 audit: BPF prog-id=245 op=LOAD Jan 13 23:47:26.601000 audit[4798]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4786 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331346636323462363261396233383835656463333362656566626261 Jan 13 23:47:26.635524 containerd[1673]: time="2026-01-13T23:47:26.635044609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-xghpj,Uid:7ddfa231-61f2-4ab3-bd34-82f93616c2de,Namespace:calico-system,Attempt:0,} returns sandbox id \"c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003\"" Jan 13 23:47:26.641000 audit[4831]: NETFILTER_CFG table=filter:131 family=2 entries=66 op=nft_register_chain pid=4831 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:26.641000 audit[4831]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=32784 a0=3 a1=ffffc4d0fc10 a2=0 a3=ffff8250efa8 items=0 ppid=4144 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:26.641000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:26.787132 containerd[1673]: time="2026-01-13T23:47:26.786872188Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:26.791651 containerd[1673]: time="2026-01-13T23:47:26.791595882Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:47:26.791889 containerd[1673]: time="2026-01-13T23:47:26.791751402Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:26.792037 kubelet[2907]: E0113 23:47:26.791972 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:26.792037 kubelet[2907]: E0113 23:47:26.792022 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:26.792351 kubelet[2907]: E0113 23:47:26.792244 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghb74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:26.792846 containerd[1673]: time="2026-01-13T23:47:26.792380724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 13 23:47:26.793975 kubelet[2907]: E0113 23:47:26.793925 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:27.030993 kubelet[2907]: E0113 23:47:27.030954 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:27.050083 kubelet[2907]: I0113 23:47:27.049937 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-mwjb8" podStartSLOduration=46.049918741 podStartE2EDuration="46.049918741s" podCreationTimestamp="2026-01-13 23:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:47:27.038043826 +0000 UTC m=+51.298359433" watchObservedRunningTime="2026-01-13 23:47:27.049918741 +0000 UTC m=+51.310234308" Jan 13 
23:47:27.053000 audit[4833]: NETFILTER_CFG table=filter:132 family=2 entries=20 op=nft_register_rule pid=4833 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:27.053000 audit[4833]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd8d73790 a2=0 a3=1 items=0 ppid=3061 pid=4833 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:27.053000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:27.061000 audit[4833]: NETFILTER_CFG table=nat:133 family=2 entries=14 op=nft_register_rule pid=4833 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:27.061000 audit[4833]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd8d73790 a2=0 a3=1 items=0 ppid=3061 pid=4833 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:27.061000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:27.078610 kubelet[2907]: I0113 23:47:27.078538 2907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-wmjgt" podStartSLOduration=46.078517188 podStartE2EDuration="46.078517188s" podCreationTimestamp="2026-01-13 23:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-13 23:47:27.077200104 +0000 UTC m=+51.337515711" watchObservedRunningTime="2026-01-13 23:47:27.078517188 +0000 UTC m=+51.338832755" Jan 13 23:47:27.082000 audit[4835]: NETFILTER_CFG table=filter:134 family=2 
entries=17 op=nft_register_rule pid=4835 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:27.082000 audit[4835]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffef1da980 a2=0 a3=1 items=0 ppid=3061 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:27.082000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:27.089000 audit[4835]: NETFILTER_CFG table=nat:135 family=2 entries=35 op=nft_register_chain pid=4835 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:27.089000 audit[4835]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffef1da980 a2=0 a3=1 items=0 ppid=3061 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:27.089000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:27.118825 containerd[1673]: time="2026-01-13T23:47:27.118772909Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:27.120256 containerd[1673]: time="2026-01-13T23:47:27.120123713Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 13 23:47:27.120256 containerd[1673]: time="2026-01-13T23:47:27.120174313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes 
read=0" Jan 13 23:47:27.120577 kubelet[2907]: E0113 23:47:27.120334 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:47:27.120577 kubelet[2907]: E0113 23:47:27.120377 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:47:27.120773 kubelet[2907]: E0113 23:47:27.120588 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Na
me:kube-api-access-8sknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:27.121003 containerd[1673]: time="2026-01-13T23:47:27.120969596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 13 23:47:27.122109 kubelet[2907]: E0113 23:47:27.122081 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:47:27.300391 systemd-networkd[1591]: cali7d6615dac1f: Gained IPv6LL Jan 13 23:47:27.429269 systemd-networkd[1591]: calid87d99b48c8: Gained IPv6LL Jan 13 23:47:27.457515 containerd[1673]: time="2026-01-13T23:47:27.457372291Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:27.458793 containerd[1673]: time="2026-01-13T23:47:27.458650455Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 13 23:47:27.458793 containerd[1673]: time="2026-01-13T23:47:27.458715415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:27.458921 kubelet[2907]: E0113 23:47:27.458891 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:47:27.458957 kubelet[2907]: E0113 23:47:27.458935 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:47:27.459182 kubelet[2907]: E0113 
23:47:27.459133 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7lw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:27.460479 kubelet[2907]: E0113 23:47:27.460416 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:47:27.685355 systemd-networkd[1591]: cali5a0585309b5: Gained IPv6LL Jan 13 23:47:27.880229 containerd[1673]: time="2026-01-13T23:47:27.879658486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-bwfss,Uid:c47b5326-d31f-4680-9c2d-bdd28d584c69,Namespace:calico-apiserver,Attempt:0,}" Jan 13 23:47:27.880229 containerd[1673]: 
time="2026-01-13T23:47:27.880093807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p6cc5,Uid:89f76643-e37a-4094-9d85-ab46009d2c90,Namespace:calico-system,Attempt:0,}" Jan 13 23:47:28.006732 systemd-networkd[1591]: cali66c2ee16355: Link UP Jan 13 23:47:28.006977 systemd-networkd[1591]: cali66c2ee16355: Gained carrier Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.938 [INFO][4849] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0 calico-apiserver-68644f6664- calico-apiserver c47b5326-d31f-4680-9c2d-bdd28d584c69 850 0 2026-01-13 23:46:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:68644f6664 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 calico-apiserver-68644f6664-bwfss eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali66c2ee16355 [] [] }} ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.938 [INFO][4849] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4866] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" HandleID="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4866] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" HandleID="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000492f80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-660efdb355", "pod":"calico-apiserver-68644f6664-bwfss", "timestamp":"2026-01-13 23:47:27.963667779 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4866] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4866] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4866] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.974 [INFO][4866] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.978 [INFO][4866] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.981 [INFO][4866] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.983 [INFO][4866] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.986 [INFO][4866] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.986 [INFO][4866] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.987 [INFO][4866] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.991 [INFO][4866] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.998 [INFO][4866] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.37.71/26] block=192.168.37.64/26 handle="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.998 [INFO][4866] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.71/26] handle="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.998 [INFO][4866] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 13 23:47:28.022039 containerd[1673]: 2026-01-13 23:47:27.999 [INFO][4866] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.71/26] IPv6=[] ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" HandleID="k8s-pod-network.9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Workload="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.001 [INFO][4849] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0", GenerateName:"calico-apiserver-68644f6664-", Namespace:"calico-apiserver", SelfLink:"", UID:"c47b5326-d31f-4680-9c2d-bdd28d584c69", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68644f6664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"calico-apiserver-68644f6664-bwfss", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.37.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali66c2ee16355", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.001 [INFO][4849] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.71/32] ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.001 [INFO][4849] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66c2ee16355 ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.006 [INFO][4849] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" 
Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.006 [INFO][4849] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0", GenerateName:"calico-apiserver-68644f6664-", Namespace:"calico-apiserver", SelfLink:"", UID:"c47b5326-d31f-4680-9c2d-bdd28d584c69", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68644f6664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac", Pod:"calico-apiserver-68644f6664-bwfss", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.37.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali66c2ee16355", MAC:"26:dc:75:0d:39:c5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:28.022799 containerd[1673]: 2026-01-13 23:47:28.019 [INFO][4849] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" Namespace="calico-apiserver" Pod="calico-apiserver-68644f6664-bwfss" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-calico--apiserver--68644f6664--bwfss-eth0" Jan 13 23:47:28.034009 kubelet[2907]: E0113 23:47:28.033822 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:28.034009 kubelet[2907]: E0113 23:47:28.033941 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:47:28.034674 kubelet[2907]: E0113 23:47:28.034093 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:47:28.033000 audit[4893]: NETFILTER_CFG table=filter:136 family=2 entries=53 op=nft_register_chain pid=4893 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:28.033000 audit[4893]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26624 a0=3 a1=fffff9149760 a2=0 a3=ffff9f6a4fa8 items=0 ppid=4144 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.033000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:28.056274 containerd[1673]: time="2026-01-13T23:47:28.056114658Z" level=info msg="connecting to shim 9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac" address="unix:///run/containerd/s/fded2e4a4717bd30462e428e43d8363557a17a4381a05353436f16568d6e400f" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:28.064000 audit[4915]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=4915 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:28.064000 audit[4915]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd2c01120 a2=0 a3=1 items=0 ppid=3061 pid=4915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.064000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:28.077000 audit[4915]: NETFILTER_CFG table=nat:138 family=2 entries=56 op=nft_register_chain pid=4915 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:28.077000 audit[4915]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffd2c01120 a2=0 a3=1 items=0 ppid=3061 pid=4915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.077000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:28.092360 systemd[1]: Started cri-containerd-9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac.scope - libcontainer container 9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac. 
Jan 13 23:47:28.102000 audit: BPF prog-id=246 op=LOAD Jan 13 23:47:28.103000 audit: BPF prog-id=247 op=LOAD Jan 13 23:47:28.103000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.103000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.103000 audit: BPF prog-id=247 op=UNLOAD Jan 13 23:47:28.103000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.103000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.104000 audit: BPF prog-id=248 op=LOAD Jan 13 23:47:28.104000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.104000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.104000 audit: BPF prog-id=249 op=LOAD Jan 13 23:47:28.104000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.104000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.104000 audit: BPF prog-id=249 op=UNLOAD Jan 13 23:47:28.104000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.104000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.104000 audit: BPF prog-id=248 op=UNLOAD Jan 13 23:47:28.104000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:28.104000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.104000 audit: BPF prog-id=250 op=LOAD Jan 13 23:47:28.104000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4901 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.104000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963386336656264383865653965313133316363626630366433306464 Jan 13 23:47:28.117710 systemd-networkd[1591]: cali530fd8a0ef0: Link UP Jan 13 23:47:28.118128 systemd-networkd[1591]: cali530fd8a0ef0: Gained carrier Jan 13 23:47:28.136018 containerd[1673]: time="2026-01-13T23:47:28.135872299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68644f6664-bwfss,Uid:c47b5326-d31f-4680-9c2d-bdd28d584c69,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac\"" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.938 [INFO][4837] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0 csi-node-driver- calico-system 89f76643-e37a-4094-9d85-ab46009d2c90 736 0 2026-01-13 23:46:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-n-660efdb355 csi-node-driver-p6cc5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali530fd8a0ef0 [] [] }} ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.938 [INFO][4837] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4868] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" HandleID="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Workload="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4868] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" HandleID="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Workload="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-660efdb355", "pod":"csi-node-driver-p6cc5", "timestamp":"2026-01-13 23:47:27.963660259 +0000 UTC"}, Hostname:"ci-4547-0-0-n-660efdb355", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.963 [INFO][4868] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.999 [INFO][4868] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:27.999 [INFO][4868] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-660efdb355' Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.076 [INFO][4868] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.083 [INFO][4868] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.093 [INFO][4868] ipam/ipam.go 511: Trying affinity for 192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.095 [INFO][4868] ipam/ipam.go 158: Attempting to load block cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.098 [INFO][4868] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.37.64/26 host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.098 [INFO][4868] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.37.64/26 handle="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.100 [INFO][4868] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151 Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.106 [INFO][4868] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.37.64/26 handle="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.113 [INFO][4868] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.37.72/26] block=192.168.37.64/26 handle="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.113 [INFO][4868] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.37.72/26] handle="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" host="ci-4547-0-0-n-660efdb355" Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.113 [INFO][4868] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 13 23:47:28.136979 containerd[1673]: 2026-01-13 23:47:28.114 [INFO][4868] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.37.72/26] IPv6=[] ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" HandleID="k8s-pod-network.243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Workload="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.116 [INFO][4837] cni-plugin/k8s.go 418: Populated endpoint ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"89f76643-e37a-4094-9d85-ab46009d2c90", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"", Pod:"csi-node-driver-p6cc5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.37.72/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali530fd8a0ef0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.116 [INFO][4837] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.37.72/32] ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.116 [INFO][4837] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali530fd8a0ef0 ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.118 [INFO][4837] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.119 [INFO][4837] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"89f76643-e37a-4094-9d85-ab46009d2c90", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2026, time.January, 13, 23, 46, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-660efdb355", ContainerID:"243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151", Pod:"csi-node-driver-p6cc5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.37.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali530fd8a0ef0", MAC:"1e:a8:e3:92:73:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 13 23:47:28.138657 containerd[1673]: 2026-01-13 23:47:28.133 [INFO][4837] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" Namespace="calico-system" Pod="csi-node-driver-p6cc5" WorkloadEndpoint="ci--4547--0--0--n--660efdb355-k8s-csi--node--driver--p6cc5-eth0" Jan 13 23:47:28.139243 containerd[1673]: time="2026-01-13T23:47:28.139046828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:47:28.150000 audit[4951]: NETFILTER_CFG table=filter:139 family=2 entries=62 
op=nft_register_chain pid=4951 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 13 23:47:28.150000 audit[4951]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28352 a0=3 a1=ffffe460e670 a2=0 a3=ffffbdbecfa8 items=0 ppid=4144 pid=4951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.150000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 13 23:47:28.166003 containerd[1673]: time="2026-01-13T23:47:28.165961310Z" level=info msg="connecting to shim 243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151" address="unix:///run/containerd/s/b5cef61e0ccd07381392ccabae381acf2f7278afd4c57ef1fbd963dab634df41" namespace=k8s.io protocol=ttrpc version=3 Jan 13 23:47:28.193336 systemd[1]: Started cri-containerd-243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151.scope - libcontainer container 243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151. 
Jan 13 23:47:28.206000 audit: BPF prog-id=251 op=LOAD Jan 13 23:47:28.207000 audit: BPF prog-id=252 op=LOAD Jan 13 23:47:28.207000 audit[4972]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.207000 audit: BPF prog-id=252 op=UNLOAD Jan 13 23:47:28.207000 audit[4972]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.207000 audit: BPF prog-id=253 op=LOAD Jan 13 23:47:28.207000 audit[4972]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.207000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.207000 audit: BPF prog-id=254 op=LOAD Jan 13 23:47:28.207000 audit[4972]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.208000 audit: BPF prog-id=254 op=UNLOAD Jan 13 23:47:28.208000 audit[4972]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.208000 audit: BPF prog-id=253 op=UNLOAD Jan 13 23:47:28.208000 audit[4972]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:47:28.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.208000 audit: BPF prog-id=255 op=LOAD Jan 13 23:47:28.208000 audit[4972]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4960 pid=4972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:28.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234333631333337316261613433666235656331343663393362653537 Jan 13 23:47:28.226759 containerd[1673]: time="2026-01-13T23:47:28.226721973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p6cc5,Uid:89f76643-e37a-4094-9d85-ab46009d2c90,Namespace:calico-system,Attempt:0,} returns sandbox id \"243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151\"" Jan 13 23:47:28.260309 systemd-networkd[1591]: cali2fbc98a06bc: Gained IPv6LL Jan 13 23:47:28.324435 systemd-networkd[1591]: cali2a9f045e02a: Gained IPv6LL Jan 13 23:47:28.470099 containerd[1673]: time="2026-01-13T23:47:28.469842827Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:28.471247 containerd[1673]: time="2026-01-13T23:47:28.471114871Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:47:28.471247 
containerd[1673]: time="2026-01-13T23:47:28.471196951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:28.471820 kubelet[2907]: E0113 23:47:28.471783 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:28.471875 kubelet[2907]: E0113 23:47:28.471835 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:28.472186 kubelet[2907]: E0113 23:47:28.472108 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hnlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:28.472492 containerd[1673]: time="2026-01-13T23:47:28.472411235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 13 23:47:28.473794 kubelet[2907]: E0113 23:47:28.473730 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:28.813233 containerd[1673]: time="2026-01-13T23:47:28.813134823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:28.814733 containerd[1673]: time="2026-01-13T23:47:28.814693028Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 13 23:47:28.814780 containerd[1673]: time="2026-01-13T23:47:28.814729468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:28.814979 kubelet[2907]: E0113 23:47:28.814920 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:47:28.815052 kubelet[2907]: E0113 23:47:28.814974 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:47:28.815204 kubelet[2907]: E0113 23:47:28.815118 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:28.817396 containerd[1673]: time="2026-01-13T23:47:28.817365116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 13 23:47:29.039088 kubelet[2907]: E0113 23:47:29.039005 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:29.059000 audit[5003]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=5003 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 13 23:47:29.059000 audit[5003]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe4cd1510 a2=0 a3=1 items=0 ppid=3061 pid=5003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:29.059000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:29.069000 audit[5003]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5003 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 13 23:47:29.069000 audit[5003]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe4cd1510 a2=0 a3=1 items=0 ppid=3061 pid=5003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:47:29.069000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 13 23:47:29.140770 containerd[1673]: time="2026-01-13T23:47:29.140681771Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:29.142304 containerd[1673]: time="2026-01-13T23:47:29.142255576Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 13 23:47:29.142358 containerd[1673]: time="2026-01-13T23:47:29.142302376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:29.142548 kubelet[2907]: E0113 23:47:29.142505 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:47:29.142609 kubelet[2907]: E0113 23:47:29.142558 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:47:29.142711 kubelet[2907]: E0113 23:47:29.142673 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:29.144179 kubelet[2907]: E0113 23:47:29.144115 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:29.220366 systemd-networkd[1591]: cali530fd8a0ef0: Gained IPv6LL Jan 13 23:47:29.924190 systemd-networkd[1591]: cali66c2ee16355: Gained IPv6LL Jan 13 23:47:30.040225 kubelet[2907]: E0113 23:47:30.040174 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:30.040942 kubelet[2907]: E0113 23:47:30.040911 2907 pod_workers.go:1301] "Error syncing 
pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:37.882491 containerd[1673]: time="2026-01-13T23:47:37.882227234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 13 23:47:38.208290 containerd[1673]: time="2026-01-13T23:47:38.208164498Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:38.210235 containerd[1673]: time="2026-01-13T23:47:38.210116144Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 13 23:47:38.210417 containerd[1673]: time="2026-01-13T23:47:38.210160424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:38.210581 kubelet[2907]: E0113 23:47:38.210340 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:47:38.210581 kubelet[2907]: E0113 23:47:38.210408 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:47:38.210581 kubelet[2907]: E0113 23:47:38.210520 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b094e94e970498e92a984780514b4e2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:38.213048 containerd[1673]: time="2026-01-13T23:47:38.212813832Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 13 23:47:38.540669 containerd[1673]: time="2026-01-13T23:47:38.540589621Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:38.542563 containerd[1673]: time="2026-01-13T23:47:38.542500027Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 13 23:47:38.542622 containerd[1673]: time="2026-01-13T23:47:38.542511507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:38.542858 kubelet[2907]: E0113 23:47:38.542785 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:47:38.542858 kubelet[2907]: E0113 23:47:38.542845 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:47:38.542989 kubelet[2907]: E0113 23:47:38.542952 2907 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:38.544332 kubelet[2907]: E0113 23:47:38.544278 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:47:38.879420 containerd[1673]: time="2026-01-13T23:47:38.879304523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 13 23:47:39.208391 containerd[1673]: time="2026-01-13T23:47:39.208168956Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:39.209761 containerd[1673]: time="2026-01-13T23:47:39.209707640Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 13 23:47:39.209820 containerd[1673]: time="2026-01-13T23:47:39.209778641Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:39.210342 kubelet[2907]: E0113 23:47:39.210099 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:47:39.210342 kubelet[2907]: E0113 23:47:39.210148 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:47:39.210342 kubelet[2907]: E0113 23:47:39.210292 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7lw,ReadOnly:true,MountPath:/var/run/secrets/kubernete
s.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:39.211656 kubelet[2907]: E0113 23:47:39.211628 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:47:41.880753 containerd[1673]: time="2026-01-13T23:47:41.880712742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:47:42.235951 containerd[1673]: time="2026-01-13T23:47:42.235684173Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:42.239334 containerd[1673]: time="2026-01-13T23:47:42.239256264Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:47:42.239494 containerd[1673]: time="2026-01-13T23:47:42.239285784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:42.239631 kubelet[2907]: E0113 23:47:42.239578 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:42.239631 kubelet[2907]: E0113 23:47:42.239628 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:42.239981 kubelet[2907]: E0113 23:47:42.239883 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hnlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:42.240098 containerd[1673]: time="2026-01-13T23:47:42.239908386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:47:42.241489 kubelet[2907]: E0113 23:47:42.241406 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:42.565978 containerd[1673]: time="2026-01-13T23:47:42.565916250Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:42.567366 containerd[1673]: time="2026-01-13T23:47:42.567294294Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:47:42.567436 containerd[1673]: time="2026-01-13T23:47:42.567372374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:42.567534 kubelet[2907]: E0113 23:47:42.567497 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:42.567580 kubelet[2907]: E0113 23:47:42.567543 2907 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:47:42.567849 kubelet[2907]: E0113 23:47:42.567758 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghb74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:42.568097 containerd[1673]: time="2026-01-13T23:47:42.567990696Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 13 23:47:42.569266 kubelet[2907]: E0113 23:47:42.569169 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:42.892141 containerd[1673]: time="2026-01-13T23:47:42.891790033Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 
23:47:42.894076 containerd[1673]: time="2026-01-13T23:47:42.893956120Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 13 23:47:42.894076 containerd[1673]: time="2026-01-13T23:47:42.894000160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:42.894285 kubelet[2907]: E0113 23:47:42.894187 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:47:42.894285 kubelet[2907]: E0113 23:47:42.894275 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:47:42.894482 kubelet[2907]: E0113 23:47:42.894435 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:42.895647 kubelet[2907]: E0113 23:47:42.895604 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:47:44.881192 containerd[1673]: time="2026-01-13T23:47:44.880998797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 13 23:47:45.221869 containerd[1673]: time="2026-01-13T23:47:45.221736865Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 
23:47:45.223285 containerd[1673]: time="2026-01-13T23:47:45.223178510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 13 23:47:45.223285 containerd[1673]: time="2026-01-13T23:47:45.223227310Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:45.223527 kubelet[2907]: E0113 23:47:45.223489 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:47:45.224096 kubelet[2907]: E0113 23:47:45.223843 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:47:45.224096 kubelet[2907]: E0113 23:47:45.224003 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 13 23:47:45.225864 containerd[1673]: time="2026-01-13T23:47:45.225773838Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 13 23:47:45.560133 containerd[1673]: time="2026-01-13T23:47:45.559956726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:47:45.561316 containerd[1673]: time="2026-01-13T23:47:45.561285810Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 13 23:47:45.561449 containerd[1673]: time="2026-01-13T23:47:45.561351810Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 13 23:47:45.561513 kubelet[2907]: E0113 23:47:45.561455 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:47:45.561582 kubelet[2907]: E0113 23:47:45.561515 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:47:45.561875 kubelet[2907]: E0113 23:47:45.561623 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 13 23:47:45.562896 kubelet[2907]: E0113 23:47:45.562770 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:47:49.881537 kubelet[2907]: E0113 23:47:49.881479 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:47:53.884400 kubelet[2907]: E0113 23:47:53.884354 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:47:53.884999 kubelet[2907]: E0113 23:47:53.884354 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:47:53.884999 kubelet[2907]: E0113 23:47:53.884385 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:47:54.879442 kubelet[2907]: E0113 23:47:54.879370 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:47:57.880354 kubelet[2907]: E0113 23:47:57.880112 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:48:00.880088 containerd[1673]: time="2026-01-13T23:48:00.879672762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 13 23:48:01.224717 containerd[1673]: time="2026-01-13T23:48:01.224511163Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:01.226916 containerd[1673]: time="2026-01-13T23:48:01.226798009Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 13 23:48:01.227019 containerd[1673]: time="2026-01-13T23:48:01.226897690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:01.227148 kubelet[2907]: E0113 
23:48:01.227097 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:48:01.227445 kubelet[2907]: E0113 23:48:01.227154 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:48:01.227445 kubelet[2907]: E0113 23:48:01.227267 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b094e94e970498e92a984780514b4e2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,Localh
ostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:01.229617 containerd[1673]: time="2026-01-13T23:48:01.229405937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 13 23:48:01.580886 containerd[1673]: time="2026-01-13T23:48:01.580673357Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:01.581969 containerd[1673]: time="2026-01-13T23:48:01.581936641Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 13 23:48:01.582125 containerd[1673]: time="2026-01-13T23:48:01.581979881Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:01.582384 kubelet[2907]: E0113 23:48:01.582337 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:48:01.582441 kubelet[2907]: E0113 23:48:01.582393 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:48:01.582542 kubelet[2907]: E0113 23:48:01.582507 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessag
ePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:01.583819 kubelet[2907]: E0113 23:48:01.583774 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:48:05.880576 containerd[1673]: time="2026-01-13T23:48:05.880326374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 13 23:48:06.214744 containerd[1673]: time="2026-01-13T23:48:06.214348182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:06.215831 containerd[1673]: time="2026-01-13T23:48:06.215798106Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 13 23:48:06.216040 containerd[1673]: time="2026-01-13T23:48:06.215892187Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:06.216277 kubelet[2907]: E0113 23:48:06.216193 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:48:06.216277 kubelet[2907]: E0113 23:48:06.216270 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:48:06.216636 kubelet[2907]: E0113 23:48:06.216397 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7lw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:06.217536 kubelet[2907]: E0113 23:48:06.217501 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:48:06.880947 containerd[1673]: time="2026-01-13T23:48:06.880700393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:48:07.223543 containerd[1673]: time="2026-01-13T23:48:07.223310387Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:07.225171 containerd[1673]: time="2026-01-13T23:48:07.224995512Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:48:07.225527 containerd[1673]: time="2026-01-13T23:48:07.225161513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:07.225656 kubelet[2907]: E0113 23:48:07.225604 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:07.225942 kubelet[2907]: E0113 23:48:07.225661 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:07.226414 kubelet[2907]: E0113 23:48:07.226099 2907 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghb74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:07.226911 containerd[1673]: time="2026-01-13T23:48:07.226269436Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 13 23:48:07.227556 kubelet[2907]: E0113 23:48:07.227512 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:48:07.569422 containerd[1673]: time="2026-01-13T23:48:07.569340472Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 
23:48:07.571141 containerd[1673]: time="2026-01-13T23:48:07.571103957Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 13 23:48:07.571319 containerd[1673]: time="2026-01-13T23:48:07.571176077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:07.571393 kubelet[2907]: E0113 23:48:07.571334 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:48:07.571465 kubelet[2907]: E0113 23:48:07.571396 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:48:07.571893 kubelet[2907]: E0113 23:48:07.571518 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:07.572773 kubelet[2907]: E0113 23:48:07.572730 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:48:07.881422 containerd[1673]: time="2026-01-13T23:48:07.881308093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:48:08.216144 containerd[1673]: time="2026-01-13T23:48:08.215743942Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 
23:48:08.221512 containerd[1673]: time="2026-01-13T23:48:08.221438560Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:48:08.221619 containerd[1673]: time="2026-01-13T23:48:08.221465360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:08.221789 kubelet[2907]: E0113 23:48:08.221758 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:08.221897 kubelet[2907]: E0113 23:48:08.221881 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:08.222154 kubelet[2907]: E0113 23:48:08.222109 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hnlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:08.223443 kubelet[2907]: E0113 23:48:08.223406 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:48:09.880361 containerd[1673]: time="2026-01-13T23:48:09.880321806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 13 23:48:10.189197 containerd[1673]: time="2026-01-13T23:48:10.188786257Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:10.190540 containerd[1673]: time="2026-01-13T23:48:10.190487622Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 13 23:48:10.190603 containerd[1673]: time="2026-01-13T23:48:10.190560582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:10.190799 kubelet[2907]: E0113 23:48:10.190752 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:48:10.191310 kubelet[2907]: E0113 23:48:10.191126 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:48:10.191310 kubelet[2907]: E0113 23:48:10.191257 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:10.193283 containerd[1673]: time="2026-01-13T23:48:10.193250271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 13 23:48:10.513084 containerd[1673]: time="2026-01-13T23:48:10.512997036Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:10.514394 containerd[1673]: time="2026-01-13T23:48:10.514339160Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 13 23:48:10.514435 containerd[1673]: time="2026-01-13T23:48:10.514383840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:10.514609 kubelet[2907]: E0113 23:48:10.514560 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:48:10.514658 kubelet[2907]: E0113 23:48:10.514614 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:48:10.514758 kubelet[2907]: E0113 23:48:10.514721 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:10.516221 kubelet[2907]: E0113 23:48:10.516170 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:48:16.880966 kubelet[2907]: E0113 23:48:16.880891 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:48:17.880176 kubelet[2907]: E0113 23:48:17.880126 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:48:18.880051 kubelet[2907]: E0113 23:48:18.879753 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:48:20.882095 kubelet[2907]: E0113 23:48:20.880309 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:48:20.882095 kubelet[2907]: E0113 23:48:20.880326 2907 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:48:20.882095 kubelet[2907]: E0113 23:48:20.881246 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:48:28.880502 kubelet[2907]: E0113 23:48:28.880330 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:48:29.881129 kubelet[2907]: E0113 23:48:29.880507 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:48:30.879544 kubelet[2907]: E0113 23:48:30.879487 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:48:33.879925 kubelet[2907]: E0113 23:48:33.879870 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed 
to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:48:33.881696 kubelet[2907]: E0113 23:48:33.881647 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:48:34.882291 kubelet[2907]: E0113 23:48:34.882238 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:48:40.879603 kubelet[2907]: E0113 23:48:40.879547 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:48:41.879684 kubelet[2907]: E0113 23:48:41.879629 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:48:42.880123 kubelet[2907]: E0113 23:48:42.879483 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:48:44.880197 kubelet[2907]: E0113 23:48:44.880153 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:48:47.880937 kubelet[2907]: E0113 23:48:47.880884 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:48:48.880508 containerd[1673]: time="2026-01-13T23:48:48.880270751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:48:49.222749 containerd[1673]: time="2026-01-13T23:48:49.222377263Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:49.225240 containerd[1673]: time="2026-01-13T23:48:49.225198072Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:48:49.225339 containerd[1673]: time="2026-01-13T23:48:49.225277312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:49.225476 kubelet[2907]: E0113 23:48:49.225437 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:49.225730 kubelet[2907]: E0113 23:48:49.225486 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:49.225730 kubelet[2907]: E0113 23:48:49.225609 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghb74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:49.227106 kubelet[2907]: E0113 23:48:49.227041 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:48:53.879949 containerd[1673]: time="2026-01-13T23:48:53.879855960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 13 23:48:54.201301 containerd[1673]: time="2026-01-13T23:48:54.201036969Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:54.203184 containerd[1673]: time="2026-01-13T23:48:54.203097695Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 13 23:48:54.203184 containerd[1673]: time="2026-01-13T23:48:54.203131935Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:54.203952 kubelet[2907]: E0113 23:48:54.203302 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:48:54.203952 kubelet[2907]: E0113 23:48:54.203359 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:48:54.203952 kubelet[2907]: E0113 23:48:54.203574 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7lw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:54.204619 containerd[1673]: time="2026-01-13T23:48:54.204214539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:48:54.205413 kubelet[2907]: E0113 23:48:54.205368 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" 
podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:48:54.545751 containerd[1673]: time="2026-01-13T23:48:54.545665329Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:54.547911 containerd[1673]: time="2026-01-13T23:48:54.547844616Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:48:54.547974 containerd[1673]: time="2026-01-13T23:48:54.547906936Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:54.548281 kubelet[2907]: E0113 23:48:54.548152 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:54.548281 kubelet[2907]: E0113 23:48:54.548250 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:48:54.548697 containerd[1673]: time="2026-01-13T23:48:54.548662458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 13 23:48:54.549077 kubelet[2907]: E0113 23:48:54.548986 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hnlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:54.550338 kubelet[2907]: E0113 23:48:54.550285 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:48:54.880632 containerd[1673]: time="2026-01-13T23:48:54.880479460Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:54.886362 containerd[1673]: time="2026-01-13T23:48:54.886164517Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 13 23:48:54.886362 containerd[1673]: time="2026-01-13T23:48:54.886235077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:54.886516 kubelet[2907]: E0113 23:48:54.886392 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:48:54.886516 kubelet[2907]: E0113 23:48:54.886436 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:48:54.886585 kubelet[2907]: E0113 23:48:54.886527 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b094e94e970498e92a984780514b4e2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" 
Jan 13 23:48:54.888870 containerd[1673]: time="2026-01-13T23:48:54.888826725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 13 23:48:55.219105 containerd[1673]: time="2026-01-13T23:48:55.218982441Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:55.221313 containerd[1673]: time="2026-01-13T23:48:55.221269008Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 13 23:48:55.221383 containerd[1673]: time="2026-01-13T23:48:55.221308888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:55.221606 kubelet[2907]: E0113 23:48:55.221514 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:48:55.221606 kubelet[2907]: E0113 23:48:55.221593 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:48:55.221897 kubelet[2907]: E0113 23:48:55.221718 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:55.223015 kubelet[2907]: E0113 23:48:55.222967 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:48:55.880238 containerd[1673]: time="2026-01-13T23:48:55.880169357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 13 23:48:56.205852 containerd[1673]: time="2026-01-13T23:48:56.205731499Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:48:56.207434 containerd[1673]: time="2026-01-13T23:48:56.207394104Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 13 23:48:56.207714 containerd[1673]: time="2026-01-13T23:48:56.207665025Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 13 23:48:56.207928 kubelet[2907]: E0113 23:48:56.207883 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:48:56.208105 kubelet[2907]: E0113 23:48:56.208084 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:48:56.208305 kubelet[2907]: E0113 23:48:56.208217 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 13 23:48:56.209463 kubelet[2907]: E0113 23:48:56.209423 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" 
podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:49:02.879723 containerd[1673]: time="2026-01-13T23:49:02.879680321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 13 23:49:03.556557 containerd[1673]: time="2026-01-13T23:49:03.556363204Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:49:03.557811 containerd[1673]: time="2026-01-13T23:49:03.557708168Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 13 23:49:03.557811 containerd[1673]: time="2026-01-13T23:49:03.557744408Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 13 23:49:03.558050 kubelet[2907]: E0113 23:49:03.558015 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:49:03.558638 kubelet[2907]: E0113 23:49:03.558234 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:49:03.558638 kubelet[2907]: E0113 23:49:03.558370 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 13 23:49:03.561180 containerd[1673]: time="2026-01-13T23:49:03.560809737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 13 23:49:03.879604 kubelet[2907]: E0113 23:49:03.879487 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:49:03.903723 containerd[1673]: time="2026-01-13T23:49:03.903655812Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:49:03.912365 containerd[1673]: time="2026-01-13T23:49:03.912096397Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 13 23:49:03.912365 containerd[1673]: time="2026-01-13T23:49:03.912213918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 13 23:49:03.912531 kubelet[2907]: E0113 23:49:03.912380 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:49:03.912531 kubelet[2907]: E0113 23:49:03.912492 2907 kuberuntime_image.go:55] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:49:03.912661 kubelet[2907]: E0113 23:49:03.912610 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&
SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 13 23:49:03.913944 kubelet[2907]: E0113 23:49:03.913787 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:49:07.882387 kubelet[2907]: E0113 23:49:07.882143 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 
23:49:08.880968 kubelet[2907]: E0113 23:49:08.880874 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:49:09.881115 kubelet[2907]: E0113 23:49:09.880073 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:49:10.879484 kubelet[2907]: E0113 23:49:10.879424 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not 
found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:49:15.881078 kubelet[2907]: E0113 23:49:15.880932 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:49:18.879681 kubelet[2907]: E0113 23:49:18.879590 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:49:20.880920 kubelet[2907]: E0113 23:49:20.880862 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:49:22.881170 kubelet[2907]: E0113 23:49:22.881127 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:49:22.883264 kubelet[2907]: E0113 23:49:22.883214 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:49:23.880299 kubelet[2907]: E0113 23:49:23.880141 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:49:28.879969 kubelet[2907]: E0113 23:49:28.879898 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:49:31.879546 kubelet[2907]: E0113 23:49:31.879477 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:49:32.880119 kubelet[2907]: E0113 23:49:32.880050 2907 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:49:34.879735 kubelet[2907]: E0113 23:49:34.879681 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:49:34.881645 kubelet[2907]: E0113 23:49:34.880080 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:49:35.881832 kubelet[2907]: E0113 23:49:35.881772 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:49:42.879883 kubelet[2907]: E0113 23:49:42.879814 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:49:45.880058 kubelet[2907]: E0113 23:49:45.879987 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed 
to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:49:46.879249 kubelet[2907]: E0113 23:49:46.879190 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:49:47.879663 kubelet[2907]: E0113 23:49:47.879586 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:49:48.880123 kubelet[2907]: E0113 23:49:48.880043 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:49:48.881212 kubelet[2907]: E0113 
23:49:48.881027 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:49:56.880439 kubelet[2907]: E0113 23:49:56.880349 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:49:59.879994 kubelet[2907]: E0113 23:49:59.879948 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:49:59.880672 kubelet[2907]: E0113 23:49:59.880596 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:50:00.879946 kubelet[2907]: E0113 23:50:00.879873 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:50:01.880071 kubelet[2907]: 
E0113 23:50:01.879844 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:50:03.880576 kubelet[2907]: E0113 23:50:03.880522 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:50:10.879932 kubelet[2907]: E0113 23:50:10.879751 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:50:10.879932 kubelet[2907]: E0113 23:50:10.879838 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:50:12.880096 containerd[1673]: time="2026-01-13T23:50:12.879845786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:50:13.242870 containerd[1673]: time="2026-01-13T23:50:13.242701081Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:13.244318 containerd[1673]: time="2026-01-13T23:50:13.244263646Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:50:13.244380 containerd[1673]: time="2026-01-13T23:50:13.244334686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:13.244571 kubelet[2907]: E0113 
23:50:13.244509 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:50:13.244571 kubelet[2907]: E0113 23:50:13.244567 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:50:13.244883 kubelet[2907]: E0113 23:50:13.244707 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghb74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-r8m4t_calico-apiserver(04948dc7-2d3d-4260-8809-f8eb4aa6cc17): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:13.246183 kubelet[2907]: E0113 23:50:13.246137 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:50:13.880150 kubelet[2907]: E0113 23:50:13.879977 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:50:14.879619 kubelet[2907]: E0113 23:50:14.879556 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:50:15.880367 containerd[1673]: time="2026-01-13T23:50:15.880242681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 13 23:50:16.220196 containerd[1673]: time="2026-01-13T23:50:16.219893626Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:16.221573 containerd[1673]: time="2026-01-13T23:50:16.221435191Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 13 23:50:16.221573 containerd[1673]: time="2026-01-13T23:50:16.221517191Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:16.221752 kubelet[2907]: E0113 23:50:16.221703 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:50:16.221752 kubelet[2907]: E0113 23:50:16.221759 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 13 23:50:16.222219 kubelet[2907]: E0113 23:50:16.221885 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hnlgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-68644f6664-bwfss_calico-apiserver(c47b5326-d31f-4680-9c2d-bdd28d584c69): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:16.223130 kubelet[2907]: E0113 23:50:16.223085 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:50:23.880446 containerd[1673]: time="2026-01-13T23:50:23.880357866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 13 23:50:24.208949 containerd[1673]: time="2026-01-13T23:50:24.208808537Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 
23:50:24.210046 containerd[1673]: time="2026-01-13T23:50:24.209988501Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 13 23:50:24.210231 containerd[1673]: time="2026-01-13T23:50:24.210030221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:24.210295 kubelet[2907]: E0113 23:50:24.210235 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:50:24.210597 kubelet[2907]: E0113 23:50:24.210304 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 13 23:50:24.210597 kubelet[2907]: E0113 23:50:24.210449 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 13 23:50:24.212488 containerd[1673]: time="2026-01-13T23:50:24.212462548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 13 23:50:24.617496 containerd[1673]: time="2026-01-13T23:50:24.617436651Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:24.618965 containerd[1673]: time="2026-01-13T23:50:24.618898415Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 13 23:50:24.619044 containerd[1673]: time="2026-01-13T23:50:24.618941575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:24.619188 kubelet[2907]: E0113 23:50:24.619137 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:50:24.619239 kubelet[2907]: E0113 23:50:24.619188 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 13 23:50:24.619350 kubelet[2907]: E0113 23:50:24.619307 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhnfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p6cc5_calico-system(89f76643-e37a-4094-9d85-ab46009d2c90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:24.620854 kubelet[2907]: E0113 23:50:24.620803 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:50:25.884086 containerd[1673]: time="2026-01-13T23:50:25.882234948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 13 23:50:26.208322 containerd[1673]: time="2026-01-13T23:50:26.207905611Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:26.209802 containerd[1673]: time="2026-01-13T23:50:26.209668736Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 13 23:50:26.209802 containerd[1673]: time="2026-01-13T23:50:26.209723776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:26.210163 kubelet[2907]: E0113 23:50:26.210051 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:50:26.210163 kubelet[2907]: E0113 23:50:26.210122 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 13 23:50:26.210602 kubelet[2907]: E0113 23:50:26.210219 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1b094e94e970498e92a984780514b4e2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:26.212805 containerd[1673]: time="2026-01-13T23:50:26.212433344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 13 23:50:26.545762 containerd[1673]: time="2026-01-13T23:50:26.545667870Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:26.546959 containerd[1673]: time="2026-01-13T23:50:26.546921554Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 13 23:50:26.547057 containerd[1673]: time="2026-01-13T23:50:26.547000794Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:26.547433 kubelet[2907]: E0113 23:50:26.547201 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:50:26.547433 kubelet[2907]: E0113 23:50:26.547265 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 13 23:50:26.547433 kubelet[2907]: E0113 23:50:26.547379 2907 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msc9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5748d45b5-b6b8l_calico-system(181c88a3-5f50-4a12-ac78-292f89f9a583): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:26.548602 kubelet[2907]: E0113 23:50:26.548524 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:50:26.880134 containerd[1673]: time="2026-01-13T23:50:26.879888439Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 13 23:50:26.880323 kubelet[2907]: E0113 23:50:26.880280 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:50:26.880766 kubelet[2907]: E0113 23:50:26.880593 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:50:27.215727 containerd[1673]: time="2026-01-13T23:50:27.215500972Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:27.217178 containerd[1673]: time="2026-01-13T23:50:27.217118737Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 13 23:50:27.217247 containerd[1673]: time="2026-01-13T23:50:27.217211297Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:27.217453 kubelet[2907]: E0113 23:50:27.217404 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:50:27.217659 kubelet[2907]: E0113 23:50:27.217455 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 13 23:50:27.217659 kubelet[2907]: E0113 23:50:27.217585 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7lw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-xghpj_calico-system(7ddfa231-61f2-4ab3-bd34-82f93616c2de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:27.218790 kubelet[2907]: E0113 23:50:27.218757 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:50:27.880150 containerd[1673]: time="2026-01-13T23:50:27.880088697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 13 23:50:28.218232 containerd[1673]: time="2026-01-13T23:50:28.217850517Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 13 23:50:28.220149 containerd[1673]: 
time="2026-01-13T23:50:28.220032683Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 13 23:50:28.220149 containerd[1673]: time="2026-01-13T23:50:28.220083364Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 13 23:50:28.220370 kubelet[2907]: E0113 23:50:28.220333 2907 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:50:28.220742 kubelet[2907]: E0113 23:50:28.220384 2907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 13 23:50:28.220742 kubelet[2907]: E0113 23:50:28.220500 2907 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b4b69d7d6-mtxnp_calico-system(c4d38e9b-73ce-46dd-9acb-61df83d528d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 13 23:50:28.221769 kubelet[2907]: E0113 23:50:28.221725 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:50:37.879717 kubelet[2907]: E0113 23:50:37.879661 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:50:39.879678 kubelet[2907]: E0113 23:50:39.879624 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:50:39.880966 kubelet[2907]: E0113 23:50:39.880890 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:50:40.879449 kubelet[2907]: E0113 23:50:40.879397 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:50:40.879615 kubelet[2907]: E0113 23:50:40.879502 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:50:41.880613 kubelet[2907]: E0113 23:50:41.880554 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:50:50.880015 kubelet[2907]: E0113 
23:50:50.879944 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:50:51.880657 kubelet[2907]: E0113 23:50:51.880163 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:50:51.880657 kubelet[2907]: E0113 23:50:51.880566 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" 
podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:50:52.880389 kubelet[2907]: E0113 23:50:52.880232 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:50:52.883121 kubelet[2907]: E0113 23:50:52.880899 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:50:53.880602 kubelet[2907]: E0113 23:50:53.880548 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:51:02.880024 kubelet[2907]: E0113 23:51:02.879968 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:51:03.880013 kubelet[2907]: E0113 23:51:03.879461 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:51:03.880013 kubelet[2907]: E0113 23:51:03.879784 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:51:05.882487 kubelet[2907]: E0113 23:51:05.882192 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:51:05.882487 kubelet[2907]: E0113 23:51:05.882242 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:51:06.879929 kubelet[2907]: E0113 23:51:06.879825 2907 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:51:14.880592 kubelet[2907]: E0113 23:51:14.880529 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:51:16.879535 kubelet[2907]: E0113 23:51:16.879472 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:51:17.880769 kubelet[2907]: E0113 23:51:17.880684 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:51:19.880128 kubelet[2907]: E0113 23:51:19.880074 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:51:20.879668 kubelet[2907]: E0113 23:51:20.879336 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:51:20.880216 kubelet[2907]: E0113 23:51:20.880183 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:51:25.880442 kubelet[2907]: E0113 23:51:25.880386 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:51:29.880757 kubelet[2907]: E0113 23:51:29.880605 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:51:31.223580 containerd[1673]: time="2026-01-13T23:51:31.223409551Z" level=info msg="container event discarded" container=e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9 type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.223580 containerd[1673]: time="2026-01-13T23:51:31.223534912Z" level=info msg="container event discarded" container=e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9 type=CONTAINER_STARTED_EVENT Jan 13 23:51:31.264462 containerd[1673]: time="2026-01-13T23:51:31.264332155Z" level=info msg="container event discarded" container=de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359 type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.264595 containerd[1673]: time="2026-01-13T23:51:31.264446035Z" level=info msg="container event discarded" container=b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.264595 containerd[1673]: time="2026-01-13T23:51:31.264503675Z" level=info msg="container event discarded" container=b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f type=CONTAINER_STARTED_EVENT Jan 13 23:51:31.275849 containerd[1673]: time="2026-01-13T23:51:31.275779509Z" level=info msg="container event discarded" container=19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.275849 containerd[1673]: time="2026-01-13T23:51:31.275828469Z" level=info msg="container event discarded" container=19701298dd115f99c112ec9009a6fad6311561d556a3d67a6c7d9ea601e80c1a type=CONTAINER_STARTED_EVENT Jan 13 23:51:31.275849 
containerd[1673]: time="2026-01-13T23:51:31.275838549Z" level=info msg="container event discarded" container=204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.311034 containerd[1673]: time="2026-01-13T23:51:31.310975215Z" level=info msg="container event discarded" container=e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e type=CONTAINER_CREATED_EVENT Jan 13 23:51:31.339396 containerd[1673]: time="2026-01-13T23:51:31.339331061Z" level=info msg="container event discarded" container=de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359 type=CONTAINER_STARTED_EVENT Jan 13 23:51:31.373684 containerd[1673]: time="2026-01-13T23:51:31.373587844Z" level=info msg="container event discarded" container=204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e type=CONTAINER_STARTED_EVENT Jan 13 23:51:31.393882 containerd[1673]: time="2026-01-13T23:51:31.393828466Z" level=info msg="container event discarded" container=e9f057f26bbb3c9fdf44d14ded20f4b1482081d3a048869c1ed4c6a060b2248e type=CONTAINER_STARTED_EVENT Jan 13 23:51:32.880336 kubelet[2907]: E0113 23:51:32.880259 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" 
podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:51:34.879699 kubelet[2907]: E0113 23:51:34.879557 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:51:34.879699 kubelet[2907]: E0113 23:51:34.879662 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:51:35.880468 kubelet[2907]: E0113 23:51:35.880422 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:51:36.880613 kubelet[2907]: E0113 23:51:36.880538 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:51:40.879587 kubelet[2907]: E0113 23:51:40.879523 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:51:42.213433 containerd[1673]: time="2026-01-13T23:51:42.213304439Z" level=info msg="container event discarded" container=a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670 type=CONTAINER_CREATED_EVENT Jan 13 23:51:42.213433 containerd[1673]: time="2026-01-13T23:51:42.213374999Z" level=info msg="container event discarded" container=a72018fecee8330c2a4e0a07c88c6a7546b9475ad1513398e58e1596fbaa6670 type=CONTAINER_STARTED_EVENT Jan 13 23:51:42.238035 containerd[1673]: time="2026-01-13T23:51:42.237976874Z" level=info msg="container event discarded" 
container=30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb type=CONTAINER_CREATED_EVENT Jan 13 23:51:42.316311 containerd[1673]: time="2026-01-13T23:51:42.316235510Z" level=info msg="container event discarded" container=a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29 type=CONTAINER_CREATED_EVENT Jan 13 23:51:42.316311 containerd[1673]: time="2026-01-13T23:51:42.316280710Z" level=info msg="container event discarded" container=a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29 type=CONTAINER_STARTED_EVENT Jan 13 23:51:42.337577 containerd[1673]: time="2026-01-13T23:51:42.337533894Z" level=info msg="container event discarded" container=30e7734b5954555ffabb5f25d2124c62182e5b1d2ffe8c4093270ef56ad88ccb type=CONTAINER_STARTED_EVENT Jan 13 23:51:43.881011 kubelet[2907]: E0113 23:51:43.880958 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:51:44.290890 containerd[1673]: time="2026-01-13T23:51:44.290729989Z" level=info msg="container event discarded" container=78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b type=CONTAINER_CREATED_EVENT Jan 13 23:51:44.344390 containerd[1673]: 
time="2026-01-13T23:51:44.344277911Z" level=info msg="container event discarded" container=78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b type=CONTAINER_STARTED_EVENT Jan 13 23:51:46.881651 kubelet[2907]: E0113 23:51:46.881585 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:51:47.881137 kubelet[2907]: E0113 23:51:47.880108 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:51:47.881137 kubelet[2907]: E0113 23:51:47.880461 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:51:47.881137 kubelet[2907]: E0113 23:51:47.880710 2907 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:51:51.498681 systemd[1780]: Created slice background.slice - User Background Tasks Slice. Jan 13 23:51:51.500257 systemd[1780]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 13 23:51:51.522974 systemd[1780]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 13 23:51:51.880201 kubelet[2907]: E0113 23:51:51.880083 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:51:53.730952 update_engine[1652]: I20260113 23:51:53.730840 1652 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 13 23:51:53.730952 update_engine[1652]: I20260113 23:51:53.730939 1652 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 13 23:51:53.731405 update_engine[1652]: I20260113 23:51:53.731366 1652 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 13 23:51:53.731743 update_engine[1652]: I20260113 23:51:53.731695 1652 omaha_request_params.cc:62] Current group set to alpha Jan 13 23:51:53.731821 update_engine[1652]: I20260113 23:51:53.731794 1652 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 13 23:51:53.731821 update_engine[1652]: I20260113 23:51:53.731807 1652 update_attempter.cc:643] Scheduling an action processor start. 
Jan 13 23:51:53.731871 update_engine[1652]: I20260113 23:51:53.731823 1652 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 13 23:51:53.732039 update_engine[1652]: I20260113 23:51:53.732016 1652 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 13 23:51:53.732106 update_engine[1652]: I20260113 23:51:53.732090 1652 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 13 23:51:53.732136 update_engine[1652]: I20260113 23:51:53.732103 1652 omaha_request_action.cc:272] Request: Jan 13 23:51:53.732136 update_engine[1652]: Jan 13 23:51:53.732136 update_engine[1652]: I20260113 23:51:53.732111 1652 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 13 23:51:53.733000 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 13 23:51:53.733808 update_engine[1652]: I20260113 23:51:53.733752 1652 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 13 23:51:53.734480 update_engine[1652]: I20260113 23:51:53.734441 1652 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 13 23:51:53.741691 update_engine[1652]: E20260113 23:51:53.741650 1652 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 13 23:51:53.741756 update_engine[1652]: I20260113 23:51:53.741720 1652 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 13 23:51:55.885104 kubelet[2907]: E0113 23:51:55.883713 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:51:58.739768 containerd[1673]: time="2026-01-13T23:51:58.739576517Z" level=info msg="container event discarded" container=196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89 type=CONTAINER_CREATED_EVENT Jan 13 23:51:58.739768 containerd[1673]: time="2026-01-13T23:51:58.739693238Z" level=info msg="container event discarded" container=196f09a66fec0cc81097e4ed6d9de3bd8a29780b901d4695882d545606d37a89 type=CONTAINER_STARTED_EVENT Jan 13 23:51:58.968607 containerd[1673]: time="2026-01-13T23:51:58.968550048Z" level=info msg="container event discarded" container=5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef type=CONTAINER_CREATED_EVENT Jan 13 23:51:58.968607 containerd[1673]: 
time="2026-01-13T23:51:58.968599048Z" level=info msg="container event discarded" container=5e83c1ef120c84459730c263aed2b9275080aadca4f906ae27e4358a5a8698ef type=CONTAINER_STARTED_EVENT Jan 13 23:51:59.880202 kubelet[2907]: E0113 23:51:59.879771 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:51:59.880649 kubelet[2907]: E0113 23:51:59.880585 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:52:00.880435 kubelet[2907]: E0113 23:52:00.880382 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:52:00.991227 containerd[1673]: time="2026-01-13T23:52:00.991170553Z" level=info msg="container event discarded" container=067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045 type=CONTAINER_CREATED_EVENT Jan 13 23:52:01.075889 containerd[1673]: time="2026-01-13T23:52:01.075820768Z" level=info msg="container event discarded" container=067bad612ea7ea547e7cc05c96d9338549518590897e4bddc37f8a18dd53c045 type=CONTAINER_STARTED_EVENT Jan 13 23:52:02.678368 containerd[1673]: time="2026-01-13T23:52:02.678292404Z" level=info msg="container event discarded" container=eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641 type=CONTAINER_CREATED_EVENT Jan 13 23:52:02.791131 containerd[1673]: time="2026-01-13T23:52:02.791054105Z" level=info msg="container event discarded" container=eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641 type=CONTAINER_STARTED_EVENT Jan 13 23:52:02.879512 kubelet[2907]: E0113 23:52:02.879470 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:52:02.879873 kubelet[2907]: E0113 23:52:02.879584 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:52:03.663130 update_engine[1652]: I20260113 23:52:03.662958 1652 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 13 23:52:03.663130 update_engine[1652]: I20260113 23:52:03.663049 1652 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 13 23:52:03.664185 update_engine[1652]: I20260113 23:52:03.664140 1652 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 13 23:52:03.669276 update_engine[1652]: E20260113 23:52:03.669183 1652 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 13 23:52:03.669276 update_engine[1652]: I20260113 23:52:03.669252 1652 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 13 23:52:06.434388 containerd[1673]: time="2026-01-13T23:52:06.434267140Z" level=info msg="container event discarded" container=eb9db8cb8908dd1e89813dcae03c452e88473cbc40143633dddd81de4f277641 type=CONTAINER_STOPPED_EVENT Jan 13 23:52:08.881695 kubelet[2907]: E0113 23:52:08.881624 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:52:10.420486 containerd[1673]: time="2026-01-13T23:52:10.420417931Z" level=info msg="container event discarded" container=8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263 type=CONTAINER_CREATED_EVENT Jan 13 23:52:10.557828 containerd[1673]: time="2026-01-13T23:52:10.557734905Z" level=info msg="container event discarded" container=8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263 type=CONTAINER_STARTED_EVENT Jan 13 23:52:11.880688 kubelet[2907]: E0113 23:52:11.880643 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:52:11.880688 kubelet[2907]: E0113 23:52:11.880670 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:52:12.880525 kubelet[2907]: E0113 23:52:12.880469 2907 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:52:13.293691 containerd[1673]: time="2026-01-13T23:52:13.293574242Z" level=info msg="container event discarded" container=8aeda852f0b5b0429e0a0214c296b53e0baa9edd64e57178468fa42d743f9263 type=CONTAINER_STOPPED_EVENT Jan 13 23:52:13.658761 update_engine[1652]: I20260113 23:52:13.658095 1652 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 13 23:52:13.658761 update_engine[1652]: I20260113 23:52:13.658172 1652 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 13 23:52:13.658761 update_engine[1652]: I20260113 23:52:13.658605 1652 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 13 23:52:13.664023 update_engine[1652]: E20260113 23:52:13.663991 1652 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 13 23:52:13.664298 update_engine[1652]: I20260113 23:52:13.664247 1652 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 13 23:52:13.879722 kubelet[2907]: E0113 23:52:13.879663 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:52:16.879938 kubelet[2907]: E0113 23:52:16.879856 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:52:20.358135 systemd[1]: cri-containerd-78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b.scope: Deactivated successfully. Jan 13 23:52:20.358495 systemd[1]: cri-containerd-78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b.scope: Consumed 57.782s CPU time, 111.7M memory peak. 
Jan 13 23:52:20.360052 containerd[1673]: time="2026-01-13T23:52:20.360010929Z" level=info msg="received container exit event container_id:\"78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b\" id:\"78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b\" pid:3246 exit_status:1 exited_at:{seconds:1768348340 nanos:359741568}" Jan 13 23:52:20.367000 audit: BPF prog-id=146 op=UNLOAD Jan 13 23:52:20.371040 kernel: kauditd_printk_skb: 230 callbacks suppressed Jan 13 23:52:20.371146 kernel: audit: type=1334 audit(1768348340.367:743): prog-id=146 op=UNLOAD Jan 13 23:52:20.371171 kernel: audit: type=1334 audit(1768348340.368:744): prog-id=150 op=UNLOAD Jan 13 23:52:20.368000 audit: BPF prog-id=150 op=UNLOAD Jan 13 23:52:20.384565 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b-rootfs.mount: Deactivated successfully. Jan 13 23:52:20.617021 kubelet[2907]: E0113 23:52:20.615940 2907 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.248:40838->10.0.21.138:2379: read: connection timed out" Jan 13 23:52:20.619595 systemd[1]: cri-containerd-204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e.scope: Deactivated successfully. Jan 13 23:52:20.620169 systemd[1]: cri-containerd-204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e.scope: Consumed 4.847s CPU time, 26.6M memory peak. 
Jan 13 23:52:20.620000 audit: BPF prog-id=256 op=LOAD Jan 13 23:52:20.620000 audit: BPF prog-id=88 op=UNLOAD Jan 13 23:52:20.622509 containerd[1673]: time="2026-01-13T23:52:20.622455441Z" level=info msg="received container exit event container_id:\"204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e\" id:\"204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e\" pid:2753 exit_status:1 exited_at:{seconds:1768348340 nanos:621175277}" Jan 13 23:52:20.622835 kernel: audit: type=1334 audit(1768348340.620:745): prog-id=256 op=LOAD Jan 13 23:52:20.622889 kernel: audit: type=1334 audit(1768348340.620:746): prog-id=88 op=UNLOAD Jan 13 23:52:20.626000 audit: BPF prog-id=103 op=UNLOAD Jan 13 23:52:20.626000 audit: BPF prog-id=107 op=UNLOAD Jan 13 23:52:20.629261 kernel: audit: type=1334 audit(1768348340.626:747): prog-id=103 op=UNLOAD Jan 13 23:52:20.629318 kernel: audit: type=1334 audit(1768348340.626:748): prog-id=107 op=UNLOAD Jan 13 23:52:20.644423 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e-rootfs.mount: Deactivated successfully. 
Jan 13 23:52:20.664507 kubelet[2907]: I0113 23:52:20.664473 2907 scope.go:117] "RemoveContainer" containerID="204343e9bb0d82b78513a9bf8f011c21521118138249eda873d39096c167157e" Jan 13 23:52:20.666806 containerd[1673]: time="2026-01-13T23:52:20.666747454Z" level=info msg="CreateContainer within sandbox \"b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 13 23:52:20.666961 kubelet[2907]: I0113 23:52:20.666767 2907 scope.go:117] "RemoveContainer" containerID="78fa946e1d53df399388d06863e691e455f89d1860c32672a6e82a674116b45b" Jan 13 23:52:20.668276 containerd[1673]: time="2026-01-13T23:52:20.668250499Z" level=info msg="CreateContainer within sandbox \"a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 13 23:52:20.677088 containerd[1673]: time="2026-01-13T23:52:20.676121763Z" level=info msg="Container fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:52:20.681762 containerd[1673]: time="2026-01-13T23:52:20.681428619Z" level=info msg="Container 6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:52:20.689290 containerd[1673]: time="2026-01-13T23:52:20.689240162Z" level=info msg="CreateContainer within sandbox \"b1e64f5c0125da1848c10bc1ccca13567650c7ac807d5ad262371b1a52be870f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419\"" Jan 13 23:52:20.690162 containerd[1673]: time="2026-01-13T23:52:20.690125805Z" level=info msg="StartContainer for \"fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419\"" Jan 13 23:52:20.691157 containerd[1673]: time="2026-01-13T23:52:20.691121408Z" level=info msg="connecting to shim 
fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419" address="unix:///run/containerd/s/a2662928acb8cf6f7bde3880480561138f7ee67f7cc250e7777a374faa5e1059" protocol=ttrpc version=3 Jan 13 23:52:20.692217 containerd[1673]: time="2026-01-13T23:52:20.692185611Z" level=info msg="CreateContainer within sandbox \"a120225352dfd10b5f27650a66dc3c0b2ffddf30d83ef5e72219796794f1bc29\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca\"" Jan 13 23:52:20.692638 containerd[1673]: time="2026-01-13T23:52:20.692541452Z" level=info msg="StartContainer for \"6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca\"" Jan 13 23:52:20.693395 containerd[1673]: time="2026-01-13T23:52:20.693369095Z" level=info msg="connecting to shim 6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca" address="unix:///run/containerd/s/1107ef2b7f3507e75436b23ad64fc933b92ebe26f3b598af600d9a69877dc0a2" protocol=ttrpc version=3 Jan 13 23:52:20.712252 systemd[1]: Started cri-containerd-fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419.scope - libcontainer container fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419. Jan 13 23:52:20.715913 systemd[1]: Started cri-containerd-6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca.scope - libcontainer container 6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca. 
Jan 13 23:52:20.724000 audit: BPF prog-id=257 op=LOAD Jan 13 23:52:20.725000 audit: BPF prog-id=258 op=LOAD Jan 13 23:52:20.727316 kernel: audit: type=1334 audit(1768348340.724:749): prog-id=257 op=LOAD Jan 13 23:52:20.727425 kernel: audit: type=1334 audit(1768348340.725:750): prog-id=258 op=LOAD Jan 13 23:52:20.727456 kernel: audit: type=1300 audit(1768348340.725:750): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.725000 audit[5444]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.725000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.734598 kernel: audit: type=1327 audit(1768348340.725:750): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.725000 audit: BPF prog-id=258 op=UNLOAD Jan 13 23:52:20.725000 audit[5444]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 
23:52:20.725000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.725000 audit: BPF prog-id=259 op=LOAD Jan 13 23:52:20.725000 audit[5444]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.725000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.726000 audit: BPF prog-id=260 op=LOAD Jan 13 23:52:20.726000 audit[5444]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.729000 audit: BPF prog-id=260 op=UNLOAD Jan 13 23:52:20.729000 audit[5444]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.729000 audit: BPF prog-id=259 op=UNLOAD Jan 13 23:52:20.729000 audit[5444]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.729000 audit: BPF prog-id=261 op=LOAD Jan 13 23:52:20.729000 audit[5444]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=2606 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662346333643339323037306632636137613132343337643232653934 Jan 13 23:52:20.734000 audit: BPF prog-id=262 op=LOAD Jan 13 23:52:20.734000 audit: BPF prog-id=263 op=LOAD Jan 13 23:52:20.734000 audit[5445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=263 op=UNLOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=264 op=LOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=265 op=LOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 
ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=265 op=UNLOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=264 op=UNLOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.735000 audit: BPF prog-id=266 op=LOAD Jan 13 23:52:20.735000 audit[5445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 
a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3037 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:20.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664356539333236613937336263643362393332623032656632653135 Jan 13 23:52:20.755285 containerd[1673]: time="2026-01-13T23:52:20.755169041Z" level=info msg="StartContainer for \"6d5e9326a973bcd3b932b02ef2e155ef11624646b7ffdd22face1c4e15abd7ca\" returns successfully" Jan 13 23:52:20.763089 containerd[1673]: time="2026-01-13T23:52:20.763019865Z" level=info msg="StartContainer for \"fb4c3d392070f2ca7a12437d22e941fb2ace208d8bc086c2e949a5a53a0f3419\" returns successfully" Jan 13 23:52:20.879840 kubelet[2907]: E0113 23:52:20.879695 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5748d45b5-b6b8l" podUID="181c88a3-5f50-4a12-ac78-292f89f9a583" Jan 13 23:52:20.995535 systemd[1]: 
cri-containerd-de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359.scope: Deactivated successfully. Jan 13 23:52:20.995866 systemd[1]: cri-containerd-de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359.scope: Consumed 5.460s CPU time, 58.5M memory peak. Jan 13 23:52:20.995000 audit: BPF prog-id=267 op=LOAD Jan 13 23:52:20.995000 audit: BPF prog-id=83 op=UNLOAD Jan 13 23:52:20.997363 containerd[1673]: time="2026-01-13T23:52:20.997208172Z" level=info msg="received container exit event container_id:\"de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359\" id:\"de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359\" pid:2727 exit_status:1 exited_at:{seconds:1768348340 nanos:996457089}" Jan 13 23:52:21.001000 audit: BPF prog-id=98 op=UNLOAD Jan 13 23:52:21.001000 audit: BPF prog-id=102 op=UNLOAD Jan 13 23:52:21.036706 containerd[1673]: time="2026-01-13T23:52:21.036611731Z" level=info msg="container event discarded" container=f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08 type=CONTAINER_CREATED_EVENT Jan 13 23:52:21.171188 containerd[1673]: time="2026-01-13T23:52:21.170957176Z" level=info msg="container event discarded" container=f39a49d45c293956f9eb1bba12d3556bbb0b187a2f17ac3aa07a4b34ecd27d08 type=CONTAINER_STARTED_EVENT Jan 13 23:52:21.387919 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359-rootfs.mount: Deactivated successfully. 
Jan 13 23:52:21.675048 kubelet[2907]: I0113 23:52:21.674829 2907 scope.go:117] "RemoveContainer" containerID="de1aefd0797f92f0f7fcfc5a65f76d556c448439bfbf3a2f11370bd2ccb24359" Jan 13 23:52:21.676495 containerd[1673]: time="2026-01-13T23:52:21.676460942Z" level=info msg="CreateContainer within sandbox \"e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 13 23:52:21.693236 containerd[1673]: time="2026-01-13T23:52:21.692402270Z" level=info msg="Container 4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d: CDI devices from CRI Config.CDIDevices: []" Jan 13 23:52:21.702673 containerd[1673]: time="2026-01-13T23:52:21.702636101Z" level=info msg="CreateContainer within sandbox \"e41c389c923dd8ad9589533be90dfee7d02ee50707848f2398944cf651246ae9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d\"" Jan 13 23:52:21.703393 containerd[1673]: time="2026-01-13T23:52:21.703368303Z" level=info msg="StartContainer for \"4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d\"" Jan 13 23:52:21.705144 containerd[1673]: time="2026-01-13T23:52:21.705108948Z" level=info msg="connecting to shim 4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d" address="unix:///run/containerd/s/a5d70e2eea7671738ec9ec5607fd26d7a5f9ad5f54c6e3887d77df30ba51f8de" protocol=ttrpc version=3 Jan 13 23:52:21.741293 systemd[1]: Started cri-containerd-4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d.scope - libcontainer container 4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d. 
Jan 13 23:52:21.753000 audit: BPF prog-id=268 op=LOAD Jan 13 23:52:21.753000 audit: BPF prog-id=269 op=LOAD Jan 13 23:52:21.753000 audit[5522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.753000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.753000 audit: BPF prog-id=269 op=UNLOAD Jan 13 23:52:21.753000 audit[5522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.753000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.754000 audit: BPF prog-id=270 op=LOAD Jan 13 23:52:21.754000 audit[5522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.754000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.754000 audit: BPF prog-id=271 op=LOAD Jan 13 23:52:21.754000 audit[5522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.754000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.754000 audit: BPF prog-id=271 op=UNLOAD Jan 13 23:52:21.754000 audit[5522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.754000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.754000 audit: BPF prog-id=270 op=UNLOAD Jan 13 23:52:21.754000 audit[5522]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
13 23:52:21.754000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.754000 audit: BPF prog-id=272 op=LOAD Jan 13 23:52:21.754000 audit[5522]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2581 pid=5522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 13 23:52:21.754000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464646231633732643730366263353665326566343862303930336134 Jan 13 23:52:21.781118 containerd[1673]: time="2026-01-13T23:52:21.781038137Z" level=info msg="StartContainer for \"4ddb1c72d706bc56e2ef48b0903a442ba0f0e3db7438b9fed3997b1a9376c36d\" returns successfully" Jan 13 23:52:22.342501 kubelet[2907]: E0113 23:52:22.342345 2907 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.248:40638->10.0.21.138:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-68644f6664-bwfss.188a6f2e7638edb8 calico-apiserver 1859 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-68644f6664-bwfss,UID:c47b5326-d31f-4680-9c2d-bdd28d584c69,APIVersion:v1,ResourceVersion:836,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-660efdb355,},FirstTimestamp:2026-01-13 23:47:29 +0000 UTC,LastTimestamp:2026-01-13 23:52:11.880589457 +0000 UTC m=+336.140905024,Count:20,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-660efdb355,}" Jan 13 23:52:22.676526 containerd[1673]: time="2026-01-13T23:52:22.676373800Z" level=info msg="container event discarded" container=64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269 type=CONTAINER_CREATED_EVENT Jan 13 23:52:22.676526 containerd[1673]: time="2026-01-13T23:52:22.676408520Z" level=info msg="container event discarded" container=64a42bda8d9e32652ba89e491b1d106726ff3fd0943d04161a3ba768f225c269 type=CONTAINER_STARTED_EVENT Jan 13 23:52:23.654812 update_engine[1652]: I20260113 23:52:23.654702 1652 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 13 23:52:23.654812 update_engine[1652]: I20260113 23:52:23.654794 1652 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 13 23:52:23.655513 update_engine[1652]: I20260113 23:52:23.655148 1652 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 13 23:52:23.661510 update_engine[1652]: E20260113 23:52:23.661461 1652 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 13 23:52:23.661632 update_engine[1652]: I20260113 23:52:23.661577 1652 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 13 23:52:23.661632 update_engine[1652]: I20260113 23:52:23.661604 1652 omaha_request_action.cc:617] Omaha request response: Jan 13 23:52:23.661815 update_engine[1652]: E20260113 23:52:23.661756 1652 omaha_request_action.cc:636] Omaha request network transfer failed. 
Jan 13 23:52:23.661815 update_engine[1652]: I20260113 23:52:23.661803 1652 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 13 23:52:23.661893 update_engine[1652]: I20260113 23:52:23.661820 1652 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 13 23:52:23.661893 update_engine[1652]: I20260113 23:52:23.661826 1652 update_attempter.cc:306] Processing Done. Jan 13 23:52:23.661893 update_engine[1652]: E20260113 23:52:23.661841 1652 update_attempter.cc:619] Update failed. Jan 13 23:52:23.661893 update_engine[1652]: I20260113 23:52:23.661845 1652 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 13 23:52:23.661893 update_engine[1652]: I20260113 23:52:23.661850 1652 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 13 23:52:23.661893 update_engine[1652]: I20260113 23:52:23.661854 1652 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jan 13 23:52:23.662008 update_engine[1652]: I20260113 23:52:23.661919 1652 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 13 23:52:23.662008 update_engine[1652]: I20260113 23:52:23.661938 1652 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 13 23:52:23.662008 update_engine[1652]: I20260113 23:52:23.661943 1652 omaha_request_action.cc:272] Request: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: Jan 13 23:52:23.662008 update_engine[1652]: I20260113 23:52:23.661949 1652 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 13 23:52:23.662008 update_engine[1652]: I20260113 23:52:23.661965 1652 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 13 23:52:23.662448 update_engine[1652]: I20260113 23:52:23.662397 1652 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 13 23:52:23.662583 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 13 23:52:23.668432 update_engine[1652]: E20260113 23:52:23.668358 1652 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 13 23:52:23.668541 update_engine[1652]: I20260113 23:52:23.668504 1652 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 13 23:52:23.668581 update_engine[1652]: I20260113 23:52:23.668540 1652 omaha_request_action.cc:617] Omaha request response: Jan 13 23:52:23.668581 update_engine[1652]: I20260113 23:52:23.668565 1652 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 13 23:52:23.668633 update_engine[1652]: I20260113 23:52:23.668581 1652 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 13 23:52:23.668633 update_engine[1652]: I20260113 23:52:23.668594 1652 update_attempter.cc:306] Processing Done. Jan 13 23:52:23.668675 update_engine[1652]: I20260113 23:52:23.668610 1652 update_attempter.cc:310] Error event sent. 
Jan 13 23:52:23.668694 update_engine[1652]: I20260113 23:52:23.668660 1652 update_check_scheduler.cc:74] Next update check in 44m41s Jan 13 23:52:23.669024 locksmithd[1702]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 13 23:52:23.879997 kubelet[2907]: E0113 23:52:23.879945 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-xghpj" podUID="7ddfa231-61f2-4ab3-bd34-82f93616c2de" Jan 13 23:52:23.879997 kubelet[2907]: E0113 23:52:23.879949 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-bwfss" podUID="c47b5326-d31f-4680-9c2d-bdd28d584c69" Jan 13 23:52:24.880010 kubelet[2907]: E0113 23:52:24.879955 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-68644f6664-r8m4t" 
podUID="04948dc7-2d3d-4260-8809-f8eb4aa6cc17" Jan 13 23:52:25.880471 kubelet[2907]: E0113 23:52:25.880405 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p6cc5" podUID="89f76643-e37a-4094-9d85-ab46009d2c90" Jan 13 23:52:26.201225 containerd[1673]: time="2026-01-13T23:52:26.201071237Z" level=info msg="container event discarded" container=4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.201225 containerd[1673]: time="2026-01-13T23:52:26.201123397Z" level=info msg="container event discarded" container=4fbc735946b040e5a4864c90b2c4e0c40978c1254862750bd124793ed2dfb00c type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.240429 containerd[1673]: time="2026-01-13T23:52:26.240326036Z" level=info msg="container event discarded" container=2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95 type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.328058 containerd[1673]: time="2026-01-13T23:52:26.327956460Z" level=info msg="container event discarded" container=2f77c95246d9d45d67745e5a2a4fb5501a47d218b3306d1df2adc6e9afd61e95 type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.344372 containerd[1673]: time="2026-01-13T23:52:26.344295229Z" level=info 
msg="container event discarded" container=85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753 type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.344372 containerd[1673]: time="2026-01-13T23:52:26.344329070Z" level=info msg="container event discarded" container=85e6e0d3a0b77f283819d4e1e2f60a788194e38c818c785b67274348cddea753 type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.367606 containerd[1673]: time="2026-01-13T23:52:26.367524300Z" level=info msg="container event discarded" container=d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888 type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.470909 containerd[1673]: time="2026-01-13T23:52:26.470776611Z" level=info msg="container event discarded" container=15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47 type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.470909 containerd[1673]: time="2026-01-13T23:52:26.470833291Z" level=info msg="container event discarded" container=15eae242f5a5746d3fe8160042a64a7084bc5b7e14dd178572ec1b421cb01e47 type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.470909 containerd[1673]: time="2026-01-13T23:52:26.470843611Z" level=info msg="container event discarded" container=d09fae2ed26fbc298d19f2973c3fff4fe4d8dd3af9d00c6da94297b3a0cf9888 type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.579325 containerd[1673]: time="2026-01-13T23:52:26.579231379Z" level=info msg="container event discarded" container=f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.579325 containerd[1673]: time="2026-01-13T23:52:26.579288739Z" level=info msg="container event discarded" container=f625db61969007740c13cc617357bbdb7ffd131f73617b94333b3bbdb99fbbce type=CONTAINER_STARTED_EVENT Jan 13 23:52:26.645605 containerd[1673]: time="2026-01-13T23:52:26.645537339Z" level=info msg="container event discarded" container=c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003 type=CONTAINER_CREATED_EVENT Jan 13 23:52:26.645605 containerd[1673]: 
time="2026-01-13T23:52:26.645585419Z" level=info msg="container event discarded" container=c14f624b62a9b3885edc33beefbbaa1719ea02256005492a445a53da4de78003 type=CONTAINER_STARTED_EVENT Jan 13 23:52:27.375107 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec Jan 13 23:52:27.879920 kubelet[2907]: E0113 23:52:27.879851 2907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b4b69d7d6-mtxnp" podUID="c4d38e9b-73ce-46dd-9acb-61df83d528d1" Jan 13 23:52:28.146487 containerd[1673]: time="2026-01-13T23:52:28.146340228Z" level=info msg="container event discarded" container=9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac type=CONTAINER_CREATED_EVENT Jan 13 23:52:28.146487 containerd[1673]: time="2026-01-13T23:52:28.146388188Z" level=info msg="container event discarded" container=9c8c6ebd88ee9e1131ccbf06d30dd12e50e789b54407519260b5da2a3e336eac type=CONTAINER_STARTED_EVENT Jan 13 23:52:28.237242 containerd[1673]: time="2026-01-13T23:52:28.237120382Z" level=info msg="container event discarded" container=243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151 type=CONTAINER_CREATED_EVENT Jan 13 23:52:28.237242 containerd[1673]: time="2026-01-13T23:52:28.237187262Z" level=info msg="container event discarded" container=243613371baa43fb5ec146c93be5738d1da30eedd5e2d6c43949427defd2c151 type=CONTAINER_STARTED_EVENT