Jan 23 23:30:43.487745 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 23 23:30:43.487767 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Fri Jan 23 21:37:22 -00 2026
Jan 23 23:30:43.487777 kernel: KASLR enabled
Jan 23 23:30:43.487783 kernel: efi: EFI v2.7 by EDK II
Jan 23 23:30:43.487789 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Jan 23 23:30:43.487796 kernel: random: crng init done
Jan 23 23:30:43.487803 kernel: secureboot: Secure boot disabled
Jan 23 23:30:43.487809 kernel: ACPI: Early table checksum verification disabled
Jan 23 23:30:43.487815 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 23 23:30:43.487823 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 23 23:30:43.487829 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487835 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487842 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487848 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487858 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487864 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487871 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487878 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487884 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487891 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 23:30:43.487897 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 23 23:30:43.487904 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 23 23:30:43.487910 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 23 23:30:43.487918 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 23 23:30:43.487924 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 23 23:30:43.487931 kernel: Zone ranges:
Jan 23 23:30:43.487937 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 23 23:30:43.487943 kernel: DMA32 empty
Jan 23 23:30:43.487950 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 23 23:30:43.488028 kernel: Device empty
Jan 23 23:30:43.488035 kernel: Movable zone start for each node
Jan 23 23:30:43.488041 kernel: Early memory node ranges
Jan 23 23:30:43.488048 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 23 23:30:43.488055 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 23 23:30:43.488061 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 23 23:30:43.488070 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 23 23:30:43.488076 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 23 23:30:43.488083 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 23 23:30:43.488090 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 23 23:30:43.488096 kernel: psci: probing for conduit method from ACPI.
Jan 23 23:30:43.488106 kernel: psci: PSCIv1.3 detected in firmware.
Jan 23 23:30:43.488114 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 23 23:30:43.488121 kernel: psci: Trusted OS migration not required
Jan 23 23:30:43.488128 kernel: psci: SMC Calling Convention v1.1
Jan 23 23:30:43.488135 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 23 23:30:43.488142 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 23 23:30:43.488149 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 23 23:30:43.488156 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 23 23:30:43.488162 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 23 23:30:43.488171 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 23 23:30:43.488178 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 23 23:30:43.488185 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 23 23:30:43.488192 kernel: Detected PIPT I-cache on CPU0
Jan 23 23:30:43.488199 kernel: CPU features: detected: GIC system register CPU interface
Jan 23 23:30:43.488206 kernel: CPU features: detected: Spectre-v4
Jan 23 23:30:43.488213 kernel: CPU features: detected: Spectre-BHB
Jan 23 23:30:43.488219 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 23 23:30:43.488227 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 23 23:30:43.488234 kernel: CPU features: detected: ARM erratum 1418040
Jan 23 23:30:43.488240 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 23 23:30:43.488249 kernel: alternatives: applying boot alternatives
Jan 23 23:30:43.488257 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=0b7aa2947ffddc152dd47eebbcf7a95dcd57c97b69958c2bfdf6c1781ecaf3c1
Jan 23 23:30:43.488264 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 23 23:30:43.488271 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 23:30:43.488278 kernel: Fallback order for Node 0: 0
Jan 23 23:30:43.488285 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 23 23:30:43.488292 kernel: Policy zone: Normal
Jan 23 23:30:43.488299 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 23:30:43.488306 kernel: software IO TLB: area num 4.
Jan 23 23:30:43.488313 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 23 23:30:43.488333 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 23 23:30:43.488342 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 23:30:43.488349 kernel: rcu: RCU event tracing is enabled.
Jan 23 23:30:43.488357 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 23 23:30:43.488364 kernel: Trampoline variant of Tasks RCU enabled.
Jan 23 23:30:43.488371 kernel: Tracing variant of Tasks RCU enabled.
Jan 23 23:30:43.488378 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 23:30:43.488385 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 23 23:30:43.488392 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 23:30:43.488399 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 23:30:43.488406 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 23 23:30:43.488426 kernel: GICv3: 256 SPIs implemented
Jan 23 23:30:43.488434 kernel: GICv3: 0 Extended SPIs implemented
Jan 23 23:30:43.488441 kernel: Root IRQ handler: gic_handle_irq
Jan 23 23:30:43.488448 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 23 23:30:43.488455 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 23 23:30:43.488462 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 23 23:30:43.488469 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 23 23:30:43.488476 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 23 23:30:43.488503 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 23 23:30:43.488511 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 23 23:30:43.488518 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 23 23:30:43.488525 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 23:30:43.488534 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 23:30:43.488541 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 23 23:30:43.488548 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 23 23:30:43.488555 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 23 23:30:43.488562 kernel: arm-pv: using stolen time PV
Jan 23 23:30:43.488570 kernel: Console: colour dummy device 80x25
Jan 23 23:30:43.488577 kernel: ACPI: Core revision 20240827
Jan 23 23:30:43.488585 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 23 23:30:43.488594 kernel: pid_max: default: 32768 minimum: 301
Jan 23 23:30:43.488602 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 23 23:30:43.488609 kernel: landlock: Up and running.
Jan 23 23:30:43.488616 kernel: SELinux: Initializing.
Jan 23 23:30:43.488624 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 23:30:43.488631 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 23:30:43.488639 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 23:30:43.488646 kernel: rcu: Max phase no-delay instances is 400.
Jan 23 23:30:43.488655 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 23 23:30:43.488663 kernel: Remapping and enabling EFI services.
Jan 23 23:30:43.488670 kernel: smp: Bringing up secondary CPUs ...
Jan 23 23:30:43.488678 kernel: Detected PIPT I-cache on CPU1
Jan 23 23:30:43.488685 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 23 23:30:43.488693 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 23 23:30:43.488700 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 23:30:43.488709 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 23 23:30:43.488716 kernel: Detected PIPT I-cache on CPU2
Jan 23 23:30:43.488729 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 23 23:30:43.488738 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 23 23:30:43.488745 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 23:30:43.488753 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 23 23:30:43.488760 kernel: Detected PIPT I-cache on CPU3
Jan 23 23:30:43.488768 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 23 23:30:43.488777 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 23 23:30:43.488785 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 23:30:43.488792 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 23 23:30:43.488800 kernel: smp: Brought up 1 node, 4 CPUs
Jan 23 23:30:43.488808 kernel: SMP: Total of 4 processors activated.
Jan 23 23:30:43.488815 kernel: CPU: All CPU(s) started at EL1
Jan 23 23:30:43.488824 kernel: CPU features: detected: 32-bit EL0 Support
Jan 23 23:30:43.488833 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 23 23:30:43.488841 kernel: CPU features: detected: Common not Private translations
Jan 23 23:30:43.488848 kernel: CPU features: detected: CRC32 instructions
Jan 23 23:30:43.488856 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 23 23:30:43.488864 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 23 23:30:43.488871 kernel: CPU features: detected: LSE atomic instructions
Jan 23 23:30:43.488881 kernel: CPU features: detected: Privileged Access Never
Jan 23 23:30:43.488888 kernel: CPU features: detected: RAS Extension Support
Jan 23 23:30:43.488896 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 23 23:30:43.488904 kernel: alternatives: applying system-wide alternatives
Jan 23 23:30:43.488911 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 23 23:30:43.488920 kernel: Memory: 16324368K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430064K reserved, 16384K cma-reserved)
Jan 23 23:30:43.488928 kernel: devtmpfs: initialized
Jan 23 23:30:43.488937 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 23:30:43.488945 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 23 23:30:43.488952 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 23 23:30:43.488968 kernel: 0 pages in range for non-PLT usage
Jan 23 23:30:43.488976 kernel: 515152 pages in range for PLT usage
Jan 23 23:30:43.488984 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 23:30:43.488992 kernel: SMBIOS 3.0.0 present.
Jan 23 23:30:43.489000 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 23 23:30:43.489009 kernel: DMI: Memory slots populated: 1/1
Jan 23 23:30:43.489017 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 23:30:43.489025 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 23 23:30:43.489032 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 23:30:43.489040 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 23:30:43.489048 kernel: audit: initializing netlink subsys (disabled)
Jan 23 23:30:43.489056 kernel: audit: type=2000 audit(0.037:1): state=initialized audit_enabled=0 res=1
Jan 23 23:30:43.489065 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 23:30:43.489073 kernel: cpuidle: using governor menu
Jan 23 23:30:43.489080 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 23 23:30:43.489088 kernel: ASID allocator initialised with 32768 entries
Jan 23 23:30:43.489096 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 23:30:43.489103 kernel: Serial: AMBA PL011 UART driver
Jan 23 23:30:43.489111 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 23:30:43.489120 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 23:30:43.489128 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 23 23:30:43.489135 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 23 23:30:43.489143 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 23:30:43.489151 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 23:30:43.489159 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 23 23:30:43.489166 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 23 23:30:43.489175 kernel: ACPI: Added _OSI(Module Device)
Jan 23 23:30:43.489183 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 23:30:43.489190 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 23:30:43.489198 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 23:30:43.489206 kernel: ACPI: Interpreter enabled
Jan 23 23:30:43.489213 kernel: ACPI: Using GIC for interrupt routing
Jan 23 23:30:43.489221 kernel: ACPI: MCFG table detected, 1 entries
Jan 23 23:30:43.489229 kernel: ACPI: CPU0 has been hot-added
Jan 23 23:30:43.489238 kernel: ACPI: CPU1 has been hot-added
Jan 23 23:30:43.489245 kernel: ACPI: CPU2 has been hot-added
Jan 23 23:30:43.489253 kernel: ACPI: CPU3 has been hot-added
Jan 23 23:30:43.489261 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 23 23:30:43.489268 kernel: printk: legacy console [ttyAMA0] enabled
Jan 23 23:30:43.489276 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 23:30:43.489462 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 23 23:30:43.489557 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 23 23:30:43.489640 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 23 23:30:43.489767 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 23 23:30:43.489852 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 23 23:30:43.489862 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 23 23:30:43.489870 kernel: PCI host bridge to bus 0000:00
Jan 23 23:30:43.489984 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 23 23:30:43.490064 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 23 23:30:43.490135 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 23 23:30:43.490207 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 23:30:43.490303 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 23 23:30:43.490397 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.490483 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 23 23:30:43.490563 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 23 23:30:43.490642 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 23 23:30:43.490719 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 23 23:30:43.490806 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.490887 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 23 23:30:43.490983 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 23 23:30:43.491071 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 23 23:30:43.491163 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.491244 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 23 23:30:43.491326 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 23 23:30:43.491404 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 23 23:30:43.491503 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 23 23:30:43.491621 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.491703 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 23 23:30:43.491782 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 23 23:30:43.491864 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 23 23:30:43.491950 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.492050 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 23 23:30:43.492147 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 23 23:30:43.492234 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 23 23:30:43.492312 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 23 23:30:43.492402 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.492498 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 23 23:30:43.492577 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 23 23:30:43.492657 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 23 23:30:43.492737 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 23 23:30:43.492824 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.492910 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 23 23:30:43.493006 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 23 23:30:43.493094 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.493174 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 23 23:30:43.493253 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 23 23:30:43.493337 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.493421 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 23 23:30:43.493500 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 23 23:30:43.493591 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.493672 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 23 23:30:43.493756 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 23 23:30:43.493867 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.493948 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 23 23:30:43.494045 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 23 23:30:43.494134 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.494213 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 23 23:30:43.494294 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 23 23:30:43.494379 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.494460 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 23 23:30:43.494538 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 23 23:30:43.494623 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.494705 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 23 23:30:43.494783 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 23 23:30:43.494876 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.494967 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 23 23:30:43.495055 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 23 23:30:43.495144 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.495229 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 23 23:30:43.495309 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 23 23:30:43.495397 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.495477 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 23 23:30:43.495554 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 23 23:30:43.495639 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.495720 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 23 23:30:43.495799 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 23 23:30:43.495877 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 23 23:30:43.495964 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 23 23:30:43.496076 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.496159 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 23 23:30:43.496241 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 23 23:30:43.496321 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 23 23:30:43.496399 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 23 23:30:43.496513 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.496595 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 23 23:30:43.496675 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 23 23:30:43.496757 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 23 23:30:43.496835 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 23 23:30:43.496920 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.497029 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 23 23:30:43.497113 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 23 23:30:43.497194 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 23 23:30:43.497280 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 23 23:30:43.497369 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.497452 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 23 23:30:43.497532 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 23 23:30:43.497611 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 23 23:30:43.497690 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 23 23:30:43.497775 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.497855 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 23 23:30:43.497933 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 23 23:30:43.498025 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 23 23:30:43.498103 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 23 23:30:43.498190 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.498270 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 23 23:30:43.498348 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 23 23:30:43.498429 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 23 23:30:43.498509 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 23 23:30:43.498613 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.498694 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 23 23:30:43.498777 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 23 23:30:43.498857 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 23 23:30:43.498936 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 23 23:30:43.499041 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.499123 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 23 23:30:43.499204 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 23 23:30:43.499282 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 23 23:30:43.499360 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 23 23:30:43.499445 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.499525 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 23 23:30:43.499605 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 23 23:30:43.499689 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 23 23:30:43.499766 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 23 23:30:43.499851 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.499932 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 23 23:30:43.500024 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 23 23:30:43.500106 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 23 23:30:43.500185 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 23 23:30:43.500280 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.500359 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 23 23:30:43.500452 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 23 23:30:43.500538 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 23 23:30:43.500617 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 23 23:30:43.500701 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.500780 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 23 23:30:43.500879 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 23 23:30:43.500971 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 23 23:30:43.501064 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 23 23:30:43.501155 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.501234 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 23 23:30:43.501312 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 23 23:30:43.501390 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 23 23:30:43.501467 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 23 23:30:43.501554 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.501633 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 23 23:30:43.501711 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 23 23:30:43.501789 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 23 23:30:43.501873 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 23 23:30:43.501977 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 23:30:43.502075 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 23 23:30:43.502155 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 23 23:30:43.502233 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 23 23:30:43.502311 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 23 23:30:43.502400 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 23:30:43.502484 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 23 23:30:43.502584 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 23 23:30:43.502664 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 23 23:30:43.502754 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 23 23:30:43.502835 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 23 23:30:43.502921 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 23 23:30:43.503022 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 23 23:30:43.503109 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 23 23:30:43.503204 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 23:30:43.503328 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 23 23:30:43.503424 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 23:30:43.503515 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 23 23:30:43.503597 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 23 23:30:43.503687 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 23 23:30:43.503770 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 23 23:30:43.503851 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 23 23:30:43.503933 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 23 23:30:43.504038 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 23 23:30:43.504122 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 23 23:30:43.504203 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 23 23:30:43.504283 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 23 23:30:43.504363 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 23 23:30:43.504464 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 23 23:30:43.504549 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 23 23:30:43.504628 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 23 23:30:43.504711 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 23 23:30:43.504794 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 23 23:30:43.504872 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 23 23:30:43.504964 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 23 23:30:43.505054 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 23 23:30:43.505133 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 23 23:30:43.505215 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 23 23:30:43.505296 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 23 23:30:43.505375 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 23 23:30:43.505456 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 23 23:30:43.505535 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 23 23:30:43.505616 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 23 23:30:43.505699 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 23 23:30:43.505780 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 23 23:30:43.505858 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 23 23:30:43.505940 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 23 23:30:43.506035 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 23 23:30:43.506115 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 23 23:30:43.506201 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 23 23:30:43.506280 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 23 23:30:43.506359 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 23 23:30:43.506440 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 23 23:30:43.506520 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 23 23:30:43.506597 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 23 23:30:43.506680 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 23 23:30:43.506765 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 23 23:30:43.506844 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 23 23:30:43.506931 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 23 23:30:43.507021 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 23 23:30:43.507100 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 23
23:30:43.507187 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 23 23:30:43.507265 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 23 23:30:43.507344 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 23 23:30:43.507425 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 23 23:30:43.507506 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 23 23:30:43.507587 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 23 23:30:43.507668 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 23 23:30:43.507747 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 23 23:30:43.507825 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 23 23:30:43.507906 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 23 23:30:43.508000 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 23 23:30:43.508084 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 23 23:30:43.508167 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 23 23:30:43.508247 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 23 23:30:43.508326 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 23 23:30:43.508407 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 23 23:30:43.508510 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 23 23:30:43.508591 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 23 23:30:43.508675 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 23 23:30:43.508754 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 23 23:30:43.508832 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 23 23:30:43.508912 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 23 23:30:43.509015 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 23 23:30:43.509100 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 23 23:30:43.509183 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 23 23:30:43.509263 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 23 23:30:43.509341 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 23 23:30:43.509425 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 23 23:30:43.509505 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 23 23:30:43.509584 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 23 23:30:43.509666 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 23 23:30:43.509745 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 23 23:30:43.509824 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 23 23:30:43.509906 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 23 23:30:43.510001 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 23 23:30:43.510083 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 23 23:30:43.510165 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 23 23:30:43.510244 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 23 23:30:43.510322 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 23 23:30:43.510404 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 23 23:30:43.510483 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 23 23:30:43.510561 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 23 23:30:43.510642 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 23 23:30:43.510721 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 23 23:30:43.510801 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 23 23:30:43.510881 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 23 23:30:43.510972 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 23 23:30:43.511053 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 23 23:30:43.511176 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 23 23:30:43.511257 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 23 23:30:43.511340 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 23 23:30:43.511423 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 23 23:30:43.511503 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 23 23:30:43.511581 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 23 23:30:43.511664 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 23 23:30:43.511743 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 23 23:30:43.512759 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 23 23:30:43.512853 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 23 23:30:43.512933 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 23 23:30:43.513033 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 23 23:30:43.513116 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 23 23:30:43.513198 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 23 23:30:43.513279 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 23 23:30:43.513358 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 23 23:30:43.513439 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 23 23:30:43.513521 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 23 23:30:43.513602 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 23 23:30:43.513682 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 23 23:30:43.513766 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 23 23:30:43.513845 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 23 23:30:43.513925 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 23 23:30:43.514021 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 23 23:30:43.514106 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 23 23:30:43.514190 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 23 23:30:43.514275 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 23 23:30:43.514354 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 23 23:30:43.514434 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 23 23:30:43.514516 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 23 23:30:43.514598 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 23 23:30:43.514680 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 23 23:30:43.514766 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 23 23:30:43.514846 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 23 23:30:43.514928 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 23 23:30:43.515035 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 23 23:30:43.515121 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 23 23:30:43.515203 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 23 23:30:43.515285 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 23 23:30:43.515404 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 23 23:30:43.515490 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 23 23:30:43.515570 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 23 23:30:43.515652 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 23 23:30:43.515733 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 23 23:30:43.515814 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 23 23:30:43.515897 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 23 23:30:43.515993 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 23 23:30:43.516077 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 23 23:30:43.516162 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 23 23:30:43.516244 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 23 23:30:43.516328 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 23 23:30:43.516422 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 23 23:30:43.516556 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 23 23:30:43.516643 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 23 23:30:43.516728 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 23 23:30:43.516808 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 23 23:30:43.516888 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 23 23:30:43.516982 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 23 23:30:43.517069 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 23 23:30:43.517151 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 23 23:30:43.517231 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 23 23:30:43.517311 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 23 23:30:43.517392 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 23 23:30:43.517472 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 23 23:30:43.517555 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 23 23:30:43.517643 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 23 23:30:43.517729 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 23 23:30:43.517814 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 23 23:30:43.517896 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 23 23:30:43.517991 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 23 23:30:43.518078 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 23 23:30:43.518161 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 23 23:30:43.518246 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 23 23:30:43.518348 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 23 23:30:43.518452 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 23 23:30:43.518534 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 23 23:30:43.518618 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 23 23:30:43.518699 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 23 23:30:43.518779 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 23 23:30:43.518858 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 23:30:43.518938 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 23 23:30:43.519037 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 23 23:30:43.519124 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 23 23:30:43.519218 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 23 23:30:43.519301 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 23 23:30:43.519383 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 23 23:30:43.519464 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 23 23:30:43.519542 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 23:30:43.519627 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 23 23:30:43.519707 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 23 23:30:43.519788 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 23 23:30:43.519867 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 23 23:30:43.519948 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 23 23:30:43.520048 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 23 23:30:43.520131 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 23 23:30:43.520210 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 23:30:43.520294 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 23 23:30:43.520374 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 23 23:30:43.520472 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 23 23:30:43.520554 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 23 23:30:43.520635 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 23 23:30:43.520718 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 23 23:30:43.520805 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 23 23:30:43.520885 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 23:30:43.520982 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 23 23:30:43.521065 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 23 23:30:43.521146 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 23 23:30:43.521229 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 23 23:30:43.521311 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 23 23:30:43.521391 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 
23:30:43.521469 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.521551 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 23 23:30:43.521639 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.521719 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.521800 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 23 23:30:43.521880 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.521966 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.522051 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 23 23:30:43.522130 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.522212 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.522294 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 23 23:30:43.522373 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.522451 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.522530 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 23 23:30:43.522609 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.522687 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.522771 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 23 23:30:43.522849 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.522926 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.523042 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 23 23:30:43.523124 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.523203 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.523287 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 23 23:30:43.523367 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.523446 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.523527 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 23 23:30:43.523606 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.523686 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.523768 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 23 23:30:43.523846 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.523925 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.524020 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 23 23:30:43.524101 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.524182 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.524265 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 23 23:30:43.524351 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.524449 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.524553 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 23 23:30:43.524635 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.524716 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.524796 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 23 23:30:43.524878 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.524970 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.525056 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 23 23:30:43.525136 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.525215 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.525297 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 23 23:30:43.525375 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.525466 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.525553 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 23 23:30:43.525636 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.525761 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.525848 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 23:30:43.525929 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 23 23:30:43.526026 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 23 23:30:43.526125 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 23 23:30:43.526205 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 23:30:43.526287 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 23 23:30:43.526366 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 23 23:30:43.526445 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 23 23:30:43.526525 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 23:30:43.526607 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 23 23:30:43.526686 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 23 23:30:43.526766 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 23 23:30:43.526846 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 23:30:43.526929 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 23 23:30:43.527023 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 23 23:30:43.527106 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.527188 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.527267 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.527347 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.527427 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.527506 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.527586 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.527701 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.527788 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.527867 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.529908 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.530018 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530101 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.530181 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530262 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 23 23:30:43.530346 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530428 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.530509 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530590 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.530670 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530751 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.530832 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.530912 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531006 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531090 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531169 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531251 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531335 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531416 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531497 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531579 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531659 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531743 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.531824 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.531906 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 23 23:30:43.532017 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 23 23:30:43.532112 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 23 23:30:43.532199 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 23 23:30:43.532281 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 23 23:30:43.532361 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 23 23:30:43.532460 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 23 23:30:43.532548 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 23:30:43.532638 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 23 23:30:43.532720 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 23 23:30:43.532825 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 23 23:30:43.532910 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 23:30:43.533048 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 23 23:30:43.533135 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 23 23:30:43.533215 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 23 23:30:43.533296 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 23 23:30:43.533378 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 23:30:43.533462 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 23 23:30:43.533541 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 23 23:30:43.533620 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 23 23:30:43.533699 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 23:30:43.533787 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 23 23:30:43.533875 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 23 23:30:43.533970 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 23 23:30:43.534070 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 23 23:30:43.534153 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 23:30:43.534241 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 23 23:30:43.534324 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 23 23:30:43.534409 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 23 23:30:43.534492 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 23 23:30:43.534573 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 23:30:43.534656 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 23 23:30:43.534741 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 23 23:30:43.534823 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 23:30:43.534924 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 23 23:30:43.535025 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 23 23:30:43.535106 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 23:30:43.535186 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 23 23:30:43.535265 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 23 23:30:43.535366 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 23:30:43.535446 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 23 23:30:43.535526 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 23 23:30:43.535605 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 23:30:43.535693 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 23 23:30:43.535773 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 23 23:30:43.535854 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 23:30:43.535935 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 23 23:30:43.536040 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 23 23:30:43.536123 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 23:30:43.536204 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 23 23:30:43.536285 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 23 23:30:43.536364 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 23:30:43.536465 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 23 23:30:43.536548 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 23 23:30:43.536627 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 23:30:43.536711 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 23 23:30:43.536790 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 23 23:30:43.536872 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 23:30:43.536967 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 23 23:30:43.537053 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 23 23:30:43.537135 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 23:30:43.537220 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 23 23:30:43.537304 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 23 23:30:43.537385 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 23:30:43.537468 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 23 23:30:43.537549 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 23 23:30:43.537627 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 23:30:43.537709 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 23 23:30:43.537788 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 23 23:30:43.537866 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 23 23:30:43.537945 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 23:30:43.538058 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 23 23:30:43.538140 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 23 23:30:43.538223 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 23 23:30:43.538305 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 23:30:43.538387 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 23 23:30:43.538466 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 23 23:30:43.538545 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 23 23:30:43.538623 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 23:30:43.538704 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 23 23:30:43.538785 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 23 23:30:43.538862 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 23 23:30:43.538941 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 23:30:43.539037 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 23 23:30:43.539117 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 23 23:30:43.539194 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 23 23:30:43.539272 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 23:30:43.539355 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 23 23:30:43.539436 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 23 23:30:43.539515 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 23 23:30:43.539596 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 23:30:43.539677 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 23 23:30:43.539756 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 23 23:30:43.539838 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 23 23:30:43.539918 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 23:30:43.540013 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 23 23:30:43.540094 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 23 23:30:43.540173 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 23 23:30:43.540253 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 23:30:43.540335 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 23 23:30:43.540430 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 23 23:30:43.540517 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 23 23:30:43.540597 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 23:30:43.540678 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 23 23:30:43.540758 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 23 23:30:43.540836 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 23 23:30:43.540916 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 23:30:43.541036 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 23 23:30:43.541121 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 23 23:30:43.541204 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 23 23:30:43.541284 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 23 23:30:43.541365 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 23 23:30:43.541445 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 23 23:30:43.541527 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 23 23:30:43.541607 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 23:30:43.541688 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 23 23:30:43.541770 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 23 23:30:43.541850 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 23 23:30:43.541928 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 23:30:43.542032 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 23 23:30:43.542122 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 23 23:30:43.542201 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 23 23:30:43.542281 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 23:30:43.542362 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 23 23:30:43.542445 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 23 23:30:43.542524 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 23 23:30:43.542603 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 23:30:43.542683 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 23 23:30:43.542755 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 23 23:30:43.542828 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 23 23:30:43.542915 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 23 23:30:43.543004 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 23:30:43.543091 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 23 23:30:43.543166 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 23:30:43.543246 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 23 23:30:43.543320 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 23:30:43.543403 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 23 23:30:43.543480 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 23:30:43.543562 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 23 23:30:43.543637 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 23:30:43.543717 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 23 23:30:43.543792 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 23:30:43.543879 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 23 23:30:43.543965 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 23:30:43.544050 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 23 23:30:43.544127 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 23:30:43.544209 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 23 23:30:43.544282 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 23:30:43.544365 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 23 23:30:43.544453 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 23:30:43.544540 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 23 23:30:43.544613 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 23:30:43.544693 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 23 23:30:43.544769 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 23:30:43.544849 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 23 23:30:43.544922 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 23:30:43.545025 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 23 23:30:43.545104 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 23:30:43.545187 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 23 23:30:43.545261 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 23:30:43.545346 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 23 23:30:43.545419 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 23:30:43.545500 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 23 23:30:43.545575 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 23:30:43.545655 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 23 23:30:43.545728 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 23:30:43.545806 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 23 23:30:43.545879 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 23 23:30:43.545975 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 23:30:43.546067 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 23 23:30:43.546143 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 23 23:30:43.546217 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 23:30:43.546296 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 23 23:30:43.546369 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 23 23:30:43.546446 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 23:30:43.546523 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 23 23:30:43.546597 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 23 23:30:43.546669 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 23:30:43.546748 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 23 23:30:43.546823 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 23 23:30:43.546895 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 23:30:43.546997 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 23 23:30:43.547074 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 23 23:30:43.547148 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 23:30:43.547288 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 23 23:30:43.547370 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 23 23:30:43.547443 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 23:30:43.547524 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 23 23:30:43.547597 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 23 23:30:43.547671 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 23:30:43.547751 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 23 23:30:43.547827 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 23 23:30:43.547900 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 23:30:43.548011 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 23 23:30:43.548087 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 23 23:30:43.548160 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 23:30:43.548244 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 23 23:30:43.548318 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 23 23:30:43.548390 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 23 23:30:43.548491 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 23 23:30:43.548567 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 23 23:30:43.548662 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 23:30:43.548752 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 23 23:30:43.548826 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 23 23:30:43.548902 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 23:30:43.549003 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 23 23:30:43.549081 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 23 23:30:43.549159 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 23:30:43.549270 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 23 23:30:43.549347 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 23 23:30:43.549420 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 23:30:43.549430 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 23 23:30:43.549438 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 23 23:30:43.549447 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 23 23:30:43.549457 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 23 23:30:43.549465 kernel: iommu: Default domain type: Translated Jan 23 23:30:43.549474 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 23 23:30:43.549482 kernel: efivars: Registered efivars operations Jan 23 23:30:43.549490 kernel: vgaarb: loaded Jan 23 23:30:43.549498 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 23 23:30:43.549506 kernel: VFS: Disk quotas dquot_6.6.0 Jan 23 23:30:43.549516 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 23 23:30:43.549524 kernel: pnp: PnP ACPI 
init Jan 23 23:30:43.549613 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 23 23:30:43.549625 kernel: pnp: PnP ACPI: found 1 devices Jan 23 23:30:43.549633 kernel: NET: Registered PF_INET protocol family Jan 23 23:30:43.549641 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 23 23:30:43.549651 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 23 23:30:43.549659 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 23 23:30:43.549683 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 23 23:30:43.549693 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 23 23:30:43.549701 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 23 23:30:43.549709 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 23:30:43.549717 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 23:30:43.549728 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 23 23:30:43.549821 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 23 23:30:43.549832 kernel: PCI: CLS 0 bytes, default 64 Jan 23 23:30:43.549840 kernel: kvm [1]: HYP mode not available Jan 23 23:30:43.549848 kernel: Initialise system trusted keyrings Jan 23 23:30:43.549857 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 23 23:30:43.549865 kernel: Key type asymmetric registered Jan 23 23:30:43.549875 kernel: Asymmetric key parser 'x509' registered Jan 23 23:30:43.549883 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 23 23:30:43.549891 kernel: io scheduler mq-deadline registered Jan 23 23:30:43.549900 kernel: io scheduler kyber registered Jan 23 23:30:43.549908 kernel: io scheduler bfq registered Jan 23 23:30:43.549916 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 23 
23:30:43.550013 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 23 23:30:43.550126 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 23 23:30:43.550213 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.550296 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 23 23:30:43.550376 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 23 23:30:43.550454 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.550537 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 23 23:30:43.550617 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 23 23:30:43.550697 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.550778 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 23 23:30:43.550858 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 23 23:30:43.550938 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.551042 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 23 23:30:43.551123 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 23 23:30:43.551232 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.551340 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 23 23:30:43.552398 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 23 23:30:43.552498 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.552582 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 23 23:30:43.552661 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 23 23:30:43.552740 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.552825 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 23 23:30:43.552903 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 23 23:30:43.552995 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.553007 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 23 23:30:43.553087 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 23 23:30:43.553165 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 23 23:30:43.553247 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.553327 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 23 23:30:43.553406 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 23 23:30:43.553484 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.553565 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 23 23:30:43.553644 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 23 23:30:43.553724 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.553806 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 23 23:30:43.553891 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 23 23:30:43.553983 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.554067 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 23 23:30:43.554147 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 23 23:30:43.554225 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.554309 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 23 23:30:43.554389 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 23 23:30:43.554467 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.554547 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 23 23:30:43.554645 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 23 23:30:43.554723 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.554809 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 23 23:30:43.554889 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 23 23:30:43.554981 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.554993 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 23 23:30:43.555073 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 23 23:30:43.555154 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 23 23:30:43.555236 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.555325 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 23 23:30:43.555406 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 23 23:30:43.555485 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.555565 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 23 23:30:43.555643 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 23 23:30:43.555722 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.555806 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 23 23:30:43.555886 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 23 23:30:43.555989 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.556077 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 23 23:30:43.556158 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 23 23:30:43.556238 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.556327 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 23 23:30:43.556423 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 23 23:30:43.556513 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.556596 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 23 23:30:43.556676 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 23 23:30:43.556756 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.556841 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 23 23:30:43.556922 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 23 23:30:43.557021 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.557033 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 23 23:30:43.557116 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 23 23:30:43.557197 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 23 23:30:43.557276 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.557362 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 23 23:30:43.557442 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 23 23:30:43.557520 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.557602 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 23 23:30:43.557683 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 23 23:30:43.557765 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.557851 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 23 23:30:43.557934 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 23 23:30:43.558047 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.558132 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 23 23:30:43.558218 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 23 23:30:43.558297 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.558382 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 23 23:30:43.558461 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 23 23:30:43.558539 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.558621 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 23 23:30:43.558700 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 23 23:30:43.558778 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.558862 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 23 23:30:43.558942 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 23 23:30:43.559035 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.559127 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 23 23:30:43.559211 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 23 23:30:43.559293 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 23:30:43.559305 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 23 23:30:43.559315 kernel: ACPI: button: Power Button [PWRB] Jan 23 23:30:43.559399 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 23 23:30:43.559485 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 23 23:30:43.559497 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 23:30:43.559505 kernel: thunder_xcv, ver 1.0 Jan 23 23:30:43.559513 kernel: thunder_bgx, ver 1.0 Jan 23 23:30:43.559521 kernel: nicpf, ver 1.0 Jan 23 23:30:43.559531 kernel: nicvf, ver 1.0 Jan 23 23:30:43.559642 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 23 23:30:43.559722 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-23T23:30:42 UTC (1769211042) Jan 23 23:30:43.559738 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 23 23:30:43.559747 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 23 23:30:43.559755 kernel: NET: Registered PF_INET6 protocol family Jan 23 23:30:43.559765 kernel: watchdog: NMI not fully supported Jan 23 23:30:43.559774 kernel: watchdog: Hard watchdog permanently disabled Jan 23 23:30:43.559782 kernel: Segment Routing with IPv6 Jan 23 23:30:43.559790 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 23:30:43.559798 kernel: NET: Registered PF_PACKET protocol family Jan 23 23:30:43.559806 kernel: Key type dns_resolver registered Jan 23 23:30:43.559816 kernel: registered taskstats version 1 Jan 23 23:30:43.559842 kernel: Loading compiled-in X.509 certificates Jan 23 23:30:43.559850 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: efe9666f272e2216f3315e00dc27df84b73ce009' Jan 23 23:30:43.559859 kernel: Demotion targets for Node 0: null Jan 23 23:30:43.559867 kernel: Key type .fscrypt registered Jan 23 23:30:43.559874 kernel: Key type fscrypt-provisioning registered Jan 23 23:30:43.559882 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 23 23:30:43.559890 kernel: ima: Allocated hash algorithm: sha1 Jan 23 23:30:43.559898 kernel: ima: No architecture policies found Jan 23 23:30:43.559908 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 23 23:30:43.559916 kernel: clk: Disabling unused clocks Jan 23 23:30:43.559924 kernel: PM: genpd: Disabling unused power domains Jan 23 23:30:43.559933 kernel: Freeing unused kernel memory: 12480K Jan 23 23:30:43.559941 kernel: Run /init as init process Jan 23 23:30:43.559949 kernel: with arguments: Jan 23 23:30:43.559967 kernel: /init Jan 23 23:30:43.559977 kernel: with environment: Jan 23 23:30:43.559985 kernel: HOME=/ Jan 23 23:30:43.559994 kernel: TERM=linux Jan 23 23:30:43.560001 kernel: ACPI: bus type USB registered Jan 23 23:30:43.560009 kernel: usbcore: registered new interface driver usbfs Jan 23 23:30:43.560018 kernel: usbcore: registered new interface driver hub Jan 23 23:30:43.560026 kernel: usbcore: registered new device driver usb Jan 23 23:30:43.560123 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 23:30:43.560206 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 23 23:30:43.560287 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 23 23:30:43.560368 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 23:30:43.560463 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 23 23:30:43.560547 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 23 23:30:43.560657 kernel: hub 1-0:1.0: USB hub found Jan 23 23:30:43.560758 kernel: hub 1-0:1.0: 4 ports detected Jan 23 23:30:43.560861 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Jan 23 23:30:43.561001 kernel: hub 2-0:1.0: USB hub found Jan 23 23:30:43.561101 kernel: hub 2-0:1.0: 4 ports detected Jan 23 23:30:43.561197 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 23 23:30:43.561284 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 23 23:30:43.561297 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 23 23:30:43.561306 kernel: GPT:25804799 != 104857599 Jan 23 23:30:43.561314 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 23 23:30:43.561323 kernel: GPT:25804799 != 104857599 Jan 23 23:30:43.561331 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 23 23:30:43.561340 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 23:30:43.561349 kernel: SCSI subsystem initialized Jan 23 23:30:43.561357 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 23 23:30:43.561366 kernel: device-mapper: uevent: version 1.0.3 Jan 23 23:30:43.561375 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 23 23:30:43.561383 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 23 23:30:43.561393 kernel: raid6: neonx8 gen() 15801 MB/s Jan 23 23:30:43.561401 kernel: raid6: neonx4 gen() 15767 MB/s Jan 23 23:30:43.561410 kernel: raid6: neonx2 gen() 13198 MB/s Jan 23 23:30:43.561418 kernel: raid6: neonx1 gen() 10570 MB/s Jan 23 23:30:43.561427 kernel: raid6: int64x8 gen() 6842 MB/s Jan 23 23:30:43.561435 kernel: raid6: int64x4 gen() 7353 MB/s Jan 23 23:30:43.561443 kernel: raid6: int64x2 gen() 6123 MB/s Jan 23 23:30:43.561452 kernel: raid6: int64x1 gen() 5053 MB/s Jan 23 23:30:43.561461 kernel: raid6: using algorithm neonx8 gen() 15801 MB/s Jan 23 23:30:43.561470 kernel: raid6: .... 
xor() 12062 MB/s, rmw enabled Jan 23 23:30:43.561478 kernel: raid6: using neon recovery algorithm Jan 23 23:30:43.561487 kernel: xor: measuring software checksum speed Jan 23 23:30:43.561497 kernel: 8regs : 21539 MB/sec Jan 23 23:30:43.561506 kernel: 32regs : 21687 MB/sec Jan 23 23:30:43.561515 kernel: arm64_neon : 25260 MB/sec Jan 23 23:30:43.561524 kernel: xor: using function: arm64_neon (25260 MB/sec) Jan 23 23:30:43.561626 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 23 23:30:43.561639 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 23 23:30:43.561649 kernel: BTRFS: device fsid 21279126-4100-4897-a95e-923d96100946 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (272) Jan 23 23:30:43.561658 kernel: BTRFS info (device dm-0): first mount of filesystem 21279126-4100-4897-a95e-923d96100946 Jan 23 23:30:43.561666 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 23 23:30:43.561677 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 23 23:30:43.561685 kernel: BTRFS info (device dm-0): enabling free space tree Jan 23 23:30:43.561694 kernel: loop: module loaded Jan 23 23:30:43.561702 kernel: loop0: detected capacity change from 0 to 91832 Jan 23 23:30:43.561711 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 23:30:43.561720 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 23 23:30:43.561828 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 23 23:30:43.561842 kernel: usbcore: registered new interface driver usbhid Jan 23 23:30:43.561851 kernel: usbhid: USB HID core driver Jan 23 23:30:43.561950 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 23 23:30:43.561981 systemd[1]: Successfully made /usr/ read-only. 
Jan 23 23:30:43.561993 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 23:30:43.562005 systemd[1]: Detected virtualization kvm. Jan 23 23:30:43.562014 systemd[1]: Detected architecture arm64. Jan 23 23:30:43.562022 systemd[1]: Running in initrd. Jan 23 23:30:43.562031 systemd[1]: No hostname configured, using default hostname. Jan 23 23:30:43.562040 systemd[1]: Hostname set to . Jan 23 23:30:43.562049 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 23:30:43.562059 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 23 23:30:43.562174 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 23 23:30:43.562187 systemd[1]: Queued start job for default target initrd.target. Jan 23 23:30:43.562196 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 23:30:43.562205 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 23:30:43.562214 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 23:30:43.562226 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 23 23:30:43.562235 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 23:30:43.562245 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... 
Jan 23 23:30:43.562254 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 23 23:30:43.562263 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 23:30:43.562274 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 23:30:43.562283 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 23 23:30:43.562292 systemd[1]: Reached target paths.target - Path Units. Jan 23 23:30:43.562300 systemd[1]: Reached target slices.target - Slice Units. Jan 23 23:30:43.562309 systemd[1]: Reached target swap.target - Swaps. Jan 23 23:30:43.562318 systemd[1]: Reached target timers.target - Timer Units. Jan 23 23:30:43.562327 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 23:30:43.562338 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 23:30:43.562347 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 23:30:43.562356 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 23 23:30:43.562365 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 23 23:30:43.562374 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 23:30:43.562383 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 23:30:43.562392 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 23:30:43.562402 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 23:30:43.562412 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 23 23:30:43.562421 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 23 23:30:43.562429 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 23 23:30:43.562438 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 23 23:30:43.562448 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 23 23:30:43.562458 systemd[1]: Starting systemd-fsck-usr.service... Jan 23 23:30:43.562467 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 23:30:43.562476 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 23:30:43.562485 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 23:30:43.562496 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 23 23:30:43.562505 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 23:30:43.562514 systemd[1]: Finished systemd-fsck-usr.service. Jan 23 23:30:43.562523 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 23:30:43.562552 systemd-journald[420]: Collecting audit messages is enabled. Jan 23 23:30:43.562575 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 23 23:30:43.562584 kernel: Bridge firewalling registered Jan 23 23:30:43.562593 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 23:30:43.562602 kernel: audit: type=1130 audit(1769211043.494:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.562612 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 23:30:43.562622 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 23 23:30:43.562631 kernel: audit: type=1130 audit(1769211043.505:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.562640 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 23:30:43.562650 kernel: audit: type=1130 audit(1769211043.512:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.562660 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 23 23:30:43.562669 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 23:30:43.562680 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 23:30:43.562689 kernel: audit: type=1130 audit(1769211043.527:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.562698 kernel: audit: type=1334 audit(1769211043.528:6): prog-id=6 op=LOAD Jan 23 23:30:43.562706 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 23:30:43.562716 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 23:30:43.562726 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 23:30:43.562735 kernel: audit: type=1130 audit(1769211043.550:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:30:43.562746 kernel: audit: type=1130 audit(1769211043.554:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.562755 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 23 23:30:43.562765 systemd-journald[420]: Journal started Jan 23 23:30:43.562784 systemd-journald[420]: Runtime Journal (/run/log/journal/dbd2950648b84aff8a192b4f432132d4) is 8M, max 319.5M, 311.5M free. Jan 23 23:30:43.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.505000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.528000 audit: BPF prog-id=6 op=LOAD Jan 23 23:30:43.550000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:30:43.492139 systemd-modules-load[421]: Inserted module 'br_netfilter' Jan 23 23:30:43.571742 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 23:30:43.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.576010 kernel: audit: type=1130 audit(1769211043.571:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.576294 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 23:30:43.582257 dracut-cmdline[453]: dracut-109 Jan 23 23:30:43.586519 dracut-cmdline[453]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=0b7aa2947ffddc152dd47eebbcf7a95dcd57c97b69958c2bfdf6c1781ecaf3c1 Jan 23 23:30:43.591567 systemd-resolved[440]: Positive Trust Anchors: Jan 23 23:30:43.591577 systemd-resolved[440]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 23:30:43.591580 systemd-resolved[440]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 23:30:43.591612 systemd-resolved[440]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 23:30:43.602538 systemd-tmpfiles[465]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 23 23:30:43.609804 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 23:30:43.611000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.616002 kernel: audit: type=1130 audit(1769211043.611:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.620708 systemd-resolved[440]: Defaulting to hostname 'linux'. Jan 23 23:30:43.621543 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 23:30:43.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.622747 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 23:30:43.677987 kernel: Loading iSCSI transport class v2.0-870. 
Jan 23 23:30:43.690987 kernel: iscsi: registered transport (tcp) Jan 23 23:30:43.705308 kernel: iscsi: registered transport (qla4xxx) Jan 23 23:30:43.705383 kernel: QLogic iSCSI HBA Driver Jan 23 23:30:43.727631 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 23:30:43.752594 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 23:30:43.752000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.754151 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 23:30:43.801163 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 23 23:30:43.801000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.803482 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 23 23:30:43.805044 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 23 23:30:43.837856 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 23 23:30:43.837000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.838000 audit: BPF prog-id=7 op=LOAD Jan 23 23:30:43.838000 audit: BPF prog-id=8 op=LOAD Jan 23 23:30:43.840381 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 23:30:43.871401 systemd-udevd[696]: Using default interface naming scheme 'v257'. 
Jan 23 23:30:43.879062 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 23:30:43.880000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.881662 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 23 23:30:43.904875 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 23:30:43.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.908000 audit: BPF prog-id=9 op=LOAD Jan 23 23:30:43.909186 dracut-pre-trigger[769]: rd.md=0: removing MD RAID activation Jan 23 23:30:43.909731 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 23:30:43.932522 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 23:30:43.932000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.934837 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 23:30:43.953517 systemd-networkd[808]: lo: Link UP Jan 23 23:30:43.953526 systemd-networkd[808]: lo: Gained carrier Jan 23 23:30:43.955040 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 23:30:43.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:43.956011 systemd[1]: Reached target network.target - Network. 
Jan 23 23:30:44.024152 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 23:30:44.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:44.029944 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 23 23:30:44.087493 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 23 23:30:44.102382 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 23 23:30:44.121017 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 23:30:44.128852 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 23 23:30:44.131904 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 23 23:30:44.142231 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 23:30:44.143265 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 23:30:44.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:44.144567 systemd-networkd[808]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 23:30:44.144571 systemd-networkd[808]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 23:30:44.145249 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 23 23:30:44.145708 systemd-networkd[808]: eth0: Link UP Jan 23 23:30:44.145868 systemd-networkd[808]: eth0: Gained carrier Jan 23 23:30:44.145879 systemd-networkd[808]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 23:30:44.148022 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 23:30:44.157008 disk-uuid[878]: Primary Header is updated. Jan 23 23:30:44.157008 disk-uuid[878]: Secondary Entries is updated. Jan 23 23:30:44.157008 disk-uuid[878]: Secondary Header is updated. Jan 23 23:30:44.187283 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 23:30:44.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:44.207053 systemd-networkd[808]: eth0: DHCPv4 address 10.0.10.88/25, gateway 10.0.10.1 acquired from 10.0.10.1 Jan 23 23:30:44.237008 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 23 23:30:44.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:44.238101 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 23:30:44.240150 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 23:30:44.241890 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 23:30:44.244493 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 23 23:30:44.280504 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jan 23 23:30:44.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:45.196239 disk-uuid[880]: Warning: The kernel is still using the old partition table. Jan 23 23:30:45.196239 disk-uuid[880]: The new table will be used at the next reboot or after you Jan 23 23:30:45.196239 disk-uuid[880]: run partprobe(8) or kpartx(8) Jan 23 23:30:45.196239 disk-uuid[880]: The operation has completed successfully. Jan 23 23:30:45.204918 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 23 23:30:45.205000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:45.205000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:45.205040 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 23 23:30:45.208643 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 23 23:30:45.233047 systemd-networkd[808]: eth0: Gained IPv6LL Jan 23 23:30:45.251000 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (907) Jan 23 23:30:45.253192 kernel: BTRFS info (device vda6): first mount of filesystem 2702f6ba-cc76-44c2-967b-e3e9acbe619a Jan 23 23:30:45.253221 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 23 23:30:45.257436 kernel: BTRFS info (device vda6): turning on async discard Jan 23 23:30:45.257458 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 23:30:45.262996 kernel: BTRFS info (device vda6): last unmount of filesystem 2702f6ba-cc76-44c2-967b-e3e9acbe619a Jan 23 23:30:45.263915 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 23 23:30:45.263000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:45.267107 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 23 23:30:45.405812 ignition[926]: Ignition 2.24.0 Jan 23 23:30:45.405826 ignition[926]: Stage: fetch-offline Jan 23 23:30:45.405866 ignition[926]: no configs at "/usr/lib/ignition/base.d" Jan 23 23:30:45.405875 ignition[926]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 23:30:45.406066 ignition[926]: parsed url from cmdline: "" Jan 23 23:30:45.409203 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 23:30:45.410000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:45.406070 ignition[926]: no config URL provided Jan 23 23:30:45.411785 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 23 23:30:45.406692 ignition[926]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 23:30:45.406703 ignition[926]: no config at "/usr/lib/ignition/user.ign" Jan 23 23:30:45.406708 ignition[926]: failed to fetch config: resource requires networking Jan 23 23:30:45.406875 ignition[926]: Ignition finished successfully Jan 23 23:30:45.435907 ignition[936]: Ignition 2.24.0 Jan 23 23:30:45.435929 ignition[936]: Stage: fetch Jan 23 23:30:45.436088 ignition[936]: no configs at "/usr/lib/ignition/base.d" Jan 23 23:30:45.436096 ignition[936]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 23:30:45.436177 ignition[936]: parsed url from cmdline: "" Jan 23 23:30:45.436180 ignition[936]: no config URL provided Jan 23 23:30:45.436184 ignition[936]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 23:30:45.436190 ignition[936]: no config at "/usr/lib/ignition/user.ign" Jan 23 23:30:45.436349 ignition[936]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 23:30:45.436365 ignition[936]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 23:30:45.436370 ignition[936]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 23 23:30:46.437531 ignition[936]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 23:30:46.437657 ignition[936]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 23 23:30:47.249853 ignition[936]: GET result: OK Jan 23 23:30:47.250988 ignition[936]: parsing config with SHA512: 9a5997a40fe43571179e9a78b58d8f80fc9a7f672526f9eafe8166b651ff46d4f366b6cd2cd1e549a974cdcd1e65bdccf049a59d28174369bba85f859679d659 Jan 23 23:30:47.255978 unknown[936]: fetched base config from "system" Jan 23 23:30:47.256780 unknown[936]: fetched base config from "system" Jan 23 23:30:47.257157 ignition[936]: fetch: fetch complete Jan 23 23:30:47.256787 unknown[936]: fetched user config from "openstack" Jan 23 23:30:47.257162 ignition[936]: fetch: fetch passed Jan 23 23:30:47.257216 ignition[936]: Ignition finished successfully Jan 23 23:30:47.261907 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 23 23:30:47.266259 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 23 23:30:47.266285 kernel: audit: type=1130 audit(1769211047.261:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:47.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:47.264271 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 23 23:30:47.289813 ignition[944]: Ignition 2.24.0 Jan 23 23:30:47.289835 ignition[944]: Stage: kargs Jan 23 23:30:47.289997 ignition[944]: no configs at "/usr/lib/ignition/base.d" Jan 23 23:30:47.290006 ignition[944]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 23:30:47.290839 ignition[944]: kargs: kargs passed Jan 23 23:30:47.290887 ignition[944]: Ignition finished successfully Jan 23 23:30:47.295754 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 23 23:30:47.295000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.297752 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 23 23:30:47.301084 kernel: audit: type=1130 audit(1769211047.295:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.330159 ignition[951]: Ignition 2.24.0
Jan 23 23:30:47.330177 ignition[951]: Stage: disks
Jan 23 23:30:47.330326 ignition[951]: no configs at "/usr/lib/ignition/base.d"
Jan 23 23:30:47.330335 ignition[951]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 23:30:47.333214 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 23 23:30:47.333000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.331081 ignition[951]: disks: disks passed
Jan 23 23:30:47.338850 kernel: audit: type=1130 audit(1769211047.333:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.337215 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 23 23:30:47.331125 ignition[951]: Ignition finished successfully
Jan 23 23:30:47.338428 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 23 23:30:47.339870 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 23 23:30:47.341575 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 23 23:30:47.342876 systemd[1]: Reached target basic.target - Basic System.
Jan 23 23:30:47.345618 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 23 23:30:47.392746 systemd-fsck[960]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks
Jan 23 23:30:47.397198 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 23 23:30:47.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.402071 kernel: audit: type=1130 audit(1769211047.397:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.399732 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 23 23:30:47.564672 kernel: EXT4-fs (vda9): mounted filesystem 7d385daa-3990-4052-81b1-28f91f90f881 r/w with ordered data mode. Quota mode: none.
Jan 23 23:30:47.563889 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 23 23:30:47.565165 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 23 23:30:47.569977 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 23 23:30:47.571658 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 23 23:30:47.572642 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jan 23 23:30:47.573281 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Jan 23 23:30:47.577436 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 23 23:30:47.577480 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 23 23:30:47.594257 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 23 23:30:47.596249 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 23 23:30:47.612990 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (968)
Jan 23 23:30:47.616088 kernel: BTRFS info (device vda6): first mount of filesystem 2702f6ba-cc76-44c2-967b-e3e9acbe619a
Jan 23 23:30:47.616128 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 23 23:30:47.622356 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 23:30:47.622409 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 23:30:47.623448 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 23:30:47.662004 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 23:30:47.782038 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 23 23:30:47.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.786577 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 23 23:30:47.788351 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 23 23:30:47.790426 kernel: audit: type=1130 audit(1769211047.782:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.808807 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 23 23:30:47.811024 kernel: BTRFS info (device vda6): last unmount of filesystem 2702f6ba-cc76-44c2-967b-e3e9acbe619a
Jan 23 23:30:47.829989 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 23 23:30:47.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.834988 kernel: audit: type=1130 audit(1769211047.830:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.838624 ignition[1070]: INFO : Ignition 2.24.0
Jan 23 23:30:47.838624 ignition[1070]: INFO : Stage: mount
Jan 23 23:30:47.840557 ignition[1070]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 23:30:47.840557 ignition[1070]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 23:30:47.840557 ignition[1070]: INFO : mount: mount passed
Jan 23 23:30:47.840557 ignition[1070]: INFO : Ignition finished successfully
Jan 23 23:30:47.848433 kernel: audit: type=1130 audit(1769211047.841:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.841000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:47.841323 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 23 23:30:48.723018 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 23:30:50.732999 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 23:30:54.738040 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 23:30:54.746237 coreos-metadata[970]: Jan 23 23:30:54.746 WARN failed to locate config-drive, using the metadata service API instead
Jan 23 23:30:54.764747 coreos-metadata[970]: Jan 23 23:30:54.764 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Jan 23 23:30:55.409504 coreos-metadata[970]: Jan 23 23:30:55.409 INFO Fetch successful
Jan 23 23:30:55.409504 coreos-metadata[970]: Jan 23 23:30:55.409 INFO wrote hostname ci-4593-0-0-1-266c03b17e to /sysroot/etc/hostname
Jan 23 23:30:55.411652 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Jan 23 23:30:55.425863 kernel: audit: type=1130 audit(1769211055.411:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:55.425893 kernel: audit: type=1131 audit(1769211055.411:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:55.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:55.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:55.411739 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Jan 23 23:30:55.414162 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 23 23:30:55.443182 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 23 23:30:55.460975 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1087)
Jan 23 23:30:55.462973 kernel: BTRFS info (device vda6): first mount of filesystem 2702f6ba-cc76-44c2-967b-e3e9acbe619a
Jan 23 23:30:55.463006 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 23 23:30:55.468001 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 23:30:55.468051 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 23:30:55.469549 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 23:30:55.502291 ignition[1105]: INFO : Ignition 2.24.0
Jan 23 23:30:55.502291 ignition[1105]: INFO : Stage: files
Jan 23 23:30:55.503800 ignition[1105]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 23:30:55.503800 ignition[1105]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 23:30:55.503800 ignition[1105]: DEBUG : files: compiled without relabeling support, skipping
Jan 23 23:30:55.506928 ignition[1105]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 23 23:30:55.506928 ignition[1105]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 23 23:30:55.509436 ignition[1105]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 23 23:30:55.510616 ignition[1105]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 23 23:30:55.510616 ignition[1105]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 23 23:30:55.510129 unknown[1105]: wrote ssh authorized keys file for user: core
Jan 23 23:30:55.514206 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jan 23 23:30:55.514206 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Jan 23 23:30:55.565183 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 23 23:30:55.680519 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jan 23 23:30:55.680519 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 23:30:55.684164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jan 23 23:30:55.699240 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jan 23 23:30:55.699240 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jan 23 23:30:55.699240 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Jan 23 23:30:55.786713 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 23 23:30:56.338318 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jan 23 23:30:56.338318 ignition[1105]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jan 23 23:30:56.341648 ignition[1105]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 23 23:30:56.344656 ignition[1105]: INFO : files: files passed
Jan 23 23:30:56.344656 ignition[1105]: INFO : Ignition finished successfully
Jan 23 23:30:56.358674 kernel: audit: type=1130 audit(1769211056.347:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.346270 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 23 23:30:56.351512 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 23 23:30:56.354497 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 23 23:30:56.368781 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 23 23:30:56.368905 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 23 23:30:56.376166 kernel: audit: type=1130 audit(1769211056.369:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.376193 kernel: audit: type=1131 audit(1769211056.369:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.369000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.380013 initrd-setup-root-after-ignition[1140]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 23:30:56.380013 initrd-setup-root-after-ignition[1140]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 23:30:56.382729 initrd-setup-root-after-ignition[1144]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 23 23:30:56.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.382467 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 23 23:30:56.391473 kernel: audit: type=1130 audit(1769211056.382:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.384141 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 23 23:30:56.389463 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 23 23:30:56.421796 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 23:30:56.421932 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 23 23:30:56.429557 kernel: audit: type=1130 audit(1769211056.422:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.429584 kernel: audit: type=1131 audit(1769211056.422:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.422000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.422000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.424029 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 23 23:30:56.430349 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 23 23:30:56.431949 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 23 23:30:56.432951 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 23 23:30:56.458040 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 23 23:30:56.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.461206 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 23 23:30:56.464286 kernel: audit: type=1130 audit(1769211056.458:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.486480 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Jan 23 23:30:56.486703 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 23 23:30:56.488824 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 23 23:30:56.490606 systemd[1]: Stopped target timers.target - Timer Units.
Jan 23 23:30:56.492096 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 23:30:56.493000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.492232 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 23 23:30:56.497687 kernel: audit: type=1131 audit(1769211056.493:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.496799 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 23 23:30:56.498601 systemd[1]: Stopped target basic.target - Basic System.
Jan 23 23:30:56.499943 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 23 23:30:56.501479 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 23 23:30:56.503061 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 23 23:30:56.504836 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jan 23 23:30:56.506500 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 23 23:30:56.508073 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 23 23:30:56.509883 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 23 23:30:56.511612 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 23 23:30:56.513075 systemd[1]: Stopped target swap.target - Swaps.
Jan 23 23:30:56.514443 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 23:30:56.515000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.514580 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 23 23:30:56.516659 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 23 23:30:56.518401 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 23 23:30:56.519934 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 23 23:30:56.520034 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 23 23:30:56.523000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.521886 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 23:30:56.522025 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 23 23:30:56.526000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.524433 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 23 23:30:56.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.524553 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 23 23:30:56.526308 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 23 23:30:56.526411 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 23 23:30:56.531000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.528687 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 23 23:30:56.530457 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 23:30:56.530592 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 23 23:30:56.546722 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 23 23:30:56.547575 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 23:30:56.548000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.547705 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 23 23:30:56.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.549447 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 23:30:56.552000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.549565 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 23 23:30:56.551102 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 23:30:56.551211 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 23 23:30:56.558982 ignition[1165]: INFO : Ignition 2.24.0
Jan 23 23:30:56.558982 ignition[1165]: INFO : Stage: umount
Jan 23 23:30:56.558982 ignition[1165]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 23:30:56.558982 ignition[1165]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 23:30:56.560000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.565386 ignition[1165]: INFO : umount: umount passed
Jan 23 23:30:56.565386 ignition[1165]: INFO : Ignition finished successfully
Jan 23 23:30:56.565000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.559495 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 23:30:56.559598 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 23 23:30:56.568000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.565273 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 23 23:30:56.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.565797 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 23 23:30:56.571000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.567648 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 23 23:30:56.568044 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 23 23:30:56.574000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.568087 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 23 23:30:56.569415 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 23 23:30:56.569460 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 23 23:30:56.571020 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 23 23:30:56.571070 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 23 23:30:56.572653 systemd[1]: Stopped target network.target - Network.
Jan 23 23:30:56.573992 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 23 23:30:56.574067 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 23 23:30:56.575628 systemd[1]: Stopped target paths.target - Path Units.
Jan 23 23:30:56.576900 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 23:30:56.581012 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 23:30:56.582480 systemd[1]: Stopped target slices.target - Slice Units.
Jan 23 23:30:56.583852 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 23 23:30:56.590000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.585500 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 23 23:30:56.592000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.585538 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 23 23:30:56.586847 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 23 23:30:56.586878 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 23 23:30:56.588497 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Jan 23 23:30:56.588520 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Jan 23 23:30:56.599000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.590442 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 23 23:30:56.600000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.590497 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 23 23:30:56.591767 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 23 23:30:56.591810 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 23 23:30:56.593357 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 23 23:30:56.594756 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 23 23:30:56.606000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.596422 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 23 23:30:56.597996 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 23 23:30:56.599601 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 23 23:30:56.599700 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 23 23:30:56.611000 audit: BPF prog-id=6 op=UNLOAD
Jan 23 23:30:56.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.605127 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 23 23:30:56.605235 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 23 23:30:56.610157 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 23 23:30:56.610267 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 23 23:30:56.619000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.620000 audit: BPF prog-id=9 op=UNLOAD
Jan 23 23:30:56.612996 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jan 23 23:30:56.621000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.614835 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 23 23:30:56.622000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.614884 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 23:30:56.617220 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 23 23:30:56.617928 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 23 23:30:56.618008 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 23 23:30:56.619797 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 23:30:56.619844 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 23 23:30:56.621487 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 23:30:56.621531 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 23 23:30:56.623053 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 23 23:30:56.642735 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 23:30:56.642892 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 23 23:30:56.644000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.646125 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 23:30:56.646199 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 23 23:30:56.647982 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 23:30:56.648016 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 23:30:56.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.649841 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 23:30:56.649899 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 23 23:30:56.653000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.652179 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 23:30:56.652229 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 23 23:30:56.656000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.654531 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 23 23:30:56.654583 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 23 23:30:56.663916 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 23 23:30:56.664819 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jan 23 23:30:56.665000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.664886 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 23:30:56.667000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.666834 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 23:30:56.669000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.666892 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 23 23:30:56.671000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.668815 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 23:30:56.668863 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 23:30:56.671573 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 23 23:30:56.671683 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 23 23:30:56.676171 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 23:30:56.676000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.676000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:56.676258 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 23 23:30:56.678349 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 23 23:30:56.680468 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 23 23:30:56.704765 systemd[1]: Switching root.
Jan 23 23:30:56.742058 systemd-journald[420]: Journal stopped
Jan 23 23:30:58.348304 systemd-journald[420]: Received SIGTERM from PID 1 (systemd).
Jan 23 23:30:58.348397 kernel: SELinux: policy capability network_peer_controls=1
Jan 23 23:30:58.348416 kernel: SELinux: policy capability open_perms=1
Jan 23 23:30:58.348428 kernel: SELinux: policy capability extended_socket_class=1
Jan 23 23:30:58.348446 kernel: SELinux: policy capability always_check_network=0
Jan 23 23:30:58.348459 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 23 23:30:58.348469 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 23 23:30:58.348479 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 23 23:30:58.348492 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 23 23:30:58.348502 kernel: SELinux: policy capability userspace_initial_context=0
Jan 23 23:30:58.348513 systemd[1]: Successfully loaded SELinux policy in 63.947ms.
Jan 23 23:30:58.348534 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.859ms.
Jan 23 23:30:58.348546 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 23 23:30:58.348557 systemd[1]: Detected virtualization kvm.
Jan 23 23:30:58.348568 systemd[1]: Detected architecture arm64.
Jan 23 23:30:58.348579 systemd[1]: Detected first boot.
Jan 23 23:30:58.348593 systemd[1]: Hostname set to .
Jan 23 23:30:58.348605 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 23 23:30:58.348615 zram_generator::config[1209]: No configuration found.
Jan 23 23:30:58.348631 kernel: NET: Registered PF_VSOCK protocol family
Jan 23 23:30:58.348642 systemd[1]: Populated /etc with preset unit settings.
Jan 23 23:30:58.348652 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 23 23:30:58.348663 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 23 23:30:58.348674 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 23:30:58.348687 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 23 23:30:58.348698 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 23 23:30:58.348709 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 23 23:30:58.348721 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 23 23:30:58.348732 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 23 23:30:58.348743 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 23 23:30:58.348754 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 23 23:30:58.348766 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 23 23:30:58.348777 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 23 23:30:58.348788 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 23:30:58.348800 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 23 23:30:58.348810 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 23 23:30:58.348822 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 23 23:30:58.348833 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 23 23:30:58.348845 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jan 23 23:30:58.348856 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 23 23:30:58.348869 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 23 23:30:58.348881 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 23 23:30:58.348894 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 23 23:30:58.348927 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 23 23:30:58.348941 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 23 23:30:58.348978 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 23 23:30:58.348991 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 23 23:30:58.349047 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Jan 23 23:30:58.349077 systemd[1]: Reached target slices.target - Slice Units.
Jan 23 23:30:58.349089 systemd[1]: Reached target swap.target - Swaps.
Jan 23 23:30:58.349106 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 23 23:30:58.349118 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 23 23:30:58.349129 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jan 23 23:30:58.349140 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 23 23:30:58.349151 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Jan 23 23:30:58.349162 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 23:30:58.349173 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Jan 23 23:30:58.349185 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Jan 23 23:30:58.349196 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 23 23:30:58.349210 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 23:30:58.349221 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 23 23:30:58.349232 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 23 23:30:58.349244 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 23 23:30:58.349258 systemd[1]: Mounting media.mount - External Media Directory...
Jan 23 23:30:58.349271 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 23 23:30:58.349282 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 23 23:30:58.349293 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 23 23:30:58.349304 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 23:30:58.349316 systemd[1]: Reached target machines.target - Containers.
Jan 23 23:30:58.349326 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 23 23:30:58.349338 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 23 23:30:58.349351 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 23 23:30:58.349362 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 23 23:30:58.349373 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 23 23:30:58.349384 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 23 23:30:58.349396 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 23 23:30:58.349408 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 23 23:30:58.349419 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 23 23:30:58.349431 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 23 23:30:58.349441 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 23 23:30:58.349453 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 23 23:30:58.349465 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 23 23:30:58.349475 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 23 23:30:58.349487 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 23 23:30:58.349497 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 23 23:30:58.349509 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 23 23:30:58.349520 kernel: fuse: init (API version 7.41)
Jan 23 23:30:58.349532 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 23 23:30:58.349545 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 23 23:30:58.349556 kernel: ACPI: bus type drm_connector registered
Jan 23 23:30:58.349567 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jan 23 23:30:58.349578 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 23 23:30:58.349590 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 23 23:30:58.349601 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 23 23:30:58.349611 systemd[1]: Mounted media.mount - External Media Directory.
Jan 23 23:30:58.349624 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 23 23:30:58.349635 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 23 23:30:58.349646 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 23 23:30:58.349689 systemd-journald[1273]: Collecting audit messages is enabled.
Jan 23 23:30:58.349728 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 23 23:30:58.349740 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 23:30:58.349752 systemd-journald[1273]: Journal started
Jan 23 23:30:58.349775 systemd-journald[1273]: Runtime Journal (/run/log/journal/dbd2950648b84aff8a192b4f432132d4) is 8M, max 319.5M, 311.5M free.
Jan 23 23:30:58.211000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Jan 23 23:30:58.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.298000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.300000 audit: BPF prog-id=14 op=UNLOAD
Jan 23 23:30:58.300000 audit: BPF prog-id=13 op=UNLOAD
Jan 23 23:30:58.301000 audit: BPF prog-id=15 op=LOAD
Jan 23 23:30:58.301000 audit: BPF prog-id=16 op=LOAD
Jan 23 23:30:58.301000 audit: BPF prog-id=17 op=LOAD
Jan 23 23:30:58.345000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Jan 23 23:30:58.345000 audit[1273]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=fffffe26e070 a2=4000 a3=0 items=0 ppid=1 pid=1273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 23:30:58.345000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Jan 23 23:30:58.348000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.121010 systemd[1]: Queued start job for default target multi-user.target.
Jan 23 23:30:58.144216 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Jan 23 23:30:58.144644 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 23:30:58.353826 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 23 23:30:58.352000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.354982 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 23 23:30:58.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.356757 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 23 23:30:58.356976 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 23 23:30:58.356000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.360176 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 23:30:58.360337 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 23 23:30:58.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.360000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.361595 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 23:30:58.361740 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 23 23:30:58.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.363253 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 23:30:58.363407 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 23 23:30:58.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.364764 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 23 23:30:58.364921 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 23 23:30:58.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.366290 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 23 23:30:58.366000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.367768 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 23:30:58.368000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.371125 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 23 23:30:58.372000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.374090 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 23 23:30:58.374000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.375504 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jan 23 23:30:58.375000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.387586 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 23 23:30:58.389630 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Jan 23 23:30:58.391735 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 23 23:30:58.393660 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 23 23:30:58.394686 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 23 23:30:58.394720 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 23 23:30:58.396469 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jan 23 23:30:58.397705 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 23:30:58.397810 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 23 23:30:58.406012 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 23 23:30:58.407864 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 23 23:30:58.408976 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 23:30:58.410198 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 23 23:30:58.411347 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 23:30:58.412426 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 23 23:30:58.417111 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 23 23:30:58.426333 systemd-journald[1273]: Time spent on flushing to /var/log/journal/dbd2950648b84aff8a192b4f432132d4 is 35.317ms for 1813 entries.
Jan 23 23:30:58.426333 systemd-journald[1273]: System Journal (/var/log/journal/dbd2950648b84aff8a192b4f432132d4) is 8M, max 588.1M, 580.1M free.
Jan 23 23:30:58.479269 systemd-journald[1273]: Received client request to flush runtime journal.
Jan 23 23:30:58.479325 kernel: loop1: detected capacity change from 0 to 211168
Jan 23 23:30:58.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.434000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.419188 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 23 23:30:58.421506 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 23 23:30:58.422750 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 23 23:30:58.425982 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 23 23:30:58.431820 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 23 23:30:58.434299 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 23 23:30:58.441116 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jan 23 23:30:58.451380 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 23 23:30:58.480056 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 23 23:30:58.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.482515 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 23 23:30:58.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 23:30:58.486000 audit: BPF prog-id=18 op=LOAD
Jan 23 23:30:58.486000 audit: BPF prog-id=19 op=LOAD
Jan 23 23:30:58.486000 audit: BPF prog-id=20 op=LOAD
Jan 23 23:30:58.488621 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Jan 23 23:30:58.489000 audit: BPF prog-id=21 op=LOAD
Jan 23 23:30:58.490990 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 23 23:30:58.494164 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 23 23:30:58.496000 audit: BPF prog-id=22 op=LOAD
Jan 23 23:30:58.497000 audit: BPF prog-id=23 op=LOAD
Jan 23 23:30:58.497000 audit: BPF prog-id=24 op=LOAD
Jan 23 23:30:58.499156 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Jan 23 23:30:58.502269 kernel: loop2: detected capacity change from 0 to 100192
Jan 23 23:30:58.500000 audit: BPF prog-id=25 op=LOAD
Jan 23 23:30:58.500000 audit: BPF prog-id=26 op=LOAD
Jan 23 23:30:58.500000 audit: BPF prog-id=27 op=LOAD
Jan 23 23:30:58.503097 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 23 23:30:58.506607 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jan 23 23:30:58.506000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.526035 systemd-tmpfiles[1347]: ACLs are not supported, ignoring. Jan 23 23:30:58.526054 systemd-tmpfiles[1347]: ACLs are not supported, ignoring. Jan 23 23:30:58.530109 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 23:30:58.531000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.537792 systemd-nsresourced[1348]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 23 23:30:58.538923 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 23 23:30:58.539000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.553993 kernel: loop3: detected capacity change from 0 to 45344 Jan 23 23:30:58.559379 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 23 23:30:58.560000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.602984 kernel: loop4: detected capacity change from 0 to 1648 Jan 23 23:30:58.610425 systemd-resolved[1346]: Positive Trust Anchors: Jan 23 23:30:58.610447 systemd-resolved[1346]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 23:30:58.610450 systemd-resolved[1346]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 23:30:58.610482 systemd-resolved[1346]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 23:30:58.612260 systemd-oomd[1345]: No swap; memory pressure usage will be degraded Jan 23 23:30:58.612853 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 23 23:30:58.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.619641 systemd-resolved[1346]: Using system hostname 'ci-4593-0-0-1-266c03b17e'. Jan 23 23:30:58.620910 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 23:30:58.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.622075 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Jan 23 23:30:58.647997 kernel: loop5: detected capacity change from 0 to 211168 Jan 23 23:30:58.669349 kernel: loop6: detected capacity change from 0 to 100192 Jan 23 23:30:58.684988 kernel: loop7: detected capacity change from 0 to 45344 Jan 23 23:30:58.705064 kernel: loop1: detected capacity change from 0 to 1648 Jan 23 23:30:58.708914 (sd-merge)[1372]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 23 23:30:58.711871 (sd-merge)[1372]: Merged extensions into '/usr'. Jan 23 23:30:58.715897 systemd[1]: Reload requested from client PID 1329 ('systemd-sysext') (unit systemd-sysext.service)... Jan 23 23:30:58.715921 systemd[1]: Reloading... Jan 23 23:30:58.775087 zram_generator::config[1402]: No configuration found. Jan 23 23:30:58.923604 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 23 23:30:58.923895 systemd[1]: Reloading finished in 207 ms. Jan 23 23:30:58.954979 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 23 23:30:58.955000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.956319 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 23 23:30:58.957000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:58.975198 systemd[1]: Starting ensure-sysext.service... Jan 23 23:30:58.976941 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 23 23:30:58.976000 audit: BPF prog-id=8 op=UNLOAD Jan 23 23:30:58.976000 audit: BPF prog-id=7 op=UNLOAD Jan 23 23:30:58.977000 audit: BPF prog-id=28 op=LOAD Jan 23 23:30:58.977000 audit: BPF prog-id=29 op=LOAD Jan 23 23:30:58.980138 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 23:30:58.980000 audit: BPF prog-id=30 op=LOAD Jan 23 23:30:58.980000 audit: BPF prog-id=22 op=UNLOAD Jan 23 23:30:58.980000 audit: BPF prog-id=31 op=LOAD Jan 23 23:30:58.980000 audit: BPF prog-id=32 op=LOAD Jan 23 23:30:58.980000 audit: BPF prog-id=23 op=UNLOAD Jan 23 23:30:58.980000 audit: BPF prog-id=24 op=UNLOAD Jan 23 23:30:58.981000 audit: BPF prog-id=33 op=LOAD Jan 23 23:30:58.981000 audit: BPF prog-id=25 op=UNLOAD Jan 23 23:30:58.982000 audit: BPF prog-id=34 op=LOAD Jan 23 23:30:58.982000 audit: BPF prog-id=35 op=LOAD Jan 23 23:30:58.982000 audit: BPF prog-id=26 op=UNLOAD Jan 23 23:30:58.982000 audit: BPF prog-id=27 op=UNLOAD Jan 23 23:30:58.983000 audit: BPF prog-id=36 op=LOAD Jan 23 23:30:58.983000 audit: BPF prog-id=18 op=UNLOAD Jan 23 23:30:58.983000 audit: BPF prog-id=37 op=LOAD Jan 23 23:30:58.983000 audit: BPF prog-id=38 op=LOAD Jan 23 23:30:58.983000 audit: BPF prog-id=19 op=UNLOAD Jan 23 23:30:58.983000 audit: BPF prog-id=20 op=UNLOAD Jan 23 23:30:58.984000 audit: BPF prog-id=39 op=LOAD Jan 23 23:30:58.984000 audit: BPF prog-id=15 op=UNLOAD Jan 23 23:30:58.984000 audit: BPF prog-id=40 op=LOAD Jan 23 23:30:58.984000 audit: BPF prog-id=41 op=LOAD Jan 23 23:30:58.984000 audit: BPF prog-id=16 op=UNLOAD Jan 23 23:30:58.984000 audit: BPF prog-id=17 op=UNLOAD Jan 23 23:30:58.984000 audit: BPF prog-id=42 op=LOAD Jan 23 23:30:58.984000 audit: BPF prog-id=21 op=UNLOAD Jan 23 23:30:58.989890 systemd[1]: Reload requested from client PID 1439 ('systemctl') (unit ensure-sysext.service)... Jan 23 23:30:58.989907 systemd[1]: Reloading... 
Jan 23 23:30:58.993302 systemd-tmpfiles[1440]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 23 23:30:58.993346 systemd-tmpfiles[1440]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 23 23:30:58.993597 systemd-tmpfiles[1440]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 23 23:30:58.994525 systemd-tmpfiles[1440]: ACLs are not supported, ignoring. Jan 23 23:30:58.994580 systemd-tmpfiles[1440]: ACLs are not supported, ignoring. Jan 23 23:30:59.001049 systemd-tmpfiles[1440]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 23:30:59.001062 systemd-tmpfiles[1440]: Skipping /boot Jan 23 23:30:59.003907 systemd-udevd[1441]: Using default interface naming scheme 'v257'. Jan 23 23:30:59.007205 systemd-tmpfiles[1440]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 23:30:59.007220 systemd-tmpfiles[1440]: Skipping /boot Jan 23 23:30:59.047101 zram_generator::config[1473]: No configuration found. Jan 23 23:30:59.132002 kernel: mousedev: PS/2 mouse device common for all mice Jan 23 23:30:59.199304 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 23 23:30:59.199384 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 23 23:30:59.199404 kernel: [drm] features: -context_init Jan 23 23:30:59.201343 kernel: [drm] number of scanouts: 1 Jan 23 23:30:59.201412 kernel: [drm] number of cap sets: 0 Jan 23 23:30:59.204010 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 23 23:30:59.208203 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 23:30:59.211995 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 23 23:30:59.272479 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jan 23 23:30:59.273916 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 23 23:30:59.274168 systemd[1]: Reloading finished in 283 ms. Jan 23 23:30:59.281896 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 23:30:59.282000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.283000 audit: BPF prog-id=43 op=LOAD Jan 23 23:30:59.283000 audit: BPF prog-id=44 op=LOAD Jan 23 23:30:59.283000 audit: BPF prog-id=45 op=LOAD Jan 23 23:30:59.284000 audit: BPF prog-id=46 op=LOAD Jan 23 23:30:59.284000 audit: BPF prog-id=47 op=LOAD Jan 23 23:30:59.284000 audit: BPF prog-id=48 op=LOAD Jan 23 23:30:59.285000 audit: BPF prog-id=30 op=UNLOAD Jan 23 23:30:59.285000 audit: BPF prog-id=31 op=UNLOAD Jan 23 23:30:59.285000 audit: BPF prog-id=32 op=UNLOAD Jan 23 23:30:59.285000 audit: BPF prog-id=36 op=UNLOAD Jan 23 23:30:59.285000 audit: BPF prog-id=37 op=UNLOAD Jan 23 23:30:59.285000 audit: BPF prog-id=38 op=UNLOAD Jan 23 23:30:59.286000 audit: BPF prog-id=49 op=LOAD Jan 23 23:30:59.286000 audit: BPF prog-id=42 op=UNLOAD Jan 23 23:30:59.287000 audit: BPF prog-id=50 op=LOAD Jan 23 23:30:59.287000 audit: BPF prog-id=51 op=LOAD Jan 23 23:30:59.287000 audit: BPF prog-id=28 op=UNLOAD Jan 23 23:30:59.287000 audit: BPF prog-id=29 op=UNLOAD Jan 23 23:30:59.287000 audit: BPF prog-id=52 op=LOAD Jan 23 23:30:59.287000 audit: BPF prog-id=39 op=UNLOAD Jan 23 23:30:59.287000 audit: BPF prog-id=53 op=LOAD Jan 23 23:30:59.287000 audit: BPF prog-id=54 op=LOAD Jan 23 23:30:59.287000 audit: BPF prog-id=40 op=UNLOAD Jan 23 23:30:59.287000 audit: BPF prog-id=41 op=UNLOAD Jan 23 23:30:59.288000 audit: BPF prog-id=55 op=LOAD Jan 23 23:30:59.288000 audit: BPF prog-id=33 op=UNLOAD Jan 23 23:30:59.288000 audit: BPF prog-id=56 op=LOAD Jan 23 
23:30:59.288000 audit: BPF prog-id=57 op=LOAD Jan 23 23:30:59.288000 audit: BPF prog-id=34 op=UNLOAD Jan 23 23:30:59.288000 audit: BPF prog-id=35 op=UNLOAD Jan 23 23:30:59.294074 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 23:30:59.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.325815 systemd[1]: Finished ensure-sysext.service. Jan 23 23:30:59.325000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.332392 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 23:30:59.334275 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 23 23:30:59.335565 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 23:30:59.336567 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 23:30:59.349254 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 23:30:59.351116 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 23:30:59.356250 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 23:30:59.358481 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 23 23:30:59.359875 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 23:30:59.360013 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 23 23:30:59.361652 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 23 23:30:59.363585 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 23 23:30:59.366163 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 23:30:59.367217 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 23 23:30:59.368000 audit: BPF prog-id=58 op=LOAD Jan 23 23:30:59.369902 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 23:30:59.371199 systemd[1]: Reached target time-set.target - System Time Set. Jan 23 23:30:59.374799 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 23 23:30:59.381112 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 23 23:30:59.381327 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 23 23:30:59.384381 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 23:30:59.390689 kernel: PTP clock support registered Jan 23 23:30:59.386000 audit[1578]: SYSTEM_BOOT pid=1578 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:30:59.388003 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 23:30:59.388338 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 23:30:59.389586 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 23:30:59.389748 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 23:30:59.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.390000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.391653 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 23:30:59.394260 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 23:30:59.394000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.398000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.398000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:30:59.397565 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 23:30:59.397939 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 23:30:59.399647 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 23 23:30:59.400986 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 23 23:30:59.401000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.401000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.403945 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 23 23:30:59.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.410416 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 23 23:30:59.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.416267 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 23:30:59.416500 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Jan 23 23:30:59.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:30:59.421254 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 23 23:30:59.442000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 23 23:30:59.442000 audit[1607]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe94d5b50 a2=420 a3=0 items=0 ppid=1561 pid=1607 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:30:59.442000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 23:30:59.444350 augenrules[1607]: No rules Jan 23 23:30:59.445920 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 23:30:59.446266 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 23:30:59.469740 systemd-networkd[1577]: lo: Link UP Jan 23 23:30:59.469750 systemd-networkd[1577]: lo: Gained carrier Jan 23 23:30:59.471088 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 23:30:59.471280 systemd-networkd[1577]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 23:30:59.471289 systemd-networkd[1577]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 23 23:30:59.471722 systemd-networkd[1577]: eth0: Link UP Jan 23 23:30:59.471924 systemd-networkd[1577]: eth0: Gained carrier Jan 23 23:30:59.471938 systemd-networkd[1577]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 23:30:59.474079 systemd[1]: Reached target network.target - Network. Jan 23 23:30:59.479303 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 23 23:30:59.482049 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 23 23:30:59.485006 systemd-networkd[1577]: eth0: DHCPv4 address 10.0.10.88/25, gateway 10.0.10.1 acquired from 10.0.10.1 Jan 23 23:30:59.485108 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 23:30:59.505464 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 23 23:30:59.548115 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 23 23:30:59.549715 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 23 23:31:00.135411 ldconfig[1574]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 23 23:31:00.141126 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 23 23:31:00.143806 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 23 23:31:00.169143 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 23 23:31:00.170614 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 23:31:00.171676 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 23 23:31:00.172836 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 23 23:31:00.174118 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 23 23:31:00.175108 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 23 23:31:00.176171 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 23 23:31:00.177292 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 23 23:31:00.178268 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 23 23:31:00.179310 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 23 23:31:00.179347 systemd[1]: Reached target paths.target - Path Units. Jan 23 23:31:00.180095 systemd[1]: Reached target timers.target - Timer Units. Jan 23 23:31:00.182036 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 23 23:31:00.184260 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 23 23:31:00.186903 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 23 23:31:00.188235 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 23 23:31:00.189309 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 23 23:31:00.194900 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 23 23:31:00.196202 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 23 23:31:00.197788 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 23 23:31:00.198846 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 23:31:00.199781 systemd[1]: Reached target basic.target - Basic System. 
Jan 23 23:31:00.200674 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 23 23:31:00.200707 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 23 23:31:00.203415 systemd[1]: Starting chronyd.service - NTP client/server... Jan 23 23:31:00.205059 systemd[1]: Starting containerd.service - containerd container runtime... Jan 23 23:31:00.207037 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 23 23:31:00.208754 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 23 23:31:00.213129 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 23 23:31:00.214999 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:00.215847 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 23 23:31:00.219170 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 23 23:31:00.220026 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 23 23:31:00.222477 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 23 23:31:00.224780 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 23 23:31:00.228196 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 23 23:31:00.231304 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 23 23:31:00.235826 jq[1635]: false Jan 23 23:31:00.236558 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 23 23:31:00.238023 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Jan 23 23:31:00.238522 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 23 23:31:00.239348 systemd[1]: Starting update-engine.service - Update Engine... Jan 23 23:31:00.243511 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 23 23:31:00.248470 extend-filesystems[1636]: Found /dev/vda6 Jan 23 23:31:00.249722 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 23 23:31:00.253848 chronyd[1628]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 23 23:31:00.255693 jq[1652]: true Jan 23 23:31:00.252273 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 23 23:31:00.252529 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 23 23:31:00.260697 extend-filesystems[1636]: Found /dev/vda9 Jan 23 23:31:00.260697 extend-filesystems[1636]: Checking size of /dev/vda9 Jan 23 23:31:00.257317 chronyd[1628]: Loaded seccomp filter (level 2) Jan 23 23:31:00.252798 systemd[1]: motdgen.service: Deactivated successfully. Jan 23 23:31:00.253035 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 23 23:31:00.256687 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 23 23:31:00.256942 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 23 23:31:00.258791 systemd[1]: Started chronyd.service - NTP client/server. 
Jan 23 23:31:00.276297 extend-filesystems[1636]: Resized partition /dev/vda9 Jan 23 23:31:00.279092 jq[1664]: true Jan 23 23:31:00.281967 tar[1658]: linux-arm64/LICENSE Jan 23 23:31:00.281967 tar[1658]: linux-arm64/helm Jan 23 23:31:00.290703 extend-filesystems[1681]: resize2fs 1.47.3 (8-Jul-2025) Jan 23 23:31:00.304965 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 23 23:31:00.305050 update_engine[1648]: I20260123 23:31:00.303793 1648 main.cc:92] Flatcar Update Engine starting Jan 23 23:31:00.350817 systemd-logind[1646]: New seat seat0. Jan 23 23:31:00.367918 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 23 23:31:00.379766 update_engine[1648]: I20260123 23:31:00.372094 1648 update_check_scheduler.cc:74] Next update check in 10m34s Jan 23 23:31:00.367533 dbus-daemon[1631]: [system] SELinux support is enabled Jan 23 23:31:00.371093 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 23 23:31:00.371123 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 23 23:31:00.372549 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 23 23:31:00.372565 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 23 23:31:00.373911 systemd[1]: Started update-engine.service - Update Engine. Jan 23 23:31:00.376089 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 23 23:31:00.381092 systemd-logind[1646]: Watching system buttons on /dev/input/event0 (Power Button) Jan 23 23:31:00.381120 systemd-logind[1646]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 23 23:31:00.381445 systemd[1]: Started systemd-logind.service - User Login Management. Jan 23 23:31:00.444941 locksmithd[1706]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 23 23:31:00.566898 containerd[1666]: time="2026-01-23T23:31:00Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 23 23:31:00.567993 containerd[1666]: time="2026-01-23T23:31:00.567904040Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 23 23:31:00.580081 containerd[1666]: time="2026-01-23T23:31:00.580025360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.16µs" Jan 23 23:31:00.580081 containerd[1666]: time="2026-01-23T23:31:00.580062320Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 23 23:31:00.607463 containerd[1666]: time="2026-01-23T23:31:00.580101960Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 23 23:31:00.607463 containerd[1666]: time="2026-01-23T23:31:00.580113800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 23 23:31:00.607642 containerd[1666]: time="2026-01-23T23:31:00.607510800Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 23 23:31:00.607642 containerd[1666]: time="2026-01-23T23:31:00.607559640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 
23:31:00.607642 containerd[1666]: time="2026-01-23T23:31:00.607623440Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 23:31:00.607642 containerd[1666]: time="2026-01-23T23:31:00.607635320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608113 containerd[1666]: time="2026-01-23T23:31:00.608079520Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608113 containerd[1666]: time="2026-01-23T23:31:00.608103560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608113 containerd[1666]: time="2026-01-23T23:31:00.608116440Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608198 containerd[1666]: time="2026-01-23T23:31:00.608124880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608326 containerd[1666]: time="2026-01-23T23:31:00.608278680Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608326 containerd[1666]: time="2026-01-23T23:31:00.608298000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608402 containerd[1666]: time="2026-01-23T23:31:00.608381600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 23 
23:31:00.608583 containerd[1666]: time="2026-01-23T23:31:00.608558760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608621 containerd[1666]: time="2026-01-23T23:31:00.608585400Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 23:31:00.608621 containerd[1666]: time="2026-01-23T23:31:00.608595720Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 23 23:31:00.608655 containerd[1666]: time="2026-01-23T23:31:00.608626760Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 23 23:31:00.608865 containerd[1666]: time="2026-01-23T23:31:00.608840800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 23 23:31:00.608965 containerd[1666]: time="2026-01-23T23:31:00.608909200Z" level=info msg="metadata content store policy set" policy=shared Jan 23 23:31:00.616694 bash[1699]: Updated "/home/core/.ssh/authorized_keys" Jan 23 23:31:00.620219 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 23 23:31:00.625599 systemd[1]: Starting sshkeys.service... Jan 23 23:31:00.657114 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 23 23:31:00.659987 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 23 23:31:00.679982 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:00.711288 containerd[1666]: time="2026-01-23T23:31:00.711141560Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 23 23:31:00.711288 containerd[1666]: time="2026-01-23T23:31:00.711219840Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 23 23:31:00.711838 containerd[1666]: time="2026-01-23T23:31:00.711810200Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 23 23:31:00.711876 containerd[1666]: time="2026-01-23T23:31:00.711837960Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 23 23:31:00.711876 containerd[1666]: time="2026-01-23T23:31:00.711864920Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 23 23:31:00.711924 containerd[1666]: time="2026-01-23T23:31:00.711895560Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 23 23:31:00.711924 containerd[1666]: time="2026-01-23T23:31:00.711909240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 23 23:31:00.711924 containerd[1666]: time="2026-01-23T23:31:00.711919640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 23 23:31:00.711992 containerd[1666]: time="2026-01-23T23:31:00.711931800Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 23 23:31:00.711992 containerd[1666]: time="2026-01-23T23:31:00.711943960Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 23 
23:31:00.711992 containerd[1666]: time="2026-01-23T23:31:00.711972640Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 23 23:31:00.711992 containerd[1666]: time="2026-01-23T23:31:00.711985880Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 23 23:31:00.712062 containerd[1666]: time="2026-01-23T23:31:00.711995520Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 23 23:31:00.712062 containerd[1666]: time="2026-01-23T23:31:00.712007920Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 23 23:31:00.712266 containerd[1666]: time="2026-01-23T23:31:00.712243520Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 23 23:31:00.712297 containerd[1666]: time="2026-01-23T23:31:00.712274880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 23 23:31:00.712316 containerd[1666]: time="2026-01-23T23:31:00.712296520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 23 23:31:00.712316 containerd[1666]: time="2026-01-23T23:31:00.712307520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 23 23:31:00.712357 containerd[1666]: time="2026-01-23T23:31:00.712318000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 23 23:31:00.712357 containerd[1666]: time="2026-01-23T23:31:00.712328040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 23 23:31:00.712357 containerd[1666]: time="2026-01-23T23:31:00.712340480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 23 23:31:00.712357 containerd[1666]: 
time="2026-01-23T23:31:00.712356800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 23 23:31:00.712449 containerd[1666]: time="2026-01-23T23:31:00.712381920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 23 23:31:00.712449 containerd[1666]: time="2026-01-23T23:31:00.712393040Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 23 23:31:00.712449 containerd[1666]: time="2026-01-23T23:31:00.712403120Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 23 23:31:00.712449 containerd[1666]: time="2026-01-23T23:31:00.712429240Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 23 23:31:00.712521 containerd[1666]: time="2026-01-23T23:31:00.712479120Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 23 23:31:00.712521 containerd[1666]: time="2026-01-23T23:31:00.712494280Z" level=info msg="Start snapshots syncer" Jan 23 23:31:00.712555 containerd[1666]: time="2026-01-23T23:31:00.712533000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 23 23:31:00.713037 containerd[1666]: time="2026-01-23T23:31:00.712994040Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 23 23:31:00.713154 containerd[1666]: time="2026-01-23T23:31:00.713053840Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 23 23:31:00.713178 containerd[1666]: 
time="2026-01-23T23:31:00.713112160Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 23 23:31:00.713615 containerd[1666]: time="2026-01-23T23:31:00.713561080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 23 23:31:00.713644 containerd[1666]: time="2026-01-23T23:31:00.713616160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 23 23:31:00.713644 containerd[1666]: time="2026-01-23T23:31:00.713632160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 23 23:31:00.713689 containerd[1666]: time="2026-01-23T23:31:00.713652560Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 23 23:31:00.713689 containerd[1666]: time="2026-01-23T23:31:00.713671360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 23 23:31:00.713725 containerd[1666]: time="2026-01-23T23:31:00.713688000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 23 23:31:00.713725 containerd[1666]: time="2026-01-23T23:31:00.713701000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 23 23:31:00.713725 containerd[1666]: time="2026-01-23T23:31:00.713716800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 23 23:31:00.713776 containerd[1666]: time="2026-01-23T23:31:00.713757480Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 23 23:31:00.713820 containerd[1666]: time="2026-01-23T23:31:00.713805760Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 23:31:00.713843 containerd[1666]: 
time="2026-01-23T23:31:00.713827400Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 23:31:00.713863 containerd[1666]: time="2026-01-23T23:31:00.713841800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 23:31:00.713863 containerd[1666]: time="2026-01-23T23:31:00.713855840Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 23:31:00.713899 containerd[1666]: time="2026-01-23T23:31:00.713867680Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 23 23:31:00.713937 containerd[1666]: time="2026-01-23T23:31:00.713882800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 23 23:31:00.713982 containerd[1666]: time="2026-01-23T23:31:00.713947840Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 23 23:31:00.714090 containerd[1666]: time="2026-01-23T23:31:00.714078960Z" level=info msg="runtime interface created" Jan 23 23:31:00.714117 containerd[1666]: time="2026-01-23T23:31:00.714089360Z" level=info msg="created NRI interface" Jan 23 23:31:00.714117 containerd[1666]: time="2026-01-23T23:31:00.714101040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 23 23:31:00.714153 containerd[1666]: time="2026-01-23T23:31:00.714121000Z" level=info msg="Connect containerd service" Jan 23 23:31:00.714170 containerd[1666]: time="2026-01-23T23:31:00.714160000Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 23 23:31:00.715093 containerd[1666]: time="2026-01-23T23:31:00.715016120Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 23:31:00.802773 containerd[1666]: time="2026-01-23T23:31:00.802710160Z" level=info msg="Start subscribing containerd event" Jan 23 23:31:00.803219 containerd[1666]: time="2026-01-23T23:31:00.803198640Z" level=info msg="Start recovering state" Jan 23 23:31:00.803627 containerd[1666]: time="2026-01-23T23:31:00.803592320Z" level=info msg="Start event monitor" Jan 23 23:31:00.803811 containerd[1666]: time="2026-01-23T23:31:00.803796080Z" level=info msg="Start cni network conf syncer for default" Jan 23 23:31:00.803997 containerd[1666]: time="2026-01-23T23:31:00.803981560Z" level=info msg="Start streaming server" Jan 23 23:31:00.804352 containerd[1666]: time="2026-01-23T23:31:00.804333640Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 23 23:31:00.804498 containerd[1666]: time="2026-01-23T23:31:00.804482760Z" level=info msg="runtime interface starting up..." Jan 23 23:31:00.804557 containerd[1666]: time="2026-01-23T23:31:00.804535240Z" level=info msg="starting plugins..." Jan 23 23:31:00.804728 containerd[1666]: time="2026-01-23T23:31:00.804674680Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 23 23:31:00.805173 containerd[1666]: time="2026-01-23T23:31:00.805152360Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 23 23:31:00.805394 containerd[1666]: time="2026-01-23T23:31:00.805378000Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 23 23:31:00.806003 containerd[1666]: time="2026-01-23T23:31:00.805939560Z" level=info msg="containerd successfully booted in 0.330794s" Jan 23 23:31:00.806126 systemd[1]: Started containerd.service - containerd container runtime. Jan 23 23:31:00.900913 tar[1658]: linux-arm64/README.md Jan 23 23:31:00.917017 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Jan 23 23:31:00.954464 sshd_keygen[1659]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 23:31:00.973611 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 23:31:00.976893 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 23 23:31:01.001695 systemd[1]: issuegen.service: Deactivated successfully. Jan 23 23:31:01.001999 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 23:31:01.004929 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 23 23:31:01.028862 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 23 23:31:01.032445 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 23 23:31:01.034991 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 23 23:31:01.036279 systemd[1]: Reached target getty.target - Login Prompts. Jan 23 23:31:01.105106 systemd-networkd[1577]: eth0: Gained IPv6LL Jan 23 23:31:01.107488 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 23 23:31:01.109431 systemd[1]: Reached target network-online.target - Network is Online. Jan 23 23:31:01.111918 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:01.114206 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 23 23:31:01.119985 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 23 23:31:01.227995 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:01.279198 extend-filesystems[1681]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 23 23:31:01.279198 extend-filesystems[1681]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 23 23:31:01.279198 extend-filesystems[1681]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. 
Jan 23 23:31:01.285006 extend-filesystems[1636]: Resized filesystem in /dev/vda9 Jan 23 23:31:01.280349 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 23 23:31:01.288054 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 23 23:31:01.290315 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 23 23:31:01.689005 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:02.152581 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:02.165248 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 23:31:02.772299 kubelet[1771]: E0123 23:31:02.772238 1771 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 23:31:02.774855 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 23:31:02.775013 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 23:31:02.775589 systemd[1]: kubelet.service: Consumed 797ms CPU time, 258.8M memory peak. 
Jan 23 23:31:03.238975 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:03.702031 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:07.246995 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:07.253308 coreos-metadata[1630]: Jan 23 23:31:07.253 WARN failed to locate config-drive, using the metadata service API instead Jan 23 23:31:07.271784 coreos-metadata[1630]: Jan 23 23:31:07.271 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 23 23:31:07.717065 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 23:31:07.722882 coreos-metadata[1714]: Jan 23 23:31:07.722 WARN failed to locate config-drive, using the metadata service API instead Jan 23 23:31:07.736173 coreos-metadata[1714]: Jan 23 23:31:07.736 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 23 23:31:08.678113 coreos-metadata[1714]: Jan 23 23:31:08.678 INFO Fetch successful Jan 23 23:31:08.678113 coreos-metadata[1714]: Jan 23 23:31:08.678 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 23 23:31:09.272935 coreos-metadata[1630]: Jan 23 23:31:09.272 INFO Fetch successful Jan 23 23:31:09.272935 coreos-metadata[1630]: Jan 23 23:31:09.272 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 23:31:09.929990 coreos-metadata[1714]: Jan 23 23:31:09.929 INFO Fetch successful Jan 23 23:31:09.932251 unknown[1714]: wrote ssh authorized keys file for user: core Jan 23 23:31:09.941173 coreos-metadata[1630]: Jan 23 23:31:09.941 INFO Fetch successful Jan 23 23:31:09.941173 coreos-metadata[1630]: Jan 23 23:31:09.941 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 23 23:31:09.965105 update-ssh-keys[1790]: Updated "/home/core/.ssh/authorized_keys" Jan 23 23:31:09.967097 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Jan 23 23:31:09.969215 systemd[1]: Finished sshkeys.service. Jan 23 23:31:10.145975 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 23 23:31:10.147125 systemd[1]: Started sshd@0-10.0.10.88:22-68.220.241.50:35672.service - OpenSSH per-connection server daemon (68.220.241.50:35672). Jan 23 23:31:10.579087 coreos-metadata[1630]: Jan 23 23:31:10.578 INFO Fetch successful Jan 23 23:31:10.579087 coreos-metadata[1630]: Jan 23 23:31:10.579 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 23 23:31:10.697814 sshd[1794]: Accepted publickey for core from 68.220.241.50 port 35672 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:10.699886 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:10.706177 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 23 23:31:10.707099 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 23 23:31:10.711051 systemd-logind[1646]: New session 1 of user core. Jan 23 23:31:10.729509 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 23 23:31:10.731898 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 23 23:31:10.753148 (systemd)[1800]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:10.755739 systemd-logind[1646]: New session 2 of user core. Jan 23 23:31:10.866998 systemd[1800]: Queued start job for default target default.target. Jan 23 23:31:10.874043 systemd[1800]: Created slice app.slice - User Application Slice. Jan 23 23:31:10.874076 systemd[1800]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 23 23:31:10.874088 systemd[1800]: Reached target paths.target - Paths. Jan 23 23:31:10.874138 systemd[1800]: Reached target timers.target - Timers. 
Jan 23 23:31:10.875317 systemd[1800]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 23 23:31:10.876035 systemd[1800]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 23 23:31:10.885474 systemd[1800]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 23 23:31:10.885546 systemd[1800]: Reached target sockets.target - Sockets. Jan 23 23:31:10.887272 systemd[1800]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 23 23:31:10.887436 systemd[1800]: Reached target basic.target - Basic System. Jan 23 23:31:10.887482 systemd[1800]: Reached target default.target - Main User Target. Jan 23 23:31:10.887507 systemd[1800]: Startup finished in 126ms. Jan 23 23:31:10.887653 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 23 23:31:10.902420 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 23 23:31:11.199469 systemd[1]: Started sshd@1-10.0.10.88:22-68.220.241.50:35678.service - OpenSSH per-connection server daemon (68.220.241.50:35678). Jan 23 23:31:11.221989 coreos-metadata[1630]: Jan 23 23:31:11.220 INFO Fetch successful Jan 23 23:31:11.221989 coreos-metadata[1630]: Jan 23 23:31:11.220 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 23 23:31:11.710949 sshd[1814]: Accepted publickey for core from 68.220.241.50 port 35678 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:11.712465 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:11.716630 systemd-logind[1646]: New session 3 of user core. Jan 23 23:31:11.727177 systemd[1]: Started session-3.scope - Session 3 of User core. 
Jan 23 23:31:11.872571 coreos-metadata[1630]: Jan 23 23:31:11.872 INFO Fetch successful Jan 23 23:31:11.872571 coreos-metadata[1630]: Jan 23 23:31:11.872 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 23 23:31:11.995635 sshd[1818]: Connection closed by 68.220.241.50 port 35678 Jan 23 23:31:11.995860 sshd-session[1814]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:11.999789 systemd[1]: sshd@1-10.0.10.88:22-68.220.241.50:35678.service: Deactivated successfully. Jan 23 23:31:12.003364 systemd[1]: session-3.scope: Deactivated successfully. Jan 23 23:31:12.004161 systemd-logind[1646]: Session 3 logged out. Waiting for processes to exit. Jan 23 23:31:12.005207 systemd-logind[1646]: Removed session 3. Jan 23 23:31:12.105229 systemd[1]: Started sshd@2-10.0.10.88:22-68.220.241.50:35686.service - OpenSSH per-connection server daemon (68.220.241.50:35686). Jan 23 23:31:12.508894 coreos-metadata[1630]: Jan 23 23:31:12.508 INFO Fetch successful Jan 23 23:31:12.532752 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 23 23:31:12.533210 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 23 23:31:12.533342 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 23 23:31:12.533686 systemd[1]: Startup finished in 2.692s (kernel) + 14.331s (initrd) + 15.043s (userspace) = 32.067s. Jan 23 23:31:12.641670 sshd[1824]: Accepted publickey for core from 68.220.241.50 port 35686 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:12.643037 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:12.647832 systemd-logind[1646]: New session 4 of user core. Jan 23 23:31:12.655463 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 23 23:31:12.837404 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Jan 23 23:31:12.839487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:12.933539 sshd[1833]: Connection closed by 68.220.241.50 port 35686 Jan 23 23:31:12.934142 sshd-session[1824]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:12.938359 systemd[1]: sshd@2-10.0.10.88:22-68.220.241.50:35686.service: Deactivated successfully. Jan 23 23:31:12.940374 systemd[1]: session-4.scope: Deactivated successfully. Jan 23 23:31:12.942600 systemd-logind[1646]: Session 4 logged out. Waiting for processes to exit. Jan 23 23:31:12.944551 systemd-logind[1646]: Removed session 4. Jan 23 23:31:12.980619 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:12.984873 (kubelet)[1846]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 23:31:13.459634 kubelet[1846]: E0123 23:31:13.459578 1846 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 23:31:13.462455 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 23:31:13.462585 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 23:31:13.462904 systemd[1]: kubelet.service: Consumed 149ms CPU time, 106.5M memory peak. Jan 23 23:31:23.043635 systemd[1]: Started sshd@3-10.0.10.88:22-68.220.241.50:52614.service - OpenSSH per-connection server daemon (68.220.241.50:52614). 
Jan 23 23:31:23.546720 sshd[1856]: Accepted publickey for core from 68.220.241.50 port 52614 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:23.548048 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:23.549141 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 23:31:23.550943 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:23.557043 systemd-logind[1646]: New session 5 of user core. Jan 23 23:31:23.566111 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 23 23:31:23.694107 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:23.713257 (kubelet)[1869]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 23:31:23.832074 sshd[1863]: Connection closed by 68.220.241.50 port 52614 Jan 23 23:31:23.832001 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:23.836599 systemd[1]: sshd@3-10.0.10.88:22-68.220.241.50:52614.service: Deactivated successfully. Jan 23 23:31:23.839467 systemd[1]: session-5.scope: Deactivated successfully. Jan 23 23:31:23.841023 systemd-logind[1646]: Session 5 logged out. Waiting for processes to exit. Jan 23 23:31:23.842043 systemd-logind[1646]: Removed session 5. Jan 23 23:31:23.942511 systemd[1]: Started sshd@4-10.0.10.88:22-68.220.241.50:52630.service - OpenSSH per-connection server daemon (68.220.241.50:52630). 
Jan 23 23:31:24.039333 chronyd[1628]: Selected source PHC0 Jan 23 23:31:24.159037 kubelet[1869]: E0123 23:31:24.158989 1869 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 23:31:24.161546 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 23:31:24.161851 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 23:31:24.164067 systemd[1]: kubelet.service: Consumed 235ms CPU time, 106.1M memory peak. Jan 23 23:31:24.435256 sshd[1881]: Accepted publickey for core from 68.220.241.50 port 52630 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:24.436505 sshd-session[1881]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:24.440637 systemd-logind[1646]: New session 6 of user core. Jan 23 23:31:24.459389 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 23 23:31:24.697527 sshd[1886]: Connection closed by 68.220.241.50 port 52630 Jan 23 23:31:24.697842 sshd-session[1881]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:24.701620 systemd[1]: sshd@4-10.0.10.88:22-68.220.241.50:52630.service: Deactivated successfully. Jan 23 23:31:24.703504 systemd[1]: session-6.scope: Deactivated successfully. Jan 23 23:31:24.704975 systemd-logind[1646]: Session 6 logged out. Waiting for processes to exit. Jan 23 23:31:24.705764 systemd-logind[1646]: Removed session 6. Jan 23 23:31:24.798319 systemd[1]: Started sshd@5-10.0.10.88:22-68.220.241.50:52638.service - OpenSSH per-connection server daemon (68.220.241.50:52638). 
Jan 23 23:31:25.283334 sshd[1892]: Accepted publickey for core from 68.220.241.50 port 52638 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:25.284611 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:25.288790 systemd-logind[1646]: New session 7 of user core. Jan 23 23:31:25.297292 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 23 23:31:25.544375 sshd[1896]: Connection closed by 68.220.241.50 port 52638 Jan 23 23:31:25.544555 sshd-session[1892]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:25.548257 systemd[1]: sshd@5-10.0.10.88:22-68.220.241.50:52638.service: Deactivated successfully. Jan 23 23:31:25.549741 systemd[1]: session-7.scope: Deactivated successfully. Jan 23 23:31:25.551667 systemd-logind[1646]: Session 7 logged out. Waiting for processes to exit. Jan 23 23:31:25.552688 systemd-logind[1646]: Removed session 7. Jan 23 23:31:25.640897 systemd[1]: Started sshd@6-10.0.10.88:22-68.220.241.50:52648.service - OpenSSH per-connection server daemon (68.220.241.50:52648). Jan 23 23:31:26.130970 sshd[1902]: Accepted publickey for core from 68.220.241.50 port 52648 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:26.132081 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:26.135491 systemd-logind[1646]: New session 8 of user core. Jan 23 23:31:26.148186 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 23 23:31:26.319946 sudo[1907]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 23 23:31:26.320220 sudo[1907]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 23:31:26.332075 sudo[1907]: pam_unix(sudo:session): session closed for user root Jan 23 23:31:26.420124 sshd[1906]: Connection closed by 68.220.241.50 port 52648 Jan 23 23:31:26.420059 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:26.424103 systemd[1]: sshd@6-10.0.10.88:22-68.220.241.50:52648.service: Deactivated successfully. Jan 23 23:31:26.425723 systemd[1]: session-8.scope: Deactivated successfully. Jan 23 23:31:26.426449 systemd-logind[1646]: Session 8 logged out. Waiting for processes to exit. Jan 23 23:31:26.427472 systemd-logind[1646]: Removed session 8. Jan 23 23:31:26.520201 systemd[1]: Started sshd@7-10.0.10.88:22-68.220.241.50:52656.service - OpenSSH per-connection server daemon (68.220.241.50:52656). Jan 23 23:31:27.020007 sshd[1914]: Accepted publickey for core from 68.220.241.50 port 52656 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:27.021272 sshd-session[1914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:27.024821 systemd-logind[1646]: New session 9 of user core. Jan 23 23:31:27.032332 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 23 23:31:27.205638 sudo[1920]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 23:31:27.205887 sudo[1920]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 23:31:27.208641 sudo[1920]: pam_unix(sudo:session): session closed for user root Jan 23 23:31:27.214193 sudo[1919]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 23:31:27.214436 sudo[1919]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 23:31:27.220841 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 23:31:27.253000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 23:31:27.254668 augenrules[1944]: No rules Jan 23 23:31:27.255155 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 23 23:31:27.255202 kernel: audit: type=1305 audit(1769211087.253:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 23:31:27.255697 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 23:31:27.256081 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 23 23:31:27.253000 audit[1944]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff3f691d0 a2=420 a3=0 items=0 ppid=1925 pid=1944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:27.259232 sudo[1919]: pam_unix(sudo:session): session closed for user root Jan 23 23:31:27.260461 kernel: audit: type=1300 audit(1769211087.253:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff3f691d0 a2=420 a3=0 items=0 ppid=1925 pid=1944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:27.260512 kernel: audit: type=1327 audit(1769211087.253:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 23:31:27.253000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 23:31:27.255000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.263907 kernel: audit: type=1130 audit(1769211087.255:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.263943 kernel: audit: type=1131 audit(1769211087.255:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:31:27.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.258000 audit[1919]: USER_END pid=1919 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.269520 kernel: audit: type=1106 audit(1769211087.258:233): pid=1919 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.269563 kernel: audit: type=1104 audit(1769211087.258:234): pid=1919 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.258000 audit[1919]: CRED_DISP pid=1919 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.349505 sshd[1918]: Connection closed by 68.220.241.50 port 52656 Jan 23 23:31:27.350084 sshd-session[1914]: pam_unix(sshd:session): session closed for user core Jan 23 23:31:27.351000 audit[1914]: USER_END pid=1914 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.355015 systemd[1]: sshd@7-10.0.10.88:22-68.220.241.50:52656.service: Deactivated successfully. 
Jan 23 23:31:27.351000 audit[1914]: CRED_DISP pid=1914 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.356545 systemd[1]: session-9.scope: Deactivated successfully. Jan 23 23:31:27.357259 systemd-logind[1646]: Session 9 logged out. Waiting for processes to exit. Jan 23 23:31:27.358250 kernel: audit: type=1106 audit(1769211087.351:235): pid=1914 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.358321 kernel: audit: type=1104 audit(1769211087.351:236): pid=1914 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.358341 kernel: audit: type=1131 audit(1769211087.354:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.10.88:22-68.220.241.50:52656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.354000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.10.88:22-68.220.241.50:52656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.358812 systemd-logind[1646]: Removed session 9. 
Jan 23 23:31:27.450000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.10.88:22-68.220.241.50:52664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:27.451185 systemd[1]: Started sshd@8-10.0.10.88:22-68.220.241.50:52664.service - OpenSSH per-connection server daemon (68.220.241.50:52664). Jan 23 23:31:27.919000 audit[1953]: USER_ACCT pid=1953 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.921097 sshd[1953]: Accepted publickey for core from 68.220.241.50 port 52664 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:31:27.920000 audit[1953]: CRED_ACQ pid=1953 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.920000 audit[1953]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd73849e0 a2=3 a3=0 items=0 ppid=1 pid=1953 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:27.920000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:31:27.922287 sshd-session[1953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:31:27.926752 systemd-logind[1646]: New session 10 of user core. Jan 23 23:31:27.935253 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 23 23:31:27.936000 audit[1953]: USER_START pid=1953 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:27.937000 audit[1957]: CRED_ACQ pid=1957 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:31:28.100000 audit[1958]: USER_ACCT pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:28.100000 audit[1958]: CRED_REFR pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:28.101635 sudo[1958]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 23:31:28.100000 audit[1958]: USER_START pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:31:28.101899 sudo[1958]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 23:31:28.409826 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 23 23:31:28.432362 (dockerd)[1979]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 23:31:28.672156 dockerd[1979]: time="2026-01-23T23:31:28.672025860Z" level=info msg="Starting up" Jan 23 23:31:28.672919 dockerd[1979]: time="2026-01-23T23:31:28.672895109Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 23:31:28.682881 dockerd[1979]: time="2026-01-23T23:31:28.682843564Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 23:31:28.730176 dockerd[1979]: time="2026-01-23T23:31:28.730132859Z" level=info msg="Loading containers: start." Jan 23 23:31:28.739994 kernel: Initializing XFRM netlink socket Jan 23 23:31:28.784000 audit[2031]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.784000 audit[2031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffe7eaf310 a2=0 a3=0 items=0 ppid=1979 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.784000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 23:31:28.786000 audit[2033]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.786000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffee76e2c0 a2=0 a3=0 items=0 ppid=1979 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
23:31:28.786000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 23:31:28.788000 audit[2035]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.788000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe02541e0 a2=0 a3=0 items=0 ppid=1979 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.788000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 23:31:28.789000 audit[2037]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.789000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe1242740 a2=0 a3=0 items=0 ppid=1979 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.789000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 23:31:28.791000 audit[2039]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.791000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffbb30d80 a2=0 a3=0 items=0 ppid=1979 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.791000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 23:31:28.793000 audit[2041]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.793000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe78efdc0 a2=0 a3=0 items=0 ppid=1979 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.793000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 23:31:28.795000 audit[2043]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.795000 audit[2043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffcbaaa9f0 a2=0 a3=0 items=0 ppid=1979 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.795000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 23:31:28.797000 audit[2045]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.797000 audit[2045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc8669260 a2=0 a3=0 items=0 ppid=1979 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
23:31:28.797000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 23:31:28.821000 audit[2048]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.821000 audit[2048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffc5f47750 a2=0 a3=0 items=0 ppid=1979 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.821000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 23 23:31:28.823000 audit[2050]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.823000 audit[2050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe3190640 a2=0 a3=0 items=0 ppid=1979 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.823000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 23:31:28.825000 audit[2052]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.825000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe4782810 a2=0 a3=0 items=0 ppid=1979 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.825000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 23:31:28.827000 audit[2054]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.827000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffffa9ec110 a2=0 a3=0 items=0 ppid=1979 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.827000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 23:31:28.829000 audit[2056]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.829000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffe948e640 a2=0 a3=0 items=0 ppid=1979 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.829000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 23:31:28.867000 audit[2086]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.867000 audit[2086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffc2ff20e0 a2=0 a3=0 items=0 ppid=1979 pid=2086 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.867000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 23:31:28.869000 audit[2088]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.869000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe55a7310 a2=0 a3=0 items=0 ppid=1979 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.869000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 23:31:28.870000 audit[2090]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.870000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff928c440 a2=0 a3=0 items=0 ppid=1979 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.870000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 23:31:28.872000 audit[2092]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.872000 audit[2092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc413cd80 a2=0 a3=0 items=0 ppid=1979 pid=2092 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.872000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 23:31:28.874000 audit[2094]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.874000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffefdb6f0 a2=0 a3=0 items=0 ppid=1979 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.874000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 23:31:28.876000 audit[2096]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.876000 audit[2096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffffede1c70 a2=0 a3=0 items=0 ppid=1979 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.876000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 23:31:28.877000 audit[2098]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.877000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd550ba70 a2=0 a3=0 items=0 ppid=1979 pid=2098 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.877000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 23:31:28.879000 audit[2100]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.879000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe4c29290 a2=0 a3=0 items=0 ppid=1979 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.879000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 23:31:28.881000 audit[2102]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.881000 audit[2102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffdd54d000 a2=0 a3=0 items=0 ppid=1979 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.881000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 23 23:31:28.883000 audit[2104]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2104 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.883000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffcda85d20 a2=0 a3=0 items=0 ppid=1979 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.883000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 23:31:28.885000 audit[2106]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.885000 audit[2106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffdbb31ee0 a2=0 a3=0 items=0 ppid=1979 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.885000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 23:31:28.887000 audit[2108]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.887000 audit[2108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffebc506d0 a2=0 a3=0 items=0 ppid=1979 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.887000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 23:31:28.888000 audit[2110]: NETFILTER_CFG 
table=filter:27 family=10 entries=1 op=nft_register_rule pid=2110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.888000 audit[2110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffebc4e150 a2=0 a3=0 items=0 ppid=1979 pid=2110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.888000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 23:31:28.893000 audit[2115]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.893000 audit[2115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffa5f5ae0 a2=0 a3=0 items=0 ppid=1979 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.893000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 23:31:28.895000 audit[2117]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.895000 audit[2117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffe504bdb0 a2=0 a3=0 items=0 ppid=1979 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.895000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 23:31:28.897000 audit[2119]: NETFILTER_CFG 
table=filter:30 family=2 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.897000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffde7ebe90 a2=0 a3=0 items=0 ppid=1979 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.897000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 23:31:28.898000 audit[2121]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.898000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffc668ea0 a2=0 a3=0 items=0 ppid=1979 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.898000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 23:31:28.900000 audit[2123]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.900000 audit[2123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffff5d9ca40 a2=0 a3=0 items=0 ppid=1979 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.900000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 23:31:28.902000 audit[2125]: NETFILTER_CFG table=filter:33 
family=10 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:28.902000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffcfac8590 a2=0 a3=0 items=0 ppid=1979 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.902000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 23:31:28.917000 audit[2129]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.917000 audit[2129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffd2e6ccf0 a2=0 a3=0 items=0 ppid=1979 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.917000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 23 23:31:28.919000 audit[2133]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.919000 audit[2133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffefc57d40 a2=0 a3=0 items=0 ppid=1979 pid=2133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.919000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 23 23:31:28.927000 audit[2141]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.927000 audit[2141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffe060d260 a2=0 a3=0 items=0 ppid=1979 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.927000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 23 23:31:28.937000 audit[2147]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.937000 audit[2147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffdc043ec0 a2=0 a3=0 items=0 ppid=1979 pid=2147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.937000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 23 23:31:28.939000 audit[2149]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.939000 audit[2149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=fffffb5895c0 a2=0 a3=0 items=0 ppid=1979 pid=2149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.939000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 23 23:31:28.941000 audit[2151]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.941000 audit[2151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffd343bbd0 a2=0 a3=0 items=0 ppid=1979 pid=2151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.941000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 23 23:31:28.943000 audit[2153]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:28.943000 audit[2153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffdd7a4660 a2=0 a3=0 items=0 ppid=1979 pid=2153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.943000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 23:31:28.946000 audit[2155]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2155 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Jan 23 23:31:28.946000 audit[2155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffc24a5fd0 a2=0 a3=0 items=0 ppid=1979 pid=2155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:28.946000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 23 23:31:28.948672 systemd-networkd[1577]: docker0: Link UP Jan 23 23:31:28.957803 dockerd[1979]: time="2026-01-23T23:31:28.957753685Z" level=info msg="Loading containers: done." Jan 23 23:31:28.984533 dockerd[1979]: time="2026-01-23T23:31:28.984443062Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 23:31:28.984533 dockerd[1979]: time="2026-01-23T23:31:28.984537943Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 23:31:28.984909 dockerd[1979]: time="2026-01-23T23:31:28.984867186Z" level=info msg="Initializing buildkit" Jan 23 23:31:29.009448 dockerd[1979]: time="2026-01-23T23:31:29.009400102Z" level=info msg="Completed buildkit initialization" Jan 23 23:31:29.014078 dockerd[1979]: time="2026-01-23T23:31:29.014002946Z" level=info msg="Daemon has completed initialization" Jan 23 23:31:29.014301 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 23 23:31:29.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:31:29.014926 dockerd[1979]: time="2026-01-23T23:31:29.014145907Z" level=info msg="API listen on /run/docker.sock" Jan 23 23:31:29.695477 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck296904181-merged.mount: Deactivated successfully. Jan 23 23:31:30.174264 containerd[1666]: time="2026-01-23T23:31:30.174225368Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 23 23:31:30.989882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1534807399.mount: Deactivated successfully. Jan 23 23:31:31.552578 containerd[1666]: time="2026-01-23T23:31:31.552520091Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:31.553737 containerd[1666]: time="2026-01-23T23:31:31.553689855Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=25791094" Jan 23 23:31:31.554825 containerd[1666]: time="2026-01-23T23:31:31.554795698Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:31.558301 containerd[1666]: time="2026-01-23T23:31:31.557799467Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:31.558868 containerd[1666]: time="2026-01-23T23:31:31.558831190Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.384559862s" Jan 23 23:31:31.558868 containerd[1666]: 
time="2026-01-23T23:31:31.558866830Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Jan 23 23:31:31.560360 containerd[1666]: time="2026-01-23T23:31:31.560320155Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 23 23:31:32.601979 containerd[1666]: time="2026-01-23T23:31:32.601912399Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:32.603127 containerd[1666]: time="2026-01-23T23:31:32.603076724Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23544927" Jan 23 23:31:32.604133 containerd[1666]: time="2026-01-23T23:31:32.604096647Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:32.607491 containerd[1666]: time="2026-01-23T23:31:32.607450540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:32.609108 containerd[1666]: time="2026-01-23T23:31:32.609070026Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.048622751s" Jan 23 23:31:32.609108 containerd[1666]: time="2026-01-23T23:31:32.609105066Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image 
reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Jan 23 23:31:32.609545 containerd[1666]: time="2026-01-23T23:31:32.609484188Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 23 23:31:33.754193 containerd[1666]: time="2026-01-23T23:31:33.754147828Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:33.755555 containerd[1666]: time="2026-01-23T23:31:33.755362951Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18289931" Jan 23 23:31:33.756690 containerd[1666]: time="2026-01-23T23:31:33.756654795Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:33.759291 containerd[1666]: time="2026-01-23T23:31:33.759254963Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:33.760226 containerd[1666]: time="2026-01-23T23:31:33.760197326Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.150681138s" Jan 23 23:31:33.760313 containerd[1666]: time="2026-01-23T23:31:33.760300006Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Jan 23 23:31:33.760819 containerd[1666]: time="2026-01-23T23:31:33.760731328Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 23 23:31:34.352785 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 23 23:31:34.354697 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:34.657932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3691604006.mount: Deactivated successfully. Jan 23 23:31:35.136769 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:35.156883 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 23 23:31:35.156945 kernel: audit: type=1130 audit(1769211095.135:288): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:35.135000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:35.140544 (kubelet)[2274]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 23:31:35.170964 kubelet[2274]: E0123 23:31:35.170913 2274 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 23:31:35.173314 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 23:31:35.173452 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 23:31:35.174000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 23 23:31:35.175038 systemd[1]: kubelet.service: Consumed 137ms CPU time, 107.3M memory peak. Jan 23 23:31:35.178004 kernel: audit: type=1131 audit(1769211095.174:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 23:31:35.395903 containerd[1666]: time="2026-01-23T23:31:35.395734074Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:35.396938 containerd[1666]: time="2026-01-23T23:31:35.396864918Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=0" Jan 23 23:31:35.398016 containerd[1666]: time="2026-01-23T23:31:35.397929601Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:35.400001 containerd[1666]: time="2026-01-23T23:31:35.399969527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:35.400795 containerd[1666]: time="2026-01-23T23:31:35.400678369Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.639853561s" Jan 23 23:31:35.400795 containerd[1666]: time="2026-01-23T23:31:35.400709329Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Jan 23 
23:31:35.401329 containerd[1666]: time="2026-01-23T23:31:35.401308371Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 23 23:31:35.944982 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3306531224.mount: Deactivated successfully. Jan 23 23:31:36.636390 containerd[1666]: time="2026-01-23T23:31:36.636311098Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:36.637945 containerd[1666]: time="2026-01-23T23:31:36.637899022Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=18338344" Jan 23 23:31:36.642738 containerd[1666]: time="2026-01-23T23:31:36.642693957Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:36.645725 containerd[1666]: time="2026-01-23T23:31:36.645675246Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:36.647038 containerd[1666]: time="2026-01-23T23:31:36.647009010Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.245589079s" Jan 23 23:31:36.647093 containerd[1666]: time="2026-01-23T23:31:36.647043530Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jan 23 23:31:36.647727 containerd[1666]: 
time="2026-01-23T23:31:36.647697332Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 23 23:31:37.126999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3976772747.mount: Deactivated successfully. Jan 23 23:31:37.141763 containerd[1666]: time="2026-01-23T23:31:37.141711439Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 23:31:37.143344 containerd[1666]: time="2026-01-23T23:31:37.143077203Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 23:31:37.144263 containerd[1666]: time="2026-01-23T23:31:37.144217407Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 23:31:37.146416 containerd[1666]: time="2026-01-23T23:31:37.146378453Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 23:31:37.147214 containerd[1666]: time="2026-01-23T23:31:37.147191736Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 499.465284ms" Jan 23 23:31:37.147375 containerd[1666]: time="2026-01-23T23:31:37.147289736Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 23 23:31:37.147793 
containerd[1666]: time="2026-01-23T23:31:37.147769897Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 23 23:31:37.691179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount906133803.mount: Deactivated successfully. Jan 23 23:31:39.390707 containerd[1666]: time="2026-01-23T23:31:39.390638538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:39.391607 containerd[1666]: time="2026-01-23T23:31:39.391532620Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=57926377" Jan 23 23:31:39.392601 containerd[1666]: time="2026-01-23T23:31:39.392559024Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:39.395833 containerd[1666]: time="2026-01-23T23:31:39.395424232Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:31:39.396512 containerd[1666]: time="2026-01-23T23:31:39.396480916Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.248678778s" Jan 23 23:31:39.396512 containerd[1666]: time="2026-01-23T23:31:39.396505716Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jan 23 23:31:45.352940 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Jan 23 23:31:45.354558 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:45.482098 update_engine[1648]: I20260123 23:31:45.481988 1648 update_attempter.cc:509] Updating boot flags... Jan 23 23:31:45.581533 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 23 23:31:45.581637 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 23 23:31:45.581975 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:45.582206 systemd[1]: kubelet.service: Consumed 64ms CPU time, 71.6M memory peak. Jan 23 23:31:45.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 23:31:45.587051 kernel: audit: type=1130 audit(1769211105.581:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 23:31:45.588477 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:45.613693 systemd[1]: Reload requested from client PID 2446 ('systemctl') (unit session-10.scope)... Jan 23 23:31:45.613713 systemd[1]: Reloading... Jan 23 23:31:45.697107 zram_generator::config[2491]: No configuration found. Jan 23 23:31:45.877494 systemd[1]: Reloading finished in 263 ms. Jan 23 23:31:46.004262 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 23 23:31:46.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 23:31:46.004369 systemd[1]: kubelet.service: Failed with result 'signal'. 
Jan 23 23:31:46.004748 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:46.007997 kernel: audit: type=1130 audit(1769211106.003:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 23:31:46.008000 audit: BPF prog-id=63 op=LOAD Jan 23 23:31:46.008720 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:46.008000 audit: BPF prog-id=55 op=UNLOAD Jan 23 23:31:46.009000 audit: BPF prog-id=64 op=LOAD Jan 23 23:31:46.011702 kernel: audit: type=1334 audit(1769211106.008:292): prog-id=63 op=LOAD Jan 23 23:31:46.011751 kernel: audit: type=1334 audit(1769211106.008:293): prog-id=55 op=UNLOAD Jan 23 23:31:46.011767 kernel: audit: type=1334 audit(1769211106.009:294): prog-id=64 op=LOAD Jan 23 23:31:46.011783 kernel: audit: type=1334 audit(1769211106.009:295): prog-id=65 op=LOAD Jan 23 23:31:46.009000 audit: BPF prog-id=65 op=LOAD Jan 23 23:31:46.009000 audit: BPF prog-id=56 op=UNLOAD Jan 23 23:31:46.013183 kernel: audit: type=1334 audit(1769211106.009:296): prog-id=56 op=UNLOAD Jan 23 23:31:46.013237 kernel: audit: type=1334 audit(1769211106.009:297): prog-id=57 op=UNLOAD Jan 23 23:31:46.009000 audit: BPF prog-id=57 op=UNLOAD Jan 23 23:31:46.010000 audit: BPF prog-id=66 op=LOAD Jan 23 23:31:46.010000 audit: BPF prog-id=58 op=UNLOAD Jan 23 23:31:46.014001 kernel: audit: type=1334 audit(1769211106.010:298): prog-id=66 op=LOAD Jan 23 23:31:46.014027 kernel: audit: type=1334 audit(1769211106.010:299): prog-id=58 op=UNLOAD Jan 23 23:31:46.012000 audit: BPF prog-id=67 op=LOAD Jan 23 23:31:46.012000 audit: BPF prog-id=49 op=UNLOAD Jan 23 23:31:46.014000 audit: BPF prog-id=68 op=LOAD Jan 23 23:31:46.014000 audit: BPF prog-id=52 op=UNLOAD Jan 23 23:31:46.014000 audit: BPF prog-id=69 op=LOAD Jan 23 23:31:46.014000 audit: BPF prog-id=70 op=LOAD Jan 23 23:31:46.014000 audit: 
BPF prog-id=53 op=UNLOAD Jan 23 23:31:46.014000 audit: BPF prog-id=54 op=UNLOAD Jan 23 23:31:46.015000 audit: BPF prog-id=71 op=LOAD Jan 23 23:31:46.015000 audit: BPF prog-id=60 op=UNLOAD Jan 23 23:31:46.015000 audit: BPF prog-id=72 op=LOAD Jan 23 23:31:46.015000 audit: BPF prog-id=73 op=LOAD Jan 23 23:31:46.015000 audit: BPF prog-id=61 op=UNLOAD Jan 23 23:31:46.015000 audit: BPF prog-id=62 op=UNLOAD Jan 23 23:31:46.015000 audit: BPF prog-id=74 op=LOAD Jan 23 23:31:46.016000 audit: BPF prog-id=75 op=LOAD Jan 23 23:31:46.016000 audit: BPF prog-id=50 op=UNLOAD Jan 23 23:31:46.016000 audit: BPF prog-id=51 op=UNLOAD Jan 23 23:31:46.016000 audit: BPF prog-id=76 op=LOAD Jan 23 23:31:46.016000 audit: BPF prog-id=43 op=UNLOAD Jan 23 23:31:46.016000 audit: BPF prog-id=77 op=LOAD Jan 23 23:31:46.016000 audit: BPF prog-id=78 op=LOAD Jan 23 23:31:46.016000 audit: BPF prog-id=44 op=UNLOAD Jan 23 23:31:46.016000 audit: BPF prog-id=45 op=UNLOAD Jan 23 23:31:46.036000 audit: BPF prog-id=79 op=LOAD Jan 23 23:31:46.036000 audit: BPF prog-id=59 op=UNLOAD Jan 23 23:31:46.036000 audit: BPF prog-id=80 op=LOAD Jan 23 23:31:46.036000 audit: BPF prog-id=46 op=UNLOAD Jan 23 23:31:46.036000 audit: BPF prog-id=81 op=LOAD Jan 23 23:31:46.036000 audit: BPF prog-id=82 op=LOAD Jan 23 23:31:46.036000 audit: BPF prog-id=47 op=UNLOAD Jan 23 23:31:46.036000 audit: BPF prog-id=48 op=UNLOAD Jan 23 23:31:47.113857 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:47.113000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:31:47.135554 (kubelet)[2540]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 23:31:47.163715 kubelet[2540]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 23:31:47.163715 kubelet[2540]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 23:31:47.163715 kubelet[2540]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 23:31:47.164044 kubelet[2540]: I0123 23:31:47.163751 2540 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 23:31:47.471542 kubelet[2540]: I0123 23:31:47.471430 2540 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 23 23:31:47.472386 kubelet[2540]: I0123 23:31:47.471517 2540 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 23:31:47.472674 kubelet[2540]: I0123 23:31:47.472634 2540 server.go:956] "Client rotation is on, will bootstrap in background" Jan 23 23:31:47.505523 kubelet[2540]: E0123 23:31:47.505475 2540 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.10.88:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.10.88:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 23 23:31:47.505688 kubelet[2540]: I0123 23:31:47.505637 2540 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 23:31:47.517670 kubelet[2540]: I0123 23:31:47.517614 2540 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 23:31:47.521596 kubelet[2540]: I0123 23:31:47.521557 2540 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 23 23:31:47.521975 kubelet[2540]: I0123 23:31:47.521881 2540 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 23:31:47.522082 kubelet[2540]: I0123 23:31:47.521919 2540 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-1-266c03b17e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"cont
ainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 23:31:47.522196 kubelet[2540]: I0123 23:31:47.522168 2540 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 23:31:47.522196 kubelet[2540]: I0123 23:31:47.522177 2540 container_manager_linux.go:303] "Creating device plugin manager" Jan 23 23:31:47.523494 kubelet[2540]: I0123 23:31:47.523451 2540 state_mem.go:36] "Initialized new in-memory state store" Jan 23 23:31:47.527156 kubelet[2540]: I0123 23:31:47.527119 2540 kubelet.go:480] "Attempting to sync node with API server" Jan 23 23:31:47.527156 kubelet[2540]: I0123 23:31:47.527150 2540 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 23:31:47.527315 kubelet[2540]: I0123 23:31:47.527172 2540 kubelet.go:386] "Adding apiserver pod source" Jan 23 23:31:47.528982 kubelet[2540]: I0123 23:31:47.528585 2540 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 23:31:47.530013 kubelet[2540]: I0123 23:31:47.529989 2540 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 23:31:47.533977 kubelet[2540]: I0123 23:31:47.531972 2540 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 23:31:47.533977 kubelet[2540]: E0123 23:31:47.532274 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.10.88:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-1-266c03b17e&limit=500&resourceVersion=0\": dial tcp 10.0.10.88:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 23 
23:31:47.533977 kubelet[2540]: W0123 23:31:47.532399 2540 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 23 23:31:47.533977 kubelet[2540]: E0123 23:31:47.532455 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.10.88:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.10.88:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 23 23:31:47.534969 kubelet[2540]: I0123 23:31:47.534937 2540 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 23:31:47.535011 kubelet[2540]: I0123 23:31:47.535003 2540 server.go:1289] "Started kubelet" Jan 23 23:31:47.535431 kubelet[2540]: I0123 23:31:47.535400 2540 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 23:31:47.538537 kubelet[2540]: I0123 23:31:47.537348 2540 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 23:31:47.539108 kubelet[2540]: I0123 23:31:47.539082 2540 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 23:31:47.539521 kubelet[2540]: I0123 23:31:47.539503 2540 server.go:317] "Adding debug handlers to kubelet server" Jan 23 23:31:47.544875 kubelet[2540]: I0123 23:31:47.544522 2540 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 23:31:47.545115 kubelet[2540]: I0123 23:31:47.545085 2540 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 23:31:47.545164 kubelet[2540]: I0123 23:31:47.544786 2540 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 23:31:47.545276 kubelet[2540]: I0123 23:31:47.545255 2540 desired_state_of_world_populator.go:150] "Desired state 
populator starts to run" Jan 23 23:31:47.545310 kubelet[2540]: I0123 23:31:47.545299 2540 reconciler.go:26] "Reconciler: start to sync state" Jan 23 23:31:47.545910 kubelet[2540]: E0123 23:31:47.545870 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.10.88:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.10.88:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 23 23:31:47.546096 kubelet[2540]: E0123 23:31:47.546068 2540 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 23:31:47.546650 kubelet[2540]: E0123 23:31:47.546391 2540 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-1-266c03b17e\" not found" Jan 23 23:31:47.546650 kubelet[2540]: E0123 23:31:47.544946 2540 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.10.88:6443/api/v1/namespaces/default/events\": dial tcp 10.0.10.88:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593-0-0-1-266c03b17e.188d8020ed4c1d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-1-266c03b17e,UID:ci-4593-0-0-1-266c03b17e,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-1-266c03b17e,},FirstTimestamp:2026-01-23 23:31:47.53497013 +0000 UTC m=+0.396151689,LastTimestamp:2026-01-23 23:31:47.53497013 +0000 UTC m=+0.396151689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-1-266c03b17e,}" Jan 23 23:31:47.546650 kubelet[2540]: E0123 23:31:47.546526 2540 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://10.0.10.88:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-1-266c03b17e?timeout=10s\": dial tcp 10.0.10.88:6443: connect: connection refused" interval="200ms" Jan 23 23:31:47.546846 kubelet[2540]: I0123 23:31:47.546812 2540 factory.go:223] Registration of the systemd container factory successfully Jan 23 23:31:47.546940 kubelet[2540]: I0123 23:31:47.546914 2540 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 23:31:47.548208 kubelet[2540]: I0123 23:31:47.548080 2540 factory.go:223] Registration of the containerd container factory successfully Jan 23 23:31:47.549000 audit[2557]: NETFILTER_CFG table=mangle:42 family=10 entries=2 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:47.549000 audit[2557]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffffdceeea0 a2=0 a3=0 items=0 ppid=2540 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.549000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 23:31:47.551096 kubelet[2540]: I0123 23:31:47.551061 2540 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 23 23:31:47.550000 audit[2558]: NETFILTER_CFG table=mangle:43 family=2 entries=2 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.550000 audit[2558]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc36052f0 a2=0 a3=0 items=0 ppid=2540 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.550000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 23:31:47.551000 audit[2559]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=2559 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.551000 audit[2559]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc3741930 a2=0 a3=0 items=0 ppid=2540 pid=2559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.551000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 23:31:47.553000 audit[2561]: NETFILTER_CFG table=mangle:45 family=10 entries=1 op=nft_register_chain pid=2561 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:47.553000 audit[2561]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe51dd3c0 a2=0 a3=0 items=0 ppid=2540 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.553000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 23:31:47.554000 audit[2562]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_chain pid=2562 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:47.554000 audit[2562]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe1f7a460 a2=0 a3=0 items=0 ppid=2540 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.554000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 23:31:47.555000 audit[2563]: NETFILTER_CFG table=filter:47 family=10 entries=1 op=nft_register_chain pid=2563 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:31:47.555000 audit[2563]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff5744d90 a2=0 a3=0 items=0 ppid=2540 pid=2563 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.555000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 23:31:47.556000 audit[2564]: NETFILTER_CFG table=filter:48 family=2 entries=2 op=nft_register_chain pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.556000 audit[2564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffdd0a530 a2=0 a3=0 items=0 ppid=2540 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
23:31:47.556000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 23:31:47.558000 audit[2566]: NETFILTER_CFG table=filter:49 family=2 entries=2 op=nft_register_chain pid=2566 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.558000 audit[2566]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff557e740 a2=0 a3=0 items=0 ppid=2540 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.558000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 23:31:47.561191 kubelet[2540]: I0123 23:31:47.560685 2540 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 23:31:47.561191 kubelet[2540]: I0123 23:31:47.560705 2540 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 23:31:47.561191 kubelet[2540]: I0123 23:31:47.560723 2540 state_mem.go:36] "Initialized new in-memory state store" Jan 23 23:31:47.566755 kubelet[2540]: I0123 23:31:47.566715 2540 policy_none.go:49] "None policy: Start" Jan 23 23:31:47.566755 kubelet[2540]: I0123 23:31:47.566750 2540 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 23:31:47.566755 kubelet[2540]: I0123 23:31:47.566763 2540 state_mem.go:35] "Initializing new in-memory state store" Jan 23 23:31:47.566000 audit[2570]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2570 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.566000 audit[2570]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffe4798cd0 a2=0 a3=0 items=0 ppid=2540 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.566000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 23 23:31:47.568180 kubelet[2540]: I0123 23:31:47.568143 2540 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 23 23:31:47.568243 kubelet[2540]: I0123 23:31:47.568183 2540 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 23 23:31:47.568243 kubelet[2540]: I0123 23:31:47.568209 2540 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 23 23:31:47.568243 kubelet[2540]: I0123 23:31:47.568218 2540 kubelet.go:2436] "Starting kubelet main sync loop" Jan 23 23:31:47.568299 kubelet[2540]: E0123 23:31:47.568266 2540 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 23:31:47.568000 audit[2572]: NETFILTER_CFG table=mangle:51 family=2 entries=1 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.568000 audit[2572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff0d89cd0 a2=0 a3=0 items=0 ppid=2540 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.568000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 23:31:47.569000 audit[2573]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2573 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.569000 audit[2573]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff9ca6ee0 a2=0 a3=0 items=0 ppid=2540 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.569000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 23:31:47.570000 audit[2574]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2574 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:31:47.570000 audit[2574]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff3874f20 a2=0 a3=0 items=0 ppid=2540 pid=2574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:47.570000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 23:31:47.571543 kubelet[2540]: E0123 23:31:47.571474 2540 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.10.88:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.10.88:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 23 23:31:47.573452 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 23 23:31:47.591191 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 23 23:31:47.595019 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 23 23:31:47.610005 kubelet[2540]: E0123 23:31:47.609949 2540 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 23:31:47.610225 kubelet[2540]: I0123 23:31:47.610187 2540 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 23:31:47.610261 kubelet[2540]: I0123 23:31:47.610209 2540 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 23:31:47.610690 kubelet[2540]: I0123 23:31:47.610459 2540 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 23:31:47.611976 kubelet[2540]: E0123 23:31:47.611835 2540 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 23:31:47.611976 kubelet[2540]: E0123 23:31:47.611906 2540 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593-0-0-1-266c03b17e\" not found" Jan 23 23:31:47.679390 systemd[1]: Created slice kubepods-burstable-podb7ad832767dc5e8f8219a741e13b3034.slice - libcontainer container kubepods-burstable-podb7ad832767dc5e8f8219a741e13b3034.slice. Jan 23 23:31:47.685843 kubelet[2540]: E0123 23:31:47.685796 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.688421 systemd[1]: Created slice kubepods-burstable-pod032306928b65ee55adf2c7a7b01381a2.slice - libcontainer container kubepods-burstable-pod032306928b65ee55adf2c7a7b01381a2.slice. 
Jan 23 23:31:47.690451 kubelet[2540]: E0123 23:31:47.690427 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.692709 systemd[1]: Created slice kubepods-burstable-pod937257e3ba4df4b0c4b3e3ee920557f9.slice - libcontainer container kubepods-burstable-pod937257e3ba4df4b0c4b3e3ee920557f9.slice. Jan 23 23:31:47.694023 kubelet[2540]: E0123 23:31:47.693998 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.712790 kubelet[2540]: I0123 23:31:47.712736 2540 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.713234 kubelet[2540]: E0123 23:31:47.713210 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.88:6443/api/v1/nodes\": dial tcp 10.0.10.88:6443: connect: connection refused" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746644 kubelet[2540]: I0123 23:31:47.746548 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/032306928b65ee55adf2c7a7b01381a2-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-1-266c03b17e\" (UID: \"032306928b65ee55adf2c7a7b01381a2\") " pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746644 kubelet[2540]: I0123 23:31:47.746583 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: \"937257e3ba4df4b0c4b3e3ee920557f9\") " pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746644 kubelet[2540]: I0123 23:31:47.746605 
2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746768 kubelet[2540]: I0123 23:31:47.746665 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746768 kubelet[2540]: I0123 23:31:47.746703 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746768 kubelet[2540]: I0123 23:31:47.746727 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: \"937257e3ba4df4b0c4b3e3ee920557f9\") " pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746768 kubelet[2540]: I0123 23:31:47.746744 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: 
\"937257e3ba4df4b0c4b3e3ee920557f9\") " pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746768 kubelet[2540]: I0123 23:31:47.746760 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.746860 kubelet[2540]: I0123 23:31:47.746777 2540 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.747077 kubelet[2540]: E0123 23:31:47.747025 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.88:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-1-266c03b17e?timeout=10s\": dial tcp 10.0.10.88:6443: connect: connection refused" interval="400ms" Jan 23 23:31:47.914900 kubelet[2540]: I0123 23:31:47.914867 2540 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.915248 kubelet[2540]: E0123 23:31:47.915222 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.88:6443/api/v1/nodes\": dial tcp 10.0.10.88:6443: connect: connection refused" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:47.987583 containerd[1666]: time="2026-01-23T23:31:47.987533710Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-1-266c03b17e,Uid:b7ad832767dc5e8f8219a741e13b3034,Namespace:kube-system,Attempt:0,}" Jan 23 23:31:47.991553 containerd[1666]: time="2026-01-23T23:31:47.991489322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-1-266c03b17e,Uid:032306928b65ee55adf2c7a7b01381a2,Namespace:kube-system,Attempt:0,}" Jan 23 23:31:47.995426 containerd[1666]: time="2026-01-23T23:31:47.995344774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-1-266c03b17e,Uid:937257e3ba4df4b0c4b3e3ee920557f9,Namespace:kube-system,Attempt:0,}" Jan 23 23:31:48.025230 containerd[1666]: time="2026-01-23T23:31:48.025070465Z" level=info msg="connecting to shim f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95" address="unix:///run/containerd/s/404ace6a350006bd41f85b8f4ed807af7d4637e839837a653813c40a77e9d9d1" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:31:48.025458 containerd[1666]: time="2026-01-23T23:31:48.025429786Z" level=info msg="connecting to shim eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0" address="unix:///run/containerd/s/ee96e29dca3a6a96fe9d8db47cc062d60aa944bcdc6c4ce5ffb2fb721b507d21" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:31:48.050752 containerd[1666]: time="2026-01-23T23:31:48.050674143Z" level=info msg="connecting to shim 59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d" address="unix:///run/containerd/s/08233e0c9f3bb888402fcc83164cd6960b78733b4dfaff7d5003008ab239062d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:31:48.051207 systemd[1]: Started cri-containerd-f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95.scope - libcontainer container f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95. 
Jan 23 23:31:48.054330 systemd[1]: Started cri-containerd-eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0.scope - libcontainer container eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0. Jan 23 23:31:48.064000 audit: BPF prog-id=83 op=LOAD Jan 23 23:31:48.065000 audit: BPF prog-id=84 op=LOAD Jan 23 23:31:48.065000 audit[2612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.065000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.065000 audit: BPF prog-id=84 op=UNLOAD Jan 23 23:31:48.065000 audit[2612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.065000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.066000 audit: BPF prog-id=85 op=LOAD Jan 23 23:31:48.066000 audit[2612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:31:48.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.066000 audit: BPF prog-id=86 op=LOAD Jan 23 23:31:48.066000 audit[2612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.066000 audit: BPF prog-id=86 op=UNLOAD Jan 23 23:31:48.066000 audit[2612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.066000 audit: BPF prog-id=85 op=UNLOAD Jan 23 23:31:48.066000 audit[2612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.066000 audit: BPF prog-id=87 op=LOAD Jan 23 23:31:48.066000 audit[2612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2590 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631363438303664373866643533373733633861346232613862623462 Jan 23 23:31:48.068000 audit: BPF prog-id=88 op=LOAD Jan 23 23:31:48.069000 audit: BPF prog-id=89 op=LOAD Jan 23 23:31:48.069000 audit[2614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.069000 audit: BPF prog-id=89 op=UNLOAD Jan 23 23:31:48.069000 audit[2614]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.070000 audit: BPF prog-id=90 op=LOAD Jan 23 23:31:48.070000 audit[2614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.070000 audit: BPF prog-id=91 op=LOAD Jan 23 23:31:48.070000 audit[2614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.070000 audit: BPF prog-id=91 op=UNLOAD Jan 23 23:31:48.070000 audit[2614]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2592 
pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.070000 audit: BPF prog-id=90 op=UNLOAD Jan 23 23:31:48.070000 audit[2614]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.070000 audit: BPF prog-id=92 op=LOAD Jan 23 23:31:48.070000 audit[2614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2592 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653434623861636361393336653664656139633837373431666265 Jan 23 23:31:48.080216 systemd[1]: Started cri-containerd-59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d.scope - libcontainer 
container 59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d. Jan 23 23:31:48.093000 audit: BPF prog-id=93 op=LOAD Jan 23 23:31:48.095144 containerd[1666]: time="2026-01-23T23:31:48.094905878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-1-266c03b17e,Uid:032306928b65ee55adf2c7a7b01381a2,Namespace:kube-system,Attempt:0,} returns sandbox id \"f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95\"" Jan 23 23:31:48.094000 audit: BPF prog-id=94 op=LOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=94 op=UNLOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=95 op=LOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 
items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=96 op=LOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=96 op=UNLOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=95 op=UNLOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.094000 audit: BPF prog-id=97 op=LOAD Jan 23 23:31:48.094000 audit[2669]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2645 pid=2669 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.094000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135663164623638636164306330656331626539333962306264 Jan 23 23:31:48.103339 containerd[1666]: time="2026-01-23T23:31:48.103303023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-1-266c03b17e,Uid:b7ad832767dc5e8f8219a741e13b3034,Namespace:kube-system,Attempt:0,} returns sandbox id \"eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0\"" Jan 23 23:31:48.103727 containerd[1666]: time="2026-01-23T23:31:48.103698865Z" level=info msg="CreateContainer within sandbox \"f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 23 23:31:48.109773 containerd[1666]: time="2026-01-23T23:31:48.109727563Z" level=info msg="CreateContainer within sandbox 
\"eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 23 23:31:48.116071 containerd[1666]: time="2026-01-23T23:31:48.116032942Z" level=info msg="Container feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:31:48.119344 containerd[1666]: time="2026-01-23T23:31:48.119232152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-1-266c03b17e,Uid:937257e3ba4df4b0c4b3e3ee920557f9,Namespace:kube-system,Attempt:0,} returns sandbox id \"59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d\"" Jan 23 23:31:48.124259 containerd[1666]: time="2026-01-23T23:31:48.124226167Z" level=info msg="CreateContainer within sandbox \"59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 23 23:31:48.124872 containerd[1666]: time="2026-01-23T23:31:48.124839409Z" level=info msg="Container 804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:31:48.128908 containerd[1666]: time="2026-01-23T23:31:48.128856461Z" level=info msg="CreateContainer within sandbox \"f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4\"" Jan 23 23:31:48.129818 containerd[1666]: time="2026-01-23T23:31:48.129501503Z" level=info msg="StartContainer for \"feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4\"" Jan 23 23:31:48.131025 containerd[1666]: time="2026-01-23T23:31:48.130995388Z" level=info msg="connecting to shim feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4" address="unix:///run/containerd/s/404ace6a350006bd41f85b8f4ed807af7d4637e839837a653813c40a77e9d9d1" protocol=ttrpc 
version=3 Jan 23 23:31:48.137244 containerd[1666]: time="2026-01-23T23:31:48.137207287Z" level=info msg="CreateContainer within sandbox \"eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106\"" Jan 23 23:31:48.137705 containerd[1666]: time="2026-01-23T23:31:48.137679848Z" level=info msg="StartContainer for \"804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106\"" Jan 23 23:31:48.138672 containerd[1666]: time="2026-01-23T23:31:48.138639731Z" level=info msg="connecting to shim 804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106" address="unix:///run/containerd/s/ee96e29dca3a6a96fe9d8db47cc062d60aa944bcdc6c4ce5ffb2fb721b507d21" protocol=ttrpc version=3 Jan 23 23:31:48.142807 containerd[1666]: time="2026-01-23T23:31:48.142770904Z" level=info msg="Container f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:31:48.148365 kubelet[2540]: E0123 23:31:48.148314 2540 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.88:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-1-266c03b17e?timeout=10s\": dial tcp 10.0.10.88:6443: connect: connection refused" interval="800ms" Jan 23 23:31:48.150242 systemd[1]: Started cri-containerd-feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4.scope - libcontainer container feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4. 
Jan 23 23:31:48.152425 containerd[1666]: time="2026-01-23T23:31:48.152375333Z" level=info msg="CreateContainer within sandbox \"59c15f1db68cad0c0ec1be939b0bd2dd06e03219b9881c38759c7f6a6093849d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa\"" Jan 23 23:31:48.152910 containerd[1666]: time="2026-01-23T23:31:48.152864614Z" level=info msg="StartContainer for \"f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa\"" Jan 23 23:31:48.154761 systemd[1]: Started cri-containerd-804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106.scope - libcontainer container 804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106. Jan 23 23:31:48.156097 containerd[1666]: time="2026-01-23T23:31:48.156030584Z" level=info msg="connecting to shim f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa" address="unix:///run/containerd/s/08233e0c9f3bb888402fcc83164cd6960b78733b4dfaff7d5003008ab239062d" protocol=ttrpc version=3 Jan 23 23:31:48.164000 audit: BPF prog-id=98 op=LOAD Jan 23 23:31:48.165000 audit: BPF prog-id=99 op=LOAD Jan 23 23:31:48.165000 audit[2713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.165000 audit: BPF prog-id=99 op=UNLOAD Jan 23 23:31:48.165000 audit[2713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.167000 audit: BPF prog-id=100 op=LOAD Jan 23 23:31:48.167000 audit[2713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.167000 audit: BPF prog-id=101 op=LOAD Jan 23 23:31:48.167000 audit[2713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.167000 audit: BPF prog-id=101 op=UNLOAD Jan 23 23:31:48.167000 audit[2713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 
items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.167000 audit: BPF prog-id=100 op=UNLOAD Jan 23 23:31:48.167000 audit[2713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.167000 audit: BPF prog-id=102 op=LOAD Jan 23 23:31:48.167000 audit[2713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2590 pid=2713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665636131343230323262393739346438356635373331373238323961 Jan 23 23:31:48.176344 systemd[1]: Started cri-containerd-f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa.scope - 
libcontainer container f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa. Jan 23 23:31:48.177000 audit: BPF prog-id=103 op=LOAD Jan 23 23:31:48.178000 audit: BPF prog-id=104 op=LOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.178000 audit: BPF prog-id=104 op=UNLOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.178000 audit: BPF prog-id=105 op=LOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.178000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.178000 audit: BPF prog-id=106 op=LOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.178000 audit: BPF prog-id=106 op=UNLOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.178000 audit: BPF prog-id=105 op=UNLOAD Jan 23 23:31:48.178000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:31:48.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.179000 audit: BPF prog-id=107 op=LOAD Jan 23 23:31:48.179000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2592 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830346235616337366330383463643331373432643465613134383034 Jan 23 23:31:48.192000 audit: BPF prog-id=108 op=LOAD Jan 23 23:31:48.192000 audit: BPF prog-id=109 op=LOAD Jan 23 23:31:48.192000 audit[2752]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.192000 audit: BPF prog-id=109 op=UNLOAD Jan 23 23:31:48.192000 audit[2752]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.193000 audit: BPF prog-id=110 op=LOAD Jan 23 23:31:48.193000 audit[2752]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.193000 audit: BPF prog-id=111 op=LOAD Jan 23 23:31:48.193000 audit[2752]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.193000 audit: BPF prog-id=111 op=UNLOAD Jan 23 23:31:48.193000 audit[2752]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.193000 audit: BPF prog-id=110 op=UNLOAD Jan 23 23:31:48.193000 audit[2752]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.193000 audit: BPF prog-id=112 op=LOAD Jan 23 23:31:48.193000 audit[2752]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2645 pid=2752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:48.193000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639373066306663663938333839616463663335613536346438373931 Jan 23 23:31:48.203209 containerd[1666]: time="2026-01-23T23:31:48.203081928Z" level=info msg="StartContainer for \"feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4\" returns 
successfully" Jan 23 23:31:48.220082 containerd[1666]: time="2026-01-23T23:31:48.220040659Z" level=info msg="StartContainer for \"804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106\" returns successfully" Jan 23 23:31:48.230813 containerd[1666]: time="2026-01-23T23:31:48.230726372Z" level=info msg="StartContainer for \"f970f0fcf98389adcf35a564d8791aa65e08b10aab938ecca3a3bb92077449fa\" returns successfully" Jan 23 23:31:48.317535 kubelet[2540]: I0123 23:31:48.317441 2540 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:48.317852 kubelet[2540]: E0123 23:31:48.317794 2540 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.88:6443/api/v1/nodes\": dial tcp 10.0.10.88:6443: connect: connection refused" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:48.577833 kubelet[2540]: E0123 23:31:48.577740 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:48.577913 kubelet[2540]: E0123 23:31:48.577833 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:48.579076 kubelet[2540]: E0123 23:31:48.579056 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.120299 kubelet[2540]: I0123 23:31:49.120272 2540 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.582432 kubelet[2540]: E0123 23:31:49.582330 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 
23:31:49.582759 kubelet[2540]: E0123 23:31:49.582541 2540 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.779301 kubelet[2540]: E0123 23:31:49.779256 2540 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4593-0-0-1-266c03b17e\" not found" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.905651 kubelet[2540]: I0123 23:31:49.905373 2540 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.946641 kubelet[2540]: I0123 23:31:49.946594 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.951840 kubelet[2540]: E0123 23:31:49.951629 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.951840 kubelet[2540]: I0123 23:31:49.951657 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.953509 kubelet[2540]: E0123 23:31:49.953161 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-1-266c03b17e\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.953509 kubelet[2540]: I0123 23:31:49.953182 2540 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:49.955989 kubelet[2540]: E0123 23:31:49.955542 2540 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" is forbidden: no PriorityClass with name 
system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:50.530561 kubelet[2540]: I0123 23:31:50.530523 2540 apiserver.go:52] "Watching apiserver" Jan 23 23:31:50.545980 kubelet[2540]: I0123 23:31:50.545936 2540 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 23:31:51.912272 systemd[1]: Reload requested from client PID 2823 ('systemctl') (unit session-10.scope)... Jan 23 23:31:51.912288 systemd[1]: Reloading... Jan 23 23:31:51.993042 zram_generator::config[2869]: No configuration found. Jan 23 23:31:52.176937 systemd[1]: Reloading finished in 264 ms. Jan 23 23:31:52.199240 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 23:31:52.213698 systemd[1]: kubelet.service: Deactivated successfully. Jan 23 23:31:52.213932 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:52.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:52.214008 systemd[1]: kubelet.service: Consumed 764ms CPU time, 128.3M memory peak. Jan 23 23:31:52.219033 kernel: kauditd_printk_skb: 201 callbacks suppressed Jan 23 23:31:52.219123 kernel: audit: type=1131 audit(1769211112.213:393): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:52.219152 kernel: audit: type=1334 audit(1769211112.216:394): prog-id=113 op=LOAD Jan 23 23:31:52.219168 kernel: audit: type=1334 audit(1769211112.216:395): prog-id=68 op=UNLOAD Jan 23 23:31:52.216000 audit: BPF prog-id=113 op=LOAD Jan 23 23:31:52.216000 audit: BPF prog-id=68 op=UNLOAD Jan 23 23:31:52.216769 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 23:31:52.219451 kernel: audit: type=1334 audit(1769211112.217:396): prog-id=114 op=LOAD Jan 23 23:31:52.217000 audit: BPF prog-id=114 op=LOAD Jan 23 23:31:52.218000 audit: BPF prog-id=115 op=LOAD Jan 23 23:31:52.218000 audit: BPF prog-id=69 op=UNLOAD Jan 23 23:31:52.218000 audit: BPF prog-id=70 op=UNLOAD Jan 23 23:31:52.219000 audit: BPF prog-id=116 op=LOAD Jan 23 23:31:52.221003 kernel: audit: type=1334 audit(1769211112.218:397): prog-id=115 op=LOAD Jan 23 23:31:52.221029 kernel: audit: type=1334 audit(1769211112.218:398): prog-id=69 op=UNLOAD Jan 23 23:31:52.221045 kernel: audit: type=1334 audit(1769211112.218:399): prog-id=70 op=UNLOAD Jan 23 23:31:52.221068 kernel: audit: type=1334 audit(1769211112.219:400): prog-id=116 op=LOAD Jan 23 23:31:52.221082 kernel: audit: type=1334 audit(1769211112.219:401): prog-id=76 op=UNLOAD Jan 23 23:31:52.221099 kernel: audit: type=1334 audit(1769211112.219:402): prog-id=117 op=LOAD Jan 23 23:31:52.219000 audit: BPF prog-id=76 op=UNLOAD Jan 23 23:31:52.219000 audit: BPF prog-id=117 op=LOAD Jan 23 23:31:52.219000 audit: BPF prog-id=118 op=LOAD Jan 23 23:31:52.219000 audit: BPF prog-id=77 op=UNLOAD Jan 23 23:31:52.219000 audit: BPF prog-id=78 op=UNLOAD Jan 23 23:31:52.221000 audit: BPF prog-id=119 op=LOAD Jan 23 23:31:52.221000 audit: BPF prog-id=67 op=UNLOAD Jan 23 23:31:52.222000 audit: BPF prog-id=120 op=LOAD Jan 23 23:31:52.222000 audit: BPF prog-id=80 op=UNLOAD Jan 23 23:31:52.222000 audit: BPF prog-id=121 op=LOAD Jan 23 23:31:52.222000 audit: BPF prog-id=122 op=LOAD Jan 23 23:31:52.222000 audit: BPF prog-id=81 op=UNLOAD Jan 23 23:31:52.222000 audit: BPF prog-id=82 op=UNLOAD Jan 23 23:31:52.223000 audit: BPF prog-id=123 op=LOAD Jan 23 23:31:52.223000 audit: BPF prog-id=63 op=UNLOAD Jan 23 23:31:52.223000 audit: BPF prog-id=124 op=LOAD Jan 23 23:31:52.223000 audit: BPF prog-id=125 op=LOAD Jan 23 23:31:52.223000 audit: BPF prog-id=64 op=UNLOAD Jan 23 23:31:52.223000 audit: BPF prog-id=65 op=UNLOAD Jan 23 23:31:52.225000 
audit: BPF prog-id=126 op=LOAD Jan 23 23:31:52.225000 audit: BPF prog-id=66 op=UNLOAD Jan 23 23:31:52.244000 audit: BPF prog-id=127 op=LOAD Jan 23 23:31:52.244000 audit: BPF prog-id=79 op=UNLOAD Jan 23 23:31:52.245000 audit: BPF prog-id=128 op=LOAD Jan 23 23:31:52.245000 audit: BPF prog-id=129 op=LOAD Jan 23 23:31:52.245000 audit: BPF prog-id=74 op=UNLOAD Jan 23 23:31:52.245000 audit: BPF prog-id=75 op=UNLOAD Jan 23 23:31:52.246000 audit: BPF prog-id=130 op=LOAD Jan 23 23:31:52.246000 audit: BPF prog-id=71 op=UNLOAD Jan 23 23:31:52.246000 audit: BPF prog-id=131 op=LOAD Jan 23 23:31:52.246000 audit: BPF prog-id=132 op=LOAD Jan 23 23:31:52.246000 audit: BPF prog-id=72 op=UNLOAD Jan 23 23:31:52.246000 audit: BPF prog-id=73 op=UNLOAD Jan 23 23:31:53.231901 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 23:31:53.231000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:31:53.236226 (kubelet)[2913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 23:31:53.586885 kubelet[2913]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 23:31:53.586885 kubelet[2913]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 23:31:53.586885 kubelet[2913]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 23 23:31:53.586885 kubelet[2913]: I0123 23:31:53.513089 2913 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 23:31:53.586885 kubelet[2913]: I0123 23:31:53.521411 2913 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 23 23:31:53.586885 kubelet[2913]: I0123 23:31:53.521434 2913 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 23:31:53.586885 kubelet[2913]: I0123 23:31:53.521608 2913 server.go:956] "Client rotation is on, will bootstrap in background" Jan 23 23:31:53.587571 kubelet[2913]: I0123 23:31:53.586990 2913 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 23 23:31:53.589459 kubelet[2913]: I0123 23:31:53.589425 2913 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 23:31:53.593884 kubelet[2913]: I0123 23:31:53.593864 2913 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 23:31:53.599498 kubelet[2913]: I0123 23:31:53.599424 2913 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 23 23:31:53.600235 kubelet[2913]: I0123 23:31:53.600209 2913 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 23:31:53.600375 kubelet[2913]: I0123 23:31:53.600235 2913 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-1-266c03b17e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 23:31:53.600454 kubelet[2913]: I0123 23:31:53.600383 2913 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 
23:31:53.600454 kubelet[2913]: I0123 23:31:53.600391 2913 container_manager_linux.go:303] "Creating device plugin manager" Jan 23 23:31:53.600454 kubelet[2913]: I0123 23:31:53.600432 2913 state_mem.go:36] "Initialized new in-memory state store" Jan 23 23:31:53.601419 kubelet[2913]: I0123 23:31:53.600581 2913 kubelet.go:480] "Attempting to sync node with API server" Jan 23 23:31:53.601419 kubelet[2913]: I0123 23:31:53.600596 2913 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 23:31:53.601419 kubelet[2913]: I0123 23:31:53.600636 2913 kubelet.go:386] "Adding apiserver pod source" Jan 23 23:31:53.601419 kubelet[2913]: I0123 23:31:53.600652 2913 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 23:31:53.601665 kubelet[2913]: I0123 23:31:53.601633 2913 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 23:31:53.602356 kubelet[2913]: I0123 23:31:53.602291 2913 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 23:31:53.607034 kubelet[2913]: I0123 23:31:53.606994 2913 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 23:31:53.607147 kubelet[2913]: I0123 23:31:53.607053 2913 server.go:1289] "Started kubelet" Jan 23 23:31:53.609140 kubelet[2913]: I0123 23:31:53.607231 2913 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 23:31:53.609140 kubelet[2913]: I0123 23:31:53.607264 2913 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 23:31:53.609357 kubelet[2913]: I0123 23:31:53.609324 2913 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 23:31:53.609913 kubelet[2913]: I0123 23:31:53.609876 2913 server.go:317] "Adding debug handlers to kubelet server" Jan 23 
23:31:53.612420 kubelet[2913]: I0123 23:31:53.612391 2913 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 23:31:53.621204 kubelet[2913]: I0123 23:31:53.621170 2913 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 23:31:53.621333 kubelet[2913]: I0123 23:31:53.621306 2913 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 23:31:53.622660 kubelet[2913]: I0123 23:31:53.622352 2913 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 23 23:31:53.622821 kubelet[2913]: I0123 23:31:53.622800 2913 reconciler.go:26] "Reconciler: start to sync state" Jan 23 23:31:53.624091 kubelet[2913]: E0123 23:31:53.623388 2913 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 23:31:53.624851 kubelet[2913]: I0123 23:31:53.624768 2913 factory.go:223] Registration of the systemd container factory successfully Jan 23 23:31:53.625133 kubelet[2913]: I0123 23:31:53.625102 2913 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 23:31:53.628172 kubelet[2913]: I0123 23:31:53.628141 2913 factory.go:223] Registration of the containerd container factory successfully Jan 23 23:31:53.634698 kubelet[2913]: I0123 23:31:53.634653 2913 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 23 23:31:53.635946 kubelet[2913]: I0123 23:31:53.635748 2913 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 23 23:31:53.635946 kubelet[2913]: I0123 23:31:53.635781 2913 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 23 23:31:53.635946 kubelet[2913]: I0123 23:31:53.635812 2913 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 23 23:31:53.635946 kubelet[2913]: I0123 23:31:53.635821 2913 kubelet.go:2436] "Starting kubelet main sync loop" Jan 23 23:31:53.635946 kubelet[2913]: E0123 23:31:53.635861 2913 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 23:31:53.660054 kubelet[2913]: I0123 23:31:53.660029 2913 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 23:31:53.660054 kubelet[2913]: I0123 23:31:53.660047 2913 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 23:31:53.660197 kubelet[2913]: I0123 23:31:53.660069 2913 state_mem.go:36] "Initialized new in-memory state store" Jan 23 23:31:53.660197 kubelet[2913]: I0123 23:31:53.660194 2913 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 23:31:53.660249 kubelet[2913]: I0123 23:31:53.660203 2913 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 23:31:53.660249 kubelet[2913]: I0123 23:31:53.660217 2913 policy_none.go:49] "None policy: Start" Jan 23 23:31:53.660249 kubelet[2913]: I0123 23:31:53.660226 2913 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 23:31:53.660249 kubelet[2913]: I0123 23:31:53.660233 2913 state_mem.go:35] "Initializing new in-memory state store" Jan 23 23:31:53.660329 kubelet[2913]: I0123 23:31:53.660311 2913 state_mem.go:75] "Updated machine memory state" Jan 23 23:31:53.666292 kubelet[2913]: E0123 23:31:53.666221 2913 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 23:31:53.666415 kubelet[2913]: I0123 
23:31:53.666393 2913 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 23:31:53.666450 kubelet[2913]: I0123 23:31:53.666404 2913 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 23:31:53.666675 kubelet[2913]: I0123 23:31:53.666656 2913 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 23:31:53.668070 kubelet[2913]: E0123 23:31:53.667634 2913 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 23:31:53.737267 kubelet[2913]: I0123 23:31:53.737220 2913 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.737574 kubelet[2913]: I0123 23:31:53.737554 2913 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.737936 kubelet[2913]: I0123 23:31:53.737915 2913 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.771812 kubelet[2913]: I0123 23:31:53.771787 2913 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.781812 kubelet[2913]: I0123 23:31:53.781783 2913 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.781915 kubelet[2913]: I0123 23:31:53.781860 2913 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.923800 kubelet[2913]: I0123 23:31:53.923754 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: \"937257e3ba4df4b0c4b3e3ee920557f9\") " 
pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.923911 kubelet[2913]: I0123 23:31:53.923817 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.923911 kubelet[2913]: I0123 23:31:53.923852 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.923911 kubelet[2913]: I0123 23:31:53.923882 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.924653 kubelet[2913]: I0123 23:31:53.923914 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/032306928b65ee55adf2c7a7b01381a2-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-1-266c03b17e\" (UID: \"032306928b65ee55adf2c7a7b01381a2\") " pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.924653 kubelet[2913]: I0123 23:31:53.923940 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: \"937257e3ba4df4b0c4b3e3ee920557f9\") " pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.924653 kubelet[2913]: I0123 23:31:53.924218 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/937257e3ba4df4b0c4b3e3ee920557f9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-1-266c03b17e\" (UID: \"937257e3ba4df4b0c4b3e3ee920557f9\") " pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.924653 kubelet[2913]: I0123 23:31:53.924252 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:53.924653 kubelet[2913]: I0123 23:31:53.924272 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b7ad832767dc5e8f8219a741e13b3034-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" (UID: \"b7ad832767dc5e8f8219a741e13b3034\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:54.602087 kubelet[2913]: I0123 23:31:54.601703 2913 apiserver.go:52] "Watching apiserver" Jan 23 23:31:54.623189 kubelet[2913]: I0123 23:31:54.623096 2913 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 23:31:54.648911 kubelet[2913]: I0123 23:31:54.648335 2913 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:54.648911 
kubelet[2913]: I0123 23:31:54.648764 2913 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:54.654584 kubelet[2913]: E0123 23:31:54.654483 2913 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-1-266c03b17e\" already exists" pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:54.655661 kubelet[2913]: E0123 23:31:54.655635 2913 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-1-266c03b17e\" already exists" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" Jan 23 23:31:54.669480 kubelet[2913]: I0123 23:31:54.668952 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593-0-0-1-266c03b17e" podStartSLOduration=1.6689388059999999 podStartE2EDuration="1.668938806s" podCreationTimestamp="2026-01-23 23:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:31:54.668607005 +0000 UTC m=+1.429249560" watchObservedRunningTime="2026-01-23 23:31:54.668938806 +0000 UTC m=+1.429581321" Jan 23 23:31:54.686088 kubelet[2913]: I0123 23:31:54.686038 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593-0-0-1-266c03b17e" podStartSLOduration=1.6860211779999998 podStartE2EDuration="1.686021178s" podCreationTimestamp="2026-01-23 23:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:31:54.677537952 +0000 UTC m=+1.438180507" watchObservedRunningTime="2026-01-23 23:31:54.686021178 +0000 UTC m=+1.446663733" Jan 23 23:31:54.686266 kubelet[2913]: I0123 23:31:54.686233 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-ci-4593-0-0-1-266c03b17e" podStartSLOduration=1.6862270590000001 podStartE2EDuration="1.686227059s" podCreationTimestamp="2026-01-23 23:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:31:54.686005658 +0000 UTC m=+1.446648213" watchObservedRunningTime="2026-01-23 23:31:54.686227059 +0000 UTC m=+1.446869614" Jan 23 23:31:58.109917 kubelet[2913]: I0123 23:31:58.109867 2913 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 23:31:58.110559 kubelet[2913]: I0123 23:31:58.110394 2913 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 23:31:58.110589 containerd[1666]: time="2026-01-23T23:31:58.110213261Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 23 23:31:59.047821 systemd[1]: Created slice kubepods-besteffort-pod83f09bfa_d389_4ec3_8704_612841f5751e.slice - libcontainer container kubepods-besteffort-pod83f09bfa_d389_4ec3_8704_612841f5751e.slice. 
Jan 23 23:31:59.055216 kubelet[2913]: I0123 23:31:59.055093 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/83f09bfa-d389-4ec3-8704-612841f5751e-kube-proxy\") pod \"kube-proxy-k67t4\" (UID: \"83f09bfa-d389-4ec3-8704-612841f5751e\") " pod="kube-system/kube-proxy-k67t4" Jan 23 23:31:59.055216 kubelet[2913]: I0123 23:31:59.055135 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/83f09bfa-d389-4ec3-8704-612841f5751e-xtables-lock\") pod \"kube-proxy-k67t4\" (UID: \"83f09bfa-d389-4ec3-8704-612841f5751e\") " pod="kube-system/kube-proxy-k67t4" Jan 23 23:31:59.055216 kubelet[2913]: I0123 23:31:59.055155 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83f09bfa-d389-4ec3-8704-612841f5751e-lib-modules\") pod \"kube-proxy-k67t4\" (UID: \"83f09bfa-d389-4ec3-8704-612841f5751e\") " pod="kube-system/kube-proxy-k67t4" Jan 23 23:31:59.055216 kubelet[2913]: I0123 23:31:59.055176 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9sxz\" (UniqueName: \"kubernetes.io/projected/83f09bfa-d389-4ec3-8704-612841f5751e-kube-api-access-g9sxz\") pod \"kube-proxy-k67t4\" (UID: \"83f09bfa-d389-4ec3-8704-612841f5751e\") " pod="kube-system/kube-proxy-k67t4" Jan 23 23:31:59.163630 kubelet[2913]: E0123 23:31:59.163591 2913 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jan 23 23:31:59.163630 kubelet[2913]: E0123 23:31:59.163624 2913 projected.go:194] Error preparing data for projected volume kube-api-access-g9sxz for pod kube-system/kube-proxy-k67t4: configmap "kube-root-ca.crt" not found Jan 23 23:31:59.164091 kubelet[2913]: E0123 23:31:59.163693 2913 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83f09bfa-d389-4ec3-8704-612841f5751e-kube-api-access-g9sxz podName:83f09bfa-d389-4ec3-8704-612841f5751e nodeName:}" failed. No retries permitted until 2026-01-23 23:31:59.663670673 +0000 UTC m=+6.424313228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g9sxz" (UniqueName: "kubernetes.io/projected/83f09bfa-d389-4ec3-8704-612841f5751e-kube-api-access-g9sxz") pod "kube-proxy-k67t4" (UID: "83f09bfa-d389-4ec3-8704-612841f5751e") : configmap "kube-root-ca.crt" not found Jan 23 23:31:59.373304 systemd[1]: Created slice kubepods-besteffort-podf9a46658_d8c8_4f97_b7fc_7ee7eaaebbd2.slice - libcontainer container kubepods-besteffort-podf9a46658_d8c8_4f97_b7fc_7ee7eaaebbd2.slice. Jan 23 23:31:59.458767 kubelet[2913]: I0123 23:31:59.458690 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2-var-lib-calico\") pod \"tigera-operator-7dcd859c48-xf2g5\" (UID: \"f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2\") " pod="tigera-operator/tigera-operator-7dcd859c48-xf2g5" Jan 23 23:31:59.458767 kubelet[2913]: I0123 23:31:59.458735 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbj94\" (UniqueName: \"kubernetes.io/projected/f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2-kube-api-access-mbj94\") pod \"tigera-operator-7dcd859c48-xf2g5\" (UID: \"f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2\") " pod="tigera-operator/tigera-operator-7dcd859c48-xf2g5" Jan 23 23:31:59.677503 containerd[1666]: time="2026-01-23T23:31:59.677158879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-xf2g5,Uid:f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2,Namespace:tigera-operator,Attempt:0,}" Jan 23 23:31:59.701688 containerd[1666]: time="2026-01-23T23:31:59.701405313Z" level=info 
msg="connecting to shim 32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4" address="unix:///run/containerd/s/37c65ef8244f7a2b524bd6a141501db3bb0fd76f4bff8ee4d3dcd7852fc475fa" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:31:59.728222 systemd[1]: Started cri-containerd-32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4.scope - libcontainer container 32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4. Jan 23 23:31:59.737000 audit: BPF prog-id=133 op=LOAD Jan 23 23:31:59.739444 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 23 23:31:59.739487 kernel: audit: type=1334 audit(1769211119.737:435): prog-id=133 op=LOAD Jan 23 23:31:59.739000 audit: BPF prog-id=134 op=LOAD Jan 23 23:31:59.741381 kernel: audit: type=1334 audit(1769211119.739:436): prog-id=134 op=LOAD Jan 23 23:31:59.741415 kernel: audit: type=1300 audit(1769211119.739:436): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.739000 audit[2987]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.739000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.748871 kernel: audit: type=1327 audit(1769211119.739:436): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.749018 kernel: audit: type=1334 audit(1769211119.739:437): prog-id=134 op=UNLOAD Jan 23 23:31:59.739000 audit: BPF prog-id=134 op=UNLOAD Jan 23 23:31:59.739000 audit[2987]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.753549 kernel: audit: type=1300 audit(1769211119.739:437): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.753684 kernel: audit: type=1327 audit(1769211119.739:437): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.739000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.739000 audit: BPF prog-id=135 op=LOAD Jan 23 23:31:59.758218 kernel: audit: type=1334 audit(1769211119.739:438): prog-id=135 op=LOAD Jan 23 23:31:59.758265 kernel: audit: type=1300 audit(1769211119.739:438): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 
a1=40001783e8 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.739000 audit[2987]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.763974 kernel: audit: type=1327 audit(1769211119.739:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.739000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.740000 audit: BPF prog-id=136 op=LOAD Jan 23 23:31:59.740000 audit[2987]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.744000 audit: BPF prog-id=136 op=UNLOAD Jan 23 23:31:59.744000 audit[2987]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.744000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.744000 audit: BPF prog-id=135 op=UNLOAD Jan 23 23:31:59.744000 audit[2987]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.744000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.744000 audit: BPF prog-id=137 op=LOAD Jan 23 23:31:59.744000 audit[2987]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2976 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:31:59.744000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653463633331363933336365373663303363396162663465646136 Jan 23 23:31:59.782277 containerd[1666]: time="2026-01-23T23:31:59.782237000Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-xf2g5,Uid:f9a46658-d8c8-4f97-b7fc-7ee7eaaebbd2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4\"" Jan 23 23:31:59.783881 containerd[1666]: time="2026-01-23T23:31:59.783853445Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 23:31:59.960124 containerd[1666]: time="2026-01-23T23:31:59.960004062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k67t4,Uid:83f09bfa-d389-4ec3-8704-612841f5751e,Namespace:kube-system,Attempt:0,}" Jan 23 23:31:59.988476 containerd[1666]: time="2026-01-23T23:31:59.988433989Z" level=info msg="connecting to shim 7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936" address="unix:///run/containerd/s/8ed1a512a1c9359848d43d8af5bc38558a256a3b19bf7533475924f5b88a857d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:00.010198 systemd[1]: Started cri-containerd-7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936.scope - libcontainer container 7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936. 
Jan 23 23:32:00.018000 audit: BPF prog-id=138 op=LOAD Jan 23 23:32:00.020000 audit: BPF prog-id=139 op=LOAD Jan 23 23:32:00.020000 audit[3034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.020000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=139 op=UNLOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=140 op=LOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.021000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=141 op=LOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=141 op=UNLOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=140 op=UNLOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:00.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.021000 audit: BPF prog-id=142 op=LOAD Jan 23 23:32:00.021000 audit[3034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3023 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734343362633866653463363430646137353266656264393463323832 Jan 23 23:32:00.034613 containerd[1666]: time="2026-01-23T23:32:00.034574449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k67t4,Uid:83f09bfa-d389-4ec3-8704-612841f5751e,Namespace:kube-system,Attempt:0,} returns sandbox id \"7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936\"" Jan 23 23:32:00.040591 containerd[1666]: time="2026-01-23T23:32:00.040558788Z" level=info msg="CreateContainer within sandbox \"7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 23:32:00.049542 containerd[1666]: time="2026-01-23T23:32:00.049493375Z" level=info msg="Container 9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:00.060952 containerd[1666]: time="2026-01-23T23:32:00.060893210Z" level=info msg="CreateContainer within sandbox \"7443bc8fe4c640da752febd94c282f4b58c1d25dbd2f4c40b5ec9f580f1cd936\" for 
&ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2\"" Jan 23 23:32:00.061622 containerd[1666]: time="2026-01-23T23:32:00.061597812Z" level=info msg="StartContainer for \"9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2\"" Jan 23 23:32:00.063040 containerd[1666]: time="2026-01-23T23:32:00.063013976Z" level=info msg="connecting to shim 9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2" address="unix:///run/containerd/s/8ed1a512a1c9359848d43d8af5bc38558a256a3b19bf7533475924f5b88a857d" protocol=ttrpc version=3 Jan 23 23:32:00.083200 systemd[1]: Started cri-containerd-9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2.scope - libcontainer container 9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2. Jan 23 23:32:00.141000 audit: BPF prog-id=143 op=LOAD Jan 23 23:32:00.141000 audit[3061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3023 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962316530366337353535343734646636333137643166366336393935 Jan 23 23:32:00.141000 audit: BPF prog-id=144 op=LOAD Jan 23 23:32:00.141000 audit[3061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3023 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.141000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962316530366337353535343734646636333137643166366336393935 Jan 23 23:32:00.141000 audit: BPF prog-id=144 op=UNLOAD Jan 23 23:32:00.141000 audit[3061]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962316530366337353535343734646636333137643166366336393935 Jan 23 23:32:00.141000 audit: BPF prog-id=143 op=UNLOAD Jan 23 23:32:00.141000 audit[3061]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962316530366337353535343734646636333137643166366336393935 Jan 23 23:32:00.141000 audit: BPF prog-id=145 op=LOAD Jan 23 23:32:00.141000 audit[3061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3023 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:00.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962316530366337353535343734646636333137643166366336393935 Jan 23 23:32:00.160338 containerd[1666]: time="2026-01-23T23:32:00.160299113Z" level=info msg="StartContainer for \"9b1e06c7555474df6317d1f6c699592bef2a9bff03a959203096bf6cd665cbc2\" returns successfully" Jan 23 23:32:00.319000 audit[3127]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.319000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffee88a870 a2=0 a3=1 items=0 ppid=3074 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.319000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 23:32:00.320000 audit[3126]: NETFILTER_CFG table=mangle:55 family=2 entries=1 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.320000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdff95cc0 a2=0 a3=1 items=0 ppid=3074 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.320000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 23:32:00.320000 audit[3129]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 23 23:32:00.320000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd70e9010 a2=0 a3=1 items=0 ppid=3074 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.320000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 23:32:00.321000 audit[3131]: NETFILTER_CFG table=filter:57 family=10 entries=1 op=nft_register_chain pid=3131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.321000 audit[3131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffeca5c7f0 a2=0 a3=1 items=0 ppid=3074 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.321000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 23:32:00.322000 audit[3133]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=3133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.322000 audit[3133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc75dc220 a2=0 a3=1 items=0 ppid=3074 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.322000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 23:32:00.325000 audit[3134]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3134 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.325000 audit[3134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc31422a0 a2=0 a3=1 items=0 ppid=3074 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.325000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 23:32:00.422000 audit[3135]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.422000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffffbe0f1f0 a2=0 a3=1 items=0 ppid=3074 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.422000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 23:32:00.425000 audit[3137]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.425000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc5b0e120 a2=0 a3=1 items=0 ppid=3074 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.425000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 23 23:32:00.429000 audit[3140]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.429000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd95ae0d0 a2=0 a3=1 items=0 ppid=3074 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.429000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 23 23:32:00.430000 audit[3141]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.430000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc7d13180 a2=0 a3=1 items=0 ppid=3074 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.430000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 23:32:00.432000 audit[3143]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.432000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes 
exit=528 a0=3 a1=fffff922b170 a2=0 a3=1 items=0 ppid=3074 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.432000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 23:32:00.434000 audit[3144]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.434000 audit[3144]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc2102ba0 a2=0 a3=1 items=0 ppid=3074 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.434000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 23:32:00.436000 audit[3146]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.436000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff4056050 a2=0 a3=1 items=0 ppid=3074 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.436000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 23:32:00.439000 audit[3149]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.439000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffddda9880 a2=0 a3=1 items=0 ppid=3074 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.439000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 23 23:32:00.441000 audit[3150]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.441000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdf391320 a2=0 a3=1 items=0 ppid=3074 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.441000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 23:32:00.443000 audit[3152]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.443000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 
a0=3 a1=fffff8bd2f30 a2=0 a3=1 items=0 ppid=3074 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.443000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 23:32:00.444000 audit[3153]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.444000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff4e5a730 a2=0 a3=1 items=0 ppid=3074 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.444000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 23:32:00.447000 audit[3155]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.447000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd758b840 a2=0 a3=1 items=0 ppid=3074 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.447000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 23:32:00.450000 audit[3158]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.450000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe3fe5cb0 a2=0 a3=1 items=0 ppid=3074 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.450000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 23:32:00.454000 audit[3161]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.454000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff62ea880 a2=0 a3=1 items=0 ppid=3074 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.454000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 23:32:00.455000 audit[3162]: NETFILTER_CFG table=nat:74 family=2 entries=1 
op=nft_register_chain pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.455000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffed866520 a2=0 a3=1 items=0 ppid=3074 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 23:32:00.457000 audit[3164]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3164 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.457000 audit[3164]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd814ea60 a2=0 a3=1 items=0 ppid=3074 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.457000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 23:32:00.461000 audit[3167]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.461000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffeaffb900 a2=0 a3=1 items=0 ppid=3074 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.461000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 23:32:00.462000 audit[3168]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3168 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.462000 audit[3168]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe89c16e0 a2=0 a3=1 items=0 ppid=3074 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.462000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 23:32:00.464000 audit[3170]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3170 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 23:32:00.464000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffcdea5dd0 a2=0 a3=1 items=0 ppid=3074 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.464000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 23:32:00.486000 audit[3176]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3176 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:00.486000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe307d130 a2=0 a3=1 items=0 ppid=3074 pid=3176 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.486000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:00.496000 audit[3176]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3176 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:00.496000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffe307d130 a2=0 a3=1 items=0 ppid=3074 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.496000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:00.498000 audit[3181]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.498000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff555b860 a2=0 a3=1 items=0 ppid=3074 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.498000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 23:32:00.500000 audit[3183]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.500000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffd74b2d70 
a2=0 a3=1 items=0 ppid=3074 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.500000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 23 23:32:00.504000 audit[3186]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.504000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd2ef3e30 a2=0 a3=1 items=0 ppid=3074 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.504000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 23 23:32:00.505000 audit[3187]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.505000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff3945430 a2=0 a3=1 items=0 ppid=3074 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.505000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 23:32:00.507000 audit[3189]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3189 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.507000 audit[3189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffea386840 a2=0 a3=1 items=0 ppid=3074 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.507000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 23:32:00.508000 audit[3190]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.508000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe08d3ba0 a2=0 a3=1 items=0 ppid=3074 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.508000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 23:32:00.511000 audit[3192]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.511000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc33d71b0 a2=0 a3=1 items=0 ppid=3074 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.511000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 23 23:32:00.514000 audit[3195]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.514000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffc4a505d0 a2=0 a3=1 items=0 ppid=3074 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.514000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 23:32:00.515000 audit[3196]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.515000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff56e2cb0 a2=0 a3=1 items=0 ppid=3074 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.515000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 23:32:00.518000 audit[3198]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule 
pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.518000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcde2cf10 a2=0 a3=1 items=0 ppid=3074 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.518000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 23:32:00.519000 audit[3199]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.519000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe1fd6dd0 a2=0 a3=1 items=0 ppid=3074 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.519000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 23:32:00.522000 audit[3201]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.522000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc9021780 a2=0 a3=1 items=0 ppid=3074 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.522000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 23:32:00.526000 audit[3204]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.526000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffce83e6f0 a2=0 a3=1 items=0 ppid=3074 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.526000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 23:32:00.530000 audit[3207]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.530000 audit[3207]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff2199210 a2=0 a3=1 items=0 ppid=3074 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.530000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 23 23:32:00.532000 audit[3208]: NETFILTER_CFG table=nat:95 family=10 entries=1 
op=nft_register_chain pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.532000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffff4c4260 a2=0 a3=1 items=0 ppid=3074 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.532000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 23:32:00.535000 audit[3210]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3210 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.535000 audit[3210]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffeac44550 a2=0 a3=1 items=0 ppid=3074 pid=3210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.535000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 23:32:00.538000 audit[3213]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.538000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe24741e0 a2=0 a3=1 items=0 ppid=3074 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.538000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 23:32:00.539000 audit[3214]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.539000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe33efc70 a2=0 a3=1 items=0 ppid=3074 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.539000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 23:32:00.542000 audit[3216]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.542000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe7c62c90 a2=0 a3=1 items=0 ppid=3074 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.542000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 23:32:00.543000 audit[3217]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.543000 audit[3217]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5427220 a2=0 a3=1 items=0 ppid=3074 
pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.543000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 23:32:00.545000 audit[3219]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.545000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffe391d3a0 a2=0 a3=1 items=0 ppid=3074 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.545000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 23:32:00.549000 audit[3222]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 23:32:00.549000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffed056c50 a2=0 a3=1 items=0 ppid=3074 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.549000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 23:32:00.553000 audit[3224]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 23:32:00.553000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 
a1=ffffd3f8f5f0 a2=0 a3=1 items=0 ppid=3074 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.553000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:00.553000 audit[3224]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 23:32:00.553000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffd3f8f5f0 a2=0 a3=1 items=0 ppid=3074 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:00.553000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:00.684931 kubelet[2913]: I0123 23:32:00.684869 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-k67t4" podStartSLOduration=1.684852673 podStartE2EDuration="1.684852673s" podCreationTimestamp="2026-01-23 23:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:32:00.674985882 +0000 UTC m=+7.435628437" watchObservedRunningTime="2026-01-23 23:32:00.684852673 +0000 UTC m=+7.445495228" Jan 23 23:32:01.255537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2206287892.mount: Deactivated successfully. 
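A note for readers of the audit records above: each `PROCTITLE` field is the triggering command line, hex-encoded with NUL bytes separating the arguments. A minimal Python sketch to decode one (the hex string is copied verbatim from the `ip6tables` chain-creation record at 23:32:00.543 above; `decode_proctitle` is just an illustrative helper name):

```python
def decode_proctitle(hex_str: str) -> str:
    """Turn an audit PROCTITLE hex blob back into a readable command line.

    The kernel records argv as raw bytes with NUL separators; replacing
    the NULs with spaces recovers the original invocation.
    """
    return bytes.fromhex(hex_str).replace(b"\x00", b" ").decode()

# Copied from the KUBE-FIREWALL record in this log:
proctitle = (
    "6970367461626C6573002D770035002D5700313030303030"
    "002D4E004B5542452D4649524557414C4C002D740066696C746572"
)
print(decode_proctitle(proctitle))
# ip6tables -w 5 -W 100000 -N KUBE-FIREWALL -t filter
```

Decoded this way, the burst of NETFILTER_CFG events reads as kube-proxy creating its KUBE-* chains and rules via `xtables-nft-multi` with the `-w 5 -W 100000` lock-wait options.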
Jan 23 23:32:01.673682 containerd[1666]: time="2026-01-23T23:32:01.673639208Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:01.675258 containerd[1666]: time="2026-01-23T23:32:01.675194453Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 23 23:32:01.676800 containerd[1666]: time="2026-01-23T23:32:01.676324936Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:01.680166 containerd[1666]: time="2026-01-23T23:32:01.680132388Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:01.681046 containerd[1666]: time="2026-01-23T23:32:01.681012310Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.897123625s" Jan 23 23:32:01.681083 containerd[1666]: time="2026-01-23T23:32:01.681050311Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 23 23:32:01.684569 containerd[1666]: time="2026-01-23T23:32:01.684530921Z" level=info msg="CreateContainer within sandbox \"32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 23:32:01.692990 containerd[1666]: time="2026-01-23T23:32:01.692227705Z" level=info msg="Container 
a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:01.695022 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount947503901.mount: Deactivated successfully. Jan 23 23:32:01.700730 containerd[1666]: time="2026-01-23T23:32:01.700649290Z" level=info msg="CreateContainer within sandbox \"32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a\"" Jan 23 23:32:01.701880 containerd[1666]: time="2026-01-23T23:32:01.701159092Z" level=info msg="StartContainer for \"a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a\"" Jan 23 23:32:01.702354 containerd[1666]: time="2026-01-23T23:32:01.702312175Z" level=info msg="connecting to shim a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a" address="unix:///run/containerd/s/37c65ef8244f7a2b524bd6a141501db3bb0fd76f4bff8ee4d3dcd7852fc475fa" protocol=ttrpc version=3 Jan 23 23:32:01.720232 systemd[1]: Started cri-containerd-a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a.scope - libcontainer container a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a. 
Jan 23 23:32:01.728000 audit: BPF prog-id=146 op=LOAD Jan 23 23:32:01.729000 audit: BPF prog-id=147 op=LOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=147 op=UNLOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=148 op=LOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=149 op=LOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=149 op=UNLOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=148 op=UNLOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.729000 audit: BPF prog-id=150 op=LOAD Jan 23 23:32:01.729000 audit[3233]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=2976 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:01.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6137316432333235653830386163623562396261643836386436326166 Jan 23 23:32:01.748254 containerd[1666]: time="2026-01-23T23:32:01.748215515Z" level=info msg="StartContainer for \"a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a\" returns successfully" Jan 23 23:32:02.676988 kubelet[2913]: I0123 23:32:02.676661 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-xf2g5" podStartSLOduration=1.778580039 podStartE2EDuration="3.676645627s" podCreationTimestamp="2026-01-23 23:31:59 +0000 UTC" firstStartedPulling="2026-01-23 23:31:59.783610524 +0000 UTC m=+6.544253079" lastFinishedPulling="2026-01-23 23:32:01.681676112 +0000 UTC m=+8.442318667" observedRunningTime="2026-01-23 23:32:02.676566267 +0000 UTC m=+9.437208822" watchObservedRunningTime="2026-01-23 23:32:02.676645627 +0000 UTC m=+9.437288182" Jan 23 23:32:06.912186 sudo[1958]: pam_unix(sudo:session): session closed for user root Jan 23 23:32:06.911000 audit[1958]: USER_END pid=1958 uid=500 auid=500 ses=10 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:32:06.916484 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 23 23:32:06.916575 kernel: audit: type=1106 audit(1769211126.911:515): pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:32:06.911000 audit[1958]: CRED_DISP pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:32:06.920187 kernel: audit: type=1104 audit(1769211126.911:516): pid=1958 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 23:32:07.009593 sshd[1957]: Connection closed by 68.220.241.50 port 52664 Jan 23 23:32:07.008353 sshd-session[1953]: pam_unix(sshd:session): session closed for user core Jan 23 23:32:07.009000 audit[1953]: USER_END pid=1953 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:32:07.009000 audit[1953]: CRED_DISP pid=1953 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:32:07.017897 systemd-logind[1646]: Session 10 logged out. Waiting for processes to exit. 
Jan 23 23:32:07.018693 kernel: audit: type=1106 audit(1769211127.009:517): pid=1953 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:32:07.018744 kernel: audit: type=1104 audit(1769211127.009:518): pid=1953 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:32:07.019235 systemd[1]: sshd@8-10.0.10.88:22-68.220.241.50:52664.service: Deactivated successfully. Jan 23 23:32:07.018000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.10.88:22-68.220.241.50:52664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:32:07.021520 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 23:32:07.022594 kernel: audit: type=1131 audit(1769211127.018:519): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.10.88:22-68.220.241.50:52664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:32:07.025340 systemd[1]: session-10.scope: Consumed 7.589s CPU time, 223.7M memory peak. Jan 23 23:32:07.030225 systemd-logind[1646]: Removed session 10. 
Jan 23 23:32:07.399000 audit[3327]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.402985 kernel: audit: type=1325 audit(1769211127.399:520): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.399000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd7a5c590 a2=0 a3=1 items=0 ppid=3074 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.399000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:07.409484 kernel: audit: type=1300 audit(1769211127.399:520): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd7a5c590 a2=0 a3=1 items=0 ppid=3074 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.409550 kernel: audit: type=1327 audit(1769211127.399:520): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:07.410000 audit[3327]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.412975 kernel: audit: type=1325 audit(1769211127.410:521): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.410000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd7a5c590 a2=0 a3=1 items=0 ppid=3074 pid=3327 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.410000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:07.416973 kernel: audit: type=1300 audit(1769211127.410:521): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd7a5c590 a2=0 a3=1 items=0 ppid=3074 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.420000 audit[3329]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.420000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffdd31e220 a2=0 a3=1 items=0 ppid=3074 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.420000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:07.424000 audit[3329]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:07.424000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdd31e220 a2=0 a3=1 items=0 ppid=3074 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:07.424000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:11.363000 audit[3331]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:11.363000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffe2e6c660 a2=0 a3=1 items=0 ppid=3074 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:11.363000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:11.372000 audit[3331]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:11.372000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe2e6c660 a2=0 a3=1 items=0 ppid=3074 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:11.372000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:11.390000 audit[3333]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:11.390000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff916ac70 a2=0 a3=1 items=0 ppid=3074 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 23 23:32:11.390000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:11.398000 audit[3333]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:11.398000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff916ac70 a2=0 a3=1 items=0 ppid=3074 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:11.398000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:12.409000 audit[3335]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:12.413475 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 23 23:32:12.413532 kernel: audit: type=1325 audit(1769211132.409:528): table=filter:113 family=2 entries=19 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:12.409000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffdd7d9330 a2=0 a3=1 items=0 ppid=3074 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:12.416983 kernel: audit: type=1300 audit(1769211132.409:528): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffdd7d9330 a2=0 a3=1 items=0 ppid=3074 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:12.409000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:12.419256 kernel: audit: type=1327 audit(1769211132.409:528): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:12.417000 audit[3335]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:12.421081 kernel: audit: type=1325 audit(1769211132.417:529): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:12.417000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdd7d9330 a2=0 a3=1 items=0 ppid=3074 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:12.424491 kernel: audit: type=1300 audit(1769211132.417:529): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdd7d9330 a2=0 a3=1 items=0 ppid=3074 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:12.417000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:12.426154 kernel: audit: type=1327 audit(1769211132.417:529): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:13.002615 systemd[1]: Created slice kubepods-besteffort-pod9153c950_4ddd_40f6_8202_76418a16b236.slice - libcontainer container 
kubepods-besteffort-pod9153c950_4ddd_40f6_8202_76418a16b236.slice. Jan 23 23:32:13.042701 kubelet[2913]: I0123 23:32:13.042613 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9153c950-4ddd-40f6-8202-76418a16b236-tigera-ca-bundle\") pod \"calico-typha-69b8f8cd5f-mskgd\" (UID: \"9153c950-4ddd-40f6-8202-76418a16b236\") " pod="calico-system/calico-typha-69b8f8cd5f-mskgd" Jan 23 23:32:13.042701 kubelet[2913]: I0123 23:32:13.042659 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phgpq\" (UniqueName: \"kubernetes.io/projected/9153c950-4ddd-40f6-8202-76418a16b236-kube-api-access-phgpq\") pod \"calico-typha-69b8f8cd5f-mskgd\" (UID: \"9153c950-4ddd-40f6-8202-76418a16b236\") " pod="calico-system/calico-typha-69b8f8cd5f-mskgd" Jan 23 23:32:13.042701 kubelet[2913]: I0123 23:32:13.042679 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9153c950-4ddd-40f6-8202-76418a16b236-typha-certs\") pod \"calico-typha-69b8f8cd5f-mskgd\" (UID: \"9153c950-4ddd-40f6-8202-76418a16b236\") " pod="calico-system/calico-typha-69b8f8cd5f-mskgd" Jan 23 23:32:13.199945 systemd[1]: Created slice kubepods-besteffort-pod2bdc109c_b385_43d6_b2b6_e6aea5c69aaa.slice - libcontainer container kubepods-besteffort-pod2bdc109c_b385_43d6_b2b6_e6aea5c69aaa.slice. 
Jan 23 23:32:13.244740 kubelet[2913]: I0123 23:32:13.244677 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-xtables-lock\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.244740 kubelet[2913]: I0123 23:32:13.244735 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-tigera-ca-bundle\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.244740 kubelet[2913]: I0123 23:32:13.244753 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-var-run-calico\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.244949 kubelet[2913]: I0123 23:32:13.244838 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-cni-bin-dir\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.244949 kubelet[2913]: I0123 23:32:13.244899 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-cni-net-dir\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.244949 kubelet[2913]: I0123 23:32:13.244933 2913 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsr6v\" (UniqueName: \"kubernetes.io/projected/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-kube-api-access-vsr6v\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245272 kubelet[2913]: I0123 23:32:13.244971 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-node-certs\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245272 kubelet[2913]: I0123 23:32:13.244988 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-policysync\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245272 kubelet[2913]: I0123 23:32:13.245006 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-lib-modules\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245272 kubelet[2913]: I0123 23:32:13.245026 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-cni-log-dir\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245272 kubelet[2913]: I0123 23:32:13.245040 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-flexvol-driver-host\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.245460 kubelet[2913]: I0123 23:32:13.245072 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2bdc109c-b385-43d6-b2b6-e6aea5c69aaa-var-lib-calico\") pod \"calico-node-zhpk5\" (UID: \"2bdc109c-b385-43d6-b2b6-e6aea5c69aaa\") " pod="calico-system/calico-node-zhpk5" Jan 23 23:32:13.308427 containerd[1666]: time="2026-01-23T23:32:13.308328527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69b8f8cd5f-mskgd,Uid:9153c950-4ddd-40f6-8202-76418a16b236,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:13.334178 containerd[1666]: time="2026-01-23T23:32:13.334121366Z" level=info msg="connecting to shim ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56" address="unix:///run/containerd/s/1ba2344343608e4747f39bc4ca85f654af0603a50a472c6e1629116ffaa6ce08" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:13.347566 kubelet[2913]: E0123 23:32:13.347445 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.347566 kubelet[2913]: W0123 23:32:13.347481 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.347566 kubelet[2913]: E0123 23:32:13.347535 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348057 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.349121 kubelet[2913]: W0123 23:32:13.348220 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348239 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348495 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.349121 kubelet[2913]: W0123 23:32:13.348506 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348516 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348881 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.349121 kubelet[2913]: W0123 23:32:13.348900 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.349121 kubelet[2913]: E0123 23:32:13.348911 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.350186 kubelet[2913]: E0123 23:32:13.349514 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.350186 kubelet[2913]: W0123 23:32:13.349531 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.350186 kubelet[2913]: E0123 23:32:13.349543 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.350186 kubelet[2913]: E0123 23:32:13.349671 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.350186 kubelet[2913]: W0123 23:32:13.349678 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.350186 kubelet[2913]: E0123 23:32:13.349685 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.350996 kubelet[2913]: E0123 23:32:13.350621 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.350996 kubelet[2913]: W0123 23:32:13.350634 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.350996 kubelet[2913]: E0123 23:32:13.350646 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.352064 kubelet[2913]: E0123 23:32:13.352040 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.352064 kubelet[2913]: W0123 23:32:13.352059 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.352256 kubelet[2913]: E0123 23:32:13.352072 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.352584 kubelet[2913]: E0123 23:32:13.352557 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.352584 kubelet[2913]: W0123 23:32:13.352575 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.352584 kubelet[2913]: E0123 23:32:13.352588 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.354015 kubelet[2913]: E0123 23:32:13.353984 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.354015 kubelet[2913]: W0123 23:32:13.354003 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.354015 kubelet[2913]: E0123 23:32:13.354018 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.354508 kubelet[2913]: E0123 23:32:13.354487 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.354508 kubelet[2913]: W0123 23:32:13.354502 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.354508 kubelet[2913]: E0123 23:32:13.354515 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.355098 kubelet[2913]: E0123 23:32:13.355038 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.355098 kubelet[2913]: W0123 23:32:13.355052 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.355098 kubelet[2913]: E0123 23:32:13.355063 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355298 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.355903 kubelet[2913]: W0123 23:32:13.355313 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355324 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355516 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.355903 kubelet[2913]: W0123 23:32:13.355525 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355534 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355827 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.355903 kubelet[2913]: W0123 23:32:13.355838 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.355903 kubelet[2913]: E0123 23:32:13.355848 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.356467 kubelet[2913]: E0123 23:32:13.356444 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.356467 kubelet[2913]: W0123 23:32:13.356466 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.356574 kubelet[2913]: E0123 23:32:13.356480 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.356741 kubelet[2913]: E0123 23:32:13.356720 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.356741 kubelet[2913]: W0123 23:32:13.356734 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.356853 kubelet[2913]: E0123 23:32:13.356745 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.357009 kubelet[2913]: E0123 23:32:13.356995 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.357053 kubelet[2913]: W0123 23:32:13.357009 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.357053 kubelet[2913]: E0123 23:32:13.357019 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.357250 kubelet[2913]: E0123 23:32:13.357234 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.357250 kubelet[2913]: W0123 23:32:13.357243 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.357311 kubelet[2913]: E0123 23:32:13.357270 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.357499 kubelet[2913]: E0123 23:32:13.357467 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.357499 kubelet[2913]: W0123 23:32:13.357480 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.357561 kubelet[2913]: E0123 23:32:13.357510 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.357892 kubelet[2913]: E0123 23:32:13.357855 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.357892 kubelet[2913]: W0123 23:32:13.357877 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.357892 kubelet[2913]: E0123 23:32:13.357887 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.358725 kubelet[2913]: E0123 23:32:13.358705 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.358725 kubelet[2913]: W0123 23:32:13.358724 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.358790 kubelet[2913]: E0123 23:32:13.358738 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.363824 kubelet[2913]: E0123 23:32:13.363274 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.363824 kubelet[2913]: W0123 23:32:13.363808 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.363951 kubelet[2913]: E0123 23:32:13.363831 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.365677 kubelet[2913]: E0123 23:32:13.365599 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.365677 kubelet[2913]: W0123 23:32:13.365620 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.365677 kubelet[2913]: E0123 23:32:13.365635 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.375376 systemd[1]: Started cri-containerd-ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56.scope - libcontainer container ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56. Jan 23 23:32:13.386492 kubelet[2913]: E0123 23:32:13.386354 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:13.394000 audit: BPF prog-id=151 op=LOAD Jan 23 23:32:13.400585 kernel: audit: type=1334 audit(1769211133.394:530): prog-id=151 op=LOAD Jan 23 23:32:13.400685 kernel: audit: type=1334 audit(1769211133.395:531): prog-id=152 op=LOAD Jan 23 23:32:13.400703 kernel: audit: type=1300 audit(1769211133.395:531): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.395000 audit: BPF prog-id=152 op=LOAD Jan 23 23:32:13.395000 
audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.395000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.404514 kernel: audit: type=1327 audit(1769211133.395:531): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.395000 audit: BPF prog-id=152 op=UNLOAD Jan 23 23:32:13.395000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.395000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.395000 audit: BPF prog-id=153 op=LOAD Jan 23 23:32:13.395000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:32:13.395000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.400000 audit: BPF prog-id=154 op=LOAD Jan 23 23:32:13.400000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.400000 audit: BPF prog-id=154 op=UNLOAD Jan 23 23:32:13.400000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.400000 audit: BPF prog-id=153 op=UNLOAD Jan 23 23:32:13.400000 audit[3358]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.400000 audit: BPF prog-id=155 op=LOAD Jan 23 23:32:13.400000 audit[3358]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3347 pid=3358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261386636373166653864666438636265623731643762303062393437 Jan 23 23:32:13.431791 kubelet[2913]: E0123 23:32:13.431748 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.432134 kubelet[2913]: W0123 23:32:13.431881 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.432134 kubelet[2913]: E0123 23:32:13.431905 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.432658 kubelet[2913]: E0123 23:32:13.432363 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.432658 kubelet[2913]: W0123 23:32:13.432375 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.432658 kubelet[2913]: E0123 23:32:13.432410 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.433150 kubelet[2913]: E0123 23:32:13.432947 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.433150 kubelet[2913]: W0123 23:32:13.433014 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.433150 kubelet[2913]: E0123 23:32:13.433028 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.433400 kubelet[2913]: E0123 23:32:13.433374 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.433454 kubelet[2913]: W0123 23:32:13.433443 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.433684 kubelet[2913]: E0123 23:32:13.433483 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.434234 kubelet[2913]: E0123 23:32:13.434118 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.434234 kubelet[2913]: W0123 23:32:13.434132 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.434234 kubelet[2913]: E0123 23:32:13.434148 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.434816 kubelet[2913]: E0123 23:32:13.434607 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.434816 kubelet[2913]: W0123 23:32:13.434617 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.434816 kubelet[2913]: E0123 23:32:13.434629 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.434937 containerd[1666]: time="2026-01-23T23:32:13.434227951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69b8f8cd5f-mskgd,Uid:9153c950-4ddd-40f6-8202-76418a16b236,Namespace:calico-system,Attempt:0,} returns sandbox id \"ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56\"" Jan 23 23:32:13.435296 kubelet[2913]: E0123 23:32:13.435282 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.435377 kubelet[2913]: W0123 23:32:13.435365 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.435757 kubelet[2913]: E0123 23:32:13.435644 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.435880 kubelet[2913]: E0123 23:32:13.435866 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.436019 kubelet[2913]: W0123 23:32:13.435938 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.436019 kubelet[2913]: E0123 23:32:13.435983 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.436383 containerd[1666]: time="2026-01-23T23:32:13.436352237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 23 23:32:13.436502 kubelet[2913]: E0123 23:32:13.436489 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.436749 kubelet[2913]: W0123 23:32:13.436597 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.436749 kubelet[2913]: E0123 23:32:13.436638 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.437102 kubelet[2913]: E0123 23:32:13.437085 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.437399 kubelet[2913]: W0123 23:32:13.437264 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.437399 kubelet[2913]: E0123 23:32:13.437296 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.437728 kubelet[2913]: E0123 23:32:13.437714 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.437945 kubelet[2913]: W0123 23:32:13.437796 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.437945 kubelet[2913]: E0123 23:32:13.437829 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.436000 audit[3427]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:13.436000 audit[3427]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffda4a2580 a2=0 a3=1 items=0 ppid=3074 pid=3427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.436000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:13.438754 kubelet[2913]: E0123 23:32:13.438593 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.438754 kubelet[2913]: W0123 23:32:13.438663 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.438754 kubelet[2913]: E0123 23:32:13.438674 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.439238 kubelet[2913]: E0123 23:32:13.439211 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.439512 kubelet[2913]: W0123 23:32:13.439399 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.439512 kubelet[2913]: E0123 23:32:13.439421 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.439737 kubelet[2913]: E0123 23:32:13.439721 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.439850 kubelet[2913]: W0123 23:32:13.439837 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.439998 kubelet[2913]: E0123 23:32:13.439985 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.440476 kubelet[2913]: E0123 23:32:13.440365 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.440476 kubelet[2913]: W0123 23:32:13.440380 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.440476 kubelet[2913]: E0123 23:32:13.440390 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.440657 kubelet[2913]: E0123 23:32:13.440643 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.440712 kubelet[2913]: W0123 23:32:13.440702 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.440763 kubelet[2913]: E0123 23:32:13.440752 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.441073 kubelet[2913]: E0123 23:32:13.440970 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.441073 kubelet[2913]: W0123 23:32:13.440986 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.441073 kubelet[2913]: E0123 23:32:13.440996 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.441224 kubelet[2913]: E0123 23:32:13.441213 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.441277 kubelet[2913]: W0123 23:32:13.441266 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.441324 kubelet[2913]: E0123 23:32:13.441315 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.441516 kubelet[2913]: E0123 23:32:13.441504 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.441671 kubelet[2913]: W0123 23:32:13.441571 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.441671 kubelet[2913]: E0123 23:32:13.441586 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.441792 kubelet[2913]: E0123 23:32:13.441780 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.441840 kubelet[2913]: W0123 23:32:13.441829 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.441890 kubelet[2913]: E0123 23:32:13.441880 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.441000 audit[3427]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:13.441000 audit[3427]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffda4a2580 a2=0 a3=1 items=0 ppid=3074 pid=3427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.441000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:13.447146 kubelet[2913]: E0123 23:32:13.447125 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.447146 kubelet[2913]: W0123 23:32:13.447142 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.447240 kubelet[2913]: E0123 23:32:13.447156 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.447240 kubelet[2913]: I0123 23:32:13.447180 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69a1122-8e77-47a0-ac55-a81fea68c3e7-kubelet-dir\") pod \"csi-node-driver-rfnpf\" (UID: \"a69a1122-8e77-47a0-ac55-a81fea68c3e7\") " pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:13.447415 kubelet[2913]: E0123 23:32:13.447377 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.447415 kubelet[2913]: W0123 23:32:13.447393 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.447415 kubelet[2913]: E0123 23:32:13.447404 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.447506 kubelet[2913]: I0123 23:32:13.447434 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a69a1122-8e77-47a0-ac55-a81fea68c3e7-socket-dir\") pod \"csi-node-driver-rfnpf\" (UID: \"a69a1122-8e77-47a0-ac55-a81fea68c3e7\") " pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:13.447644 kubelet[2913]: E0123 23:32:13.447625 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.447788 kubelet[2913]: W0123 23:32:13.447644 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.447788 kubelet[2913]: E0123 23:32:13.447657 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.447851 kubelet[2913]: E0123 23:32:13.447797 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.447851 kubelet[2913]: W0123 23:32:13.447804 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.447851 kubelet[2913]: E0123 23:32:13.447812 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.448016 kubelet[2913]: E0123 23:32:13.448003 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.448050 kubelet[2913]: W0123 23:32:13.448016 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.448050 kubelet[2913]: E0123 23:32:13.448026 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.448180 kubelet[2913]: I0123 23:32:13.448062 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a69a1122-8e77-47a0-ac55-a81fea68c3e7-varrun\") pod \"csi-node-driver-rfnpf\" (UID: \"a69a1122-8e77-47a0-ac55-a81fea68c3e7\") " pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:13.448260 kubelet[2913]: E0123 23:32:13.448247 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.448294 kubelet[2913]: W0123 23:32:13.448260 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.448294 kubelet[2913]: E0123 23:32:13.448269 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.448294 kubelet[2913]: I0123 23:32:13.448290 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a69a1122-8e77-47a0-ac55-a81fea68c3e7-registration-dir\") pod \"csi-node-driver-rfnpf\" (UID: \"a69a1122-8e77-47a0-ac55-a81fea68c3e7\") " pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:13.448499 kubelet[2913]: E0123 23:32:13.448487 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.448499 kubelet[2913]: W0123 23:32:13.448500 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.448550 kubelet[2913]: E0123 23:32:13.448508 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.448550 kubelet[2913]: I0123 23:32:13.448529 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6g8z\" (UniqueName: \"kubernetes.io/projected/a69a1122-8e77-47a0-ac55-a81fea68c3e7-kube-api-access-q6g8z\") pod \"csi-node-driver-rfnpf\" (UID: \"a69a1122-8e77-47a0-ac55-a81fea68c3e7\") " pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:13.448718 kubelet[2913]: E0123 23:32:13.448706 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.448748 kubelet[2913]: W0123 23:32:13.448719 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.448748 kubelet[2913]: E0123 23:32:13.448730 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.448886 kubelet[2913]: E0123 23:32:13.448874 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.448886 kubelet[2913]: W0123 23:32:13.448885 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.448936 kubelet[2913]: E0123 23:32:13.448894 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.449139 kubelet[2913]: E0123 23:32:13.449123 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449139 kubelet[2913]: W0123 23:32:13.449137 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.449190 kubelet[2913]: E0123 23:32:13.449147 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.449311 kubelet[2913]: E0123 23:32:13.449299 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449380 kubelet[2913]: W0123 23:32:13.449311 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.449380 kubelet[2913]: E0123 23:32:13.449322 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.449474 kubelet[2913]: E0123 23:32:13.449463 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449510 kubelet[2913]: W0123 23:32:13.449474 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.449510 kubelet[2913]: E0123 23:32:13.449483 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.449647 kubelet[2913]: E0123 23:32:13.449634 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449647 kubelet[2913]: W0123 23:32:13.449646 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.449709 kubelet[2913]: E0123 23:32:13.449653 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.449795 kubelet[2913]: E0123 23:32:13.449783 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449821 kubelet[2913]: W0123 23:32:13.449795 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.449821 kubelet[2913]: E0123 23:32:13.449803 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.449947 kubelet[2913]: E0123 23:32:13.449937 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.449947 kubelet[2913]: W0123 23:32:13.449947 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.450506 kubelet[2913]: E0123 23:32:13.450463 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.503876 containerd[1666]: time="2026-01-23T23:32:13.503822243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zhpk5,Uid:2bdc109c-b385-43d6-b2b6-e6aea5c69aaa,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:13.529152 containerd[1666]: time="2026-01-23T23:32:13.529108680Z" level=info msg="connecting to shim 2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e" address="unix:///run/containerd/s/d86ff896aaf873dbdaea0b6ec6220bcb1944e47bea37bb86656bb1e316470e93" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:13.549738 kubelet[2913]: E0123 23:32:13.549709 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.549738 kubelet[2913]: W0123 23:32:13.549731 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.549876 kubelet[2913]: E0123 23:32:13.549749 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.550047 kubelet[2913]: E0123 23:32:13.550033 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.550047 kubelet[2913]: W0123 23:32:13.550045 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.550104 kubelet[2913]: E0123 23:32:13.550053 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.550248 kubelet[2913]: E0123 23:32:13.550234 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.550248 kubelet[2913]: W0123 23:32:13.550247 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.550300 kubelet[2913]: E0123 23:32:13.550257 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.550450 kubelet[2913]: E0123 23:32:13.550437 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.550450 kubelet[2913]: W0123 23:32:13.550448 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.550495 kubelet[2913]: E0123 23:32:13.550456 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.550617 kubelet[2913]: E0123 23:32:13.550605 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.550617 kubelet[2913]: W0123 23:32:13.550616 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.550669 kubelet[2913]: E0123 23:32:13.550624 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.550845 kubelet[2913]: E0123 23:32:13.550824 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.550874 kubelet[2913]: W0123 23:32:13.550844 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.550874 kubelet[2913]: E0123 23:32:13.550859 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.551294 kubelet[2913]: E0123 23:32:13.551077 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.551294 kubelet[2913]: W0123 23:32:13.551089 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.551294 kubelet[2913]: E0123 23:32:13.551098 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.551294 kubelet[2913]: E0123 23:32:13.551254 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.551294 kubelet[2913]: W0123 23:32:13.551262 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.551294 kubelet[2913]: E0123 23:32:13.551270 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.551132 systemd[1]: Started cri-containerd-2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e.scope - libcontainer container 2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e. 
Jan 23 23:32:13.551695 kubelet[2913]: E0123 23:32:13.551671 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.551695 kubelet[2913]: W0123 23:32:13.551688 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.551751 kubelet[2913]: E0123 23:32:13.551698 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.552294 kubelet[2913]: E0123 23:32:13.552274 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.552294 kubelet[2913]: W0123 23:32:13.552291 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.552360 kubelet[2913]: E0123 23:32:13.552305 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.552517 kubelet[2913]: E0123 23:32:13.552498 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.552548 kubelet[2913]: W0123 23:32:13.552517 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.552548 kubelet[2913]: E0123 23:32:13.552528 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.552857 kubelet[2913]: E0123 23:32:13.552832 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.552857 kubelet[2913]: W0123 23:32:13.552851 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.552915 kubelet[2913]: E0123 23:32:13.552864 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.553221 kubelet[2913]: E0123 23:32:13.553190 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.553221 kubelet[2913]: W0123 23:32:13.553207 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.553221 kubelet[2913]: E0123 23:32:13.553219 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.553657 kubelet[2913]: E0123 23:32:13.553625 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.553657 kubelet[2913]: W0123 23:32:13.553645 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.553657 kubelet[2913]: E0123 23:32:13.553658 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.553868 kubelet[2913]: E0123 23:32:13.553846 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.553868 kubelet[2913]: W0123 23:32:13.553861 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.553868 kubelet[2913]: E0123 23:32:13.553870 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.554037 kubelet[2913]: E0123 23:32:13.554021 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.554037 kubelet[2913]: W0123 23:32:13.554033 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.554092 kubelet[2913]: E0123 23:32:13.554041 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.554218 kubelet[2913]: E0123 23:32:13.554197 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.554218 kubelet[2913]: W0123 23:32:13.554209 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.554272 kubelet[2913]: E0123 23:32:13.554217 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.554384 kubelet[2913]: E0123 23:32:13.554367 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.554384 kubelet[2913]: W0123 23:32:13.554378 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.554432 kubelet[2913]: E0123 23:32:13.554386 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.554643 kubelet[2913]: E0123 23:32:13.554624 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.554643 kubelet[2913]: W0123 23:32:13.554638 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.554718 kubelet[2913]: E0123 23:32:13.554649 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555111 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.555793 kubelet[2913]: W0123 23:32:13.555130 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555231 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555520 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.555793 kubelet[2913]: W0123 23:32:13.555531 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555551 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555746 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.555793 kubelet[2913]: W0123 23:32:13.555757 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.555793 kubelet[2913]: E0123 23:32:13.555765 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.556124 kubelet[2913]: E0123 23:32:13.556104 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.556124 kubelet[2913]: W0123 23:32:13.556119 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.556190 kubelet[2913]: E0123 23:32:13.556130 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.556459 kubelet[2913]: E0123 23:32:13.556332 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.556459 kubelet[2913]: W0123 23:32:13.556341 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.556459 kubelet[2913]: E0123 23:32:13.556349 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:13.556664 kubelet[2913]: E0123 23:32:13.556640 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.556664 kubelet[2913]: W0123 23:32:13.556659 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.556726 kubelet[2913]: E0123 23:32:13.556677 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.560000 audit: BPF prog-id=156 op=LOAD Jan 23 23:32:13.562000 audit: BPF prog-id=157 op=LOAD Jan 23 23:32:13.562000 audit[3477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.562000 audit: BPF prog-id=157 op=UNLOAD Jan 23 23:32:13.562000 audit[3477]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.562000 audit: BPF prog-id=158 op=LOAD Jan 23 23:32:13.562000 audit[3477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.562000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.563000 audit: BPF prog-id=159 op=LOAD Jan 23 23:32:13.563000 audit[3477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.563000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.563000 audit: BPF prog-id=159 op=UNLOAD Jan 23 23:32:13.563000 audit[3477]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.563000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.563000 audit: BPF prog-id=158 op=UNLOAD Jan 23 23:32:13.563000 audit[3477]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:13.563000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.563000 audit: BPF prog-id=160 op=LOAD Jan 23 23:32:13.563000 audit[3477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3465 pid=3477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:13.563000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3263326531383832396537306531616530333034386331346632373032 Jan 23 23:32:13.567346 kubelet[2913]: E0123 23:32:13.567326 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:13.567346 kubelet[2913]: W0123 23:32:13.567344 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:13.567699 kubelet[2913]: E0123 23:32:13.567632 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:13.579220 containerd[1666]: time="2026-01-23T23:32:13.579159113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zhpk5,Uid:2bdc109c-b385-43d6-b2b6-e6aea5c69aaa,Namespace:calico-system,Attempt:0,} returns sandbox id \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\"" Jan 23 23:32:14.636911 kubelet[2913]: E0123 23:32:14.636847 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:14.891665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3541084901.mount: Deactivated successfully. Jan 23 23:32:15.758965 containerd[1666]: time="2026-01-23T23:32:15.758916881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:15.760301 containerd[1666]: time="2026-01-23T23:32:15.760107244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 23 23:32:15.761553 containerd[1666]: time="2026-01-23T23:32:15.761514329Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:15.764105 containerd[1666]: time="2026-01-23T23:32:15.764067816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:15.765479 containerd[1666]: time="2026-01-23T23:32:15.765438581Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id 
\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.328875223s" Jan 23 23:32:15.765479 containerd[1666]: time="2026-01-23T23:32:15.765476941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 23 23:32:15.766542 containerd[1666]: time="2026-01-23T23:32:15.766518384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 23 23:32:15.776240 containerd[1666]: time="2026-01-23T23:32:15.776203093Z" level=info msg="CreateContainer within sandbox \"ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 23 23:32:15.790534 containerd[1666]: time="2026-01-23T23:32:15.789384374Z" level=info msg="Container 4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:15.802767 containerd[1666]: time="2026-01-23T23:32:15.802718854Z" level=info msg="CreateContainer within sandbox \"ba8f671fe8dfd8cbeb71d7b00b947032d070d2c8b7c4ed59157b866c0655ab56\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf\"" Jan 23 23:32:15.803685 containerd[1666]: time="2026-01-23T23:32:15.803663377Z" level=info msg="StartContainer for \"4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf\"" Jan 23 23:32:15.804994 containerd[1666]: time="2026-01-23T23:32:15.804947901Z" level=info msg="connecting to shim 4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf" address="unix:///run/containerd/s/1ba2344343608e4747f39bc4ca85f654af0603a50a472c6e1629116ffaa6ce08" protocol=ttrpc version=3 Jan 23 
23:32:15.825294 systemd[1]: Started cri-containerd-4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf.scope - libcontainer container 4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf. Jan 23 23:32:15.835000 audit: BPF prog-id=161 op=LOAD Jan 23 23:32:15.836000 audit: BPF prog-id=162 op=LOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=162 op=UNLOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=163 op=LOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=164 op=LOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=164 op=UNLOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=163 op=UNLOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.836000 audit: BPF prog-id=165 op=LOAD Jan 23 23:32:15.836000 audit[3540]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3347 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:15.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463326439326131306665643664383564383038316235366637336462 Jan 23 23:32:15.862728 containerd[1666]: time="2026-01-23T23:32:15.862692957Z" level=info msg="StartContainer for \"4c2d92a10fed6d85d8081b56f73db25d9cc5254db25909f63e34749b88901bcf\" returns successfully" Jan 23 23:32:16.637219 kubelet[2913]: E0123 23:32:16.637126 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:16.719252 kubelet[2913]: I0123 23:32:16.719047 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-69b8f8cd5f-mskgd" podStartSLOduration=2.388686302 podStartE2EDuration="4.719030209s" podCreationTimestamp="2026-01-23 23:32:12 +0000 UTC" firstStartedPulling="2026-01-23 23:32:13.435970436 +0000 UTC 
m=+20.196612991" lastFinishedPulling="2026-01-23 23:32:15.766314343 +0000 UTC m=+22.526956898" observedRunningTime="2026-01-23 23:32:16.706811612 +0000 UTC m=+23.467454167" watchObservedRunningTime="2026-01-23 23:32:16.719030209 +0000 UTC m=+23.479672764" Jan 23 23:32:16.730000 audit[3584]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3584 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:16.730000 audit[3584]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc8d56120 a2=0 a3=1 items=0 ppid=3074 pid=3584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:16.730000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:16.740000 audit[3584]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3584 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:16.740000 audit[3584]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc8d56120 a2=0 a3=1 items=0 ppid=3074 pid=3584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:16.740000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:16.764814 kubelet[2913]: E0123 23:32:16.764767 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.764814 kubelet[2913]: W0123 23:32:16.764796 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, 
args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.764814 kubelet[2913]: E0123 23:32:16.764818 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.766215 kubelet[2913]: E0123 23:32:16.766089 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.766215 kubelet[2913]: W0123 23:32:16.766111 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.766215 kubelet[2913]: E0123 23:32:16.766125 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.766415 kubelet[2913]: E0123 23:32:16.766404 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.766617 kubelet[2913]: W0123 23:32:16.766459 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.766617 kubelet[2913]: E0123 23:32:16.766474 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.767014 kubelet[2913]: E0123 23:32:16.766878 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.767014 kubelet[2913]: W0123 23:32:16.766911 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.767014 kubelet[2913]: E0123 23:32:16.766924 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.767203 kubelet[2913]: E0123 23:32:16.767191 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.767334 kubelet[2913]: W0123 23:32:16.767244 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.767334 kubelet[2913]: E0123 23:32:16.767264 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.767472 kubelet[2913]: E0123 23:32:16.767463 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.767528 kubelet[2913]: W0123 23:32:16.767513 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.767585 kubelet[2913]: E0123 23:32:16.767575 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.767759 kubelet[2913]: E0123 23:32:16.767749 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.767891 kubelet[2913]: W0123 23:32:16.767811 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.767891 kubelet[2913]: E0123 23:32:16.767826 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.768125 kubelet[2913]: E0123 23:32:16.768114 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.768364 kubelet[2913]: W0123 23:32:16.768187 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.768364 kubelet[2913]: E0123 23:32:16.768219 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.768488 kubelet[2913]: E0123 23:32:16.768472 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.768488 kubelet[2913]: W0123 23:32:16.768483 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.768537 kubelet[2913]: E0123 23:32:16.768494 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.768777 kubelet[2913]: E0123 23:32:16.768758 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769090 kubelet[2913]: W0123 23:32:16.769056 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.769090 kubelet[2913]: E0123 23:32:16.769076 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.769272 kubelet[2913]: E0123 23:32:16.769254 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769272 kubelet[2913]: W0123 23:32:16.769266 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.769318 kubelet[2913]: E0123 23:32:16.769276 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.769471 kubelet[2913]: E0123 23:32:16.769448 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769471 kubelet[2913]: W0123 23:32:16.769465 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.769522 kubelet[2913]: E0123 23:32:16.769473 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.769619 kubelet[2913]: E0123 23:32:16.769604 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769653 kubelet[2913]: W0123 23:32:16.769615 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.769653 kubelet[2913]: E0123 23:32:16.769634 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.769776 kubelet[2913]: E0123 23:32:16.769760 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769808 kubelet[2913]: W0123 23:32:16.769771 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.769808 kubelet[2913]: E0123 23:32:16.769790 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.769929 kubelet[2913]: E0123 23:32:16.769913 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.769929 kubelet[2913]: W0123 23:32:16.769923 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.770003 kubelet[2913]: E0123 23:32:16.769940 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.771869 kubelet[2913]: E0123 23:32:16.771855 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.772070 kubelet[2913]: W0123 23:32:16.771939 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.772070 kubelet[2913]: E0123 23:32:16.771974 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.772195 kubelet[2913]: E0123 23:32:16.772184 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.772242 kubelet[2913]: W0123 23:32:16.772232 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.772288 kubelet[2913]: E0123 23:32:16.772279 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.772556 kubelet[2913]: E0123 23:32:16.772514 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.772556 kubelet[2913]: W0123 23:32:16.772532 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.772556 kubelet[2913]: E0123 23:32:16.772545 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.772706 kubelet[2913]: E0123 23:32:16.772693 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.772706 kubelet[2913]: W0123 23:32:16.772704 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.772750 kubelet[2913]: E0123 23:32:16.772714 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.772863 kubelet[2913]: E0123 23:32:16.772845 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.772863 kubelet[2913]: W0123 23:32:16.772856 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.772927 kubelet[2913]: E0123 23:32:16.772865 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.773085 kubelet[2913]: E0123 23:32:16.773042 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.773085 kubelet[2913]: W0123 23:32:16.773054 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.773085 kubelet[2913]: E0123 23:32:16.773064 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.773377 kubelet[2913]: E0123 23:32:16.773361 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.773377 kubelet[2913]: W0123 23:32:16.773373 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.773475 kubelet[2913]: E0123 23:32:16.773383 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.773932 kubelet[2913]: E0123 23:32:16.773919 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.773973 kubelet[2913]: W0123 23:32:16.773932 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.773973 kubelet[2913]: E0123 23:32:16.773945 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.774222 kubelet[2913]: E0123 23:32:16.774209 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.774262 kubelet[2913]: W0123 23:32:16.774222 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.774262 kubelet[2913]: E0123 23:32:16.774233 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.774412 kubelet[2913]: E0123 23:32:16.774401 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.774439 kubelet[2913]: W0123 23:32:16.774412 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.774439 kubelet[2913]: E0123 23:32:16.774421 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.774580 kubelet[2913]: E0123 23:32:16.774570 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.774580 kubelet[2913]: W0123 23:32:16.774580 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.774638 kubelet[2913]: E0123 23:32:16.774588 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.774731 kubelet[2913]: E0123 23:32:16.774717 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.774731 kubelet[2913]: W0123 23:32:16.774728 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.774779 kubelet[2913]: E0123 23:32:16.774737 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.774897 kubelet[2913]: E0123 23:32:16.774885 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.774897 kubelet[2913]: W0123 23:32:16.774896 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.775094 kubelet[2913]: E0123 23:32:16.774904 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.775207 kubelet[2913]: E0123 23:32:16.775191 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.775258 kubelet[2913]: W0123 23:32:16.775246 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.775310 kubelet[2913]: E0123 23:32:16.775300 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.775604 kubelet[2913]: E0123 23:32:16.775492 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.775604 kubelet[2913]: W0123 23:32:16.775503 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.775604 kubelet[2913]: E0123 23:32:16.775513 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.775747 kubelet[2913]: E0123 23:32:16.775737 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.775797 kubelet[2913]: W0123 23:32:16.775787 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.775842 kubelet[2913]: E0123 23:32:16.775833 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:16.776128 kubelet[2913]: E0123 23:32:16.776108 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.776128 kubelet[2913]: W0123 23:32:16.776122 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.776206 kubelet[2913]: E0123 23:32:16.776137 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 23:32:16.776290 kubelet[2913]: E0123 23:32:16.776277 2913 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 23:32:16.776290 kubelet[2913]: W0123 23:32:16.776288 2913 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 23:32:16.776333 kubelet[2913]: E0123 23:32:16.776296 2913 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 23:32:17.270927 containerd[1666]: time="2026-01-23T23:32:17.270862412Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:17.273245 containerd[1666]: time="2026-01-23T23:32:17.273204339Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:17.274343 containerd[1666]: time="2026-01-23T23:32:17.274317182Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:17.277275 containerd[1666]: time="2026-01-23T23:32:17.277059351Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:17.278285 containerd[1666]: time="2026-01-23T23:32:17.278256674Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.51170749s" Jan 23 23:32:17.278338 containerd[1666]: time="2026-01-23T23:32:17.278289915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 23 23:32:17.282412 containerd[1666]: time="2026-01-23T23:32:17.282375367Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 23 23:32:17.293145 containerd[1666]: time="2026-01-23T23:32:17.293112680Z" level=info msg="Container 1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:17.301098 containerd[1666]: time="2026-01-23T23:32:17.300996624Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564\"" Jan 23 23:32:17.301684 containerd[1666]: time="2026-01-23T23:32:17.301656706Z" level=info msg="StartContainer for \"1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564\"" Jan 23 23:32:17.303137 containerd[1666]: time="2026-01-23T23:32:17.303107110Z" level=info msg="connecting to shim 1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564" address="unix:///run/containerd/s/d86ff896aaf873dbdaea0b6ec6220bcb1944e47bea37bb86656bb1e316470e93" protocol=ttrpc version=3 Jan 23 23:32:17.322154 systemd[1]: Started cri-containerd-1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564.scope - libcontainer container 1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564. 
Jan 23 23:32:17.377000 audit: BPF prog-id=166 op=LOAD Jan 23 23:32:17.377000 audit[3622]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3465 pid=3622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:17.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162653238633264336339373563333561646561393436326338613130 Jan 23 23:32:17.377000 audit: BPF prog-id=167 op=LOAD Jan 23 23:32:17.377000 audit[3622]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3465 pid=3622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:17.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162653238633264336339373563333561646561393436326338613130 Jan 23 23:32:17.377000 audit: BPF prog-id=167 op=UNLOAD Jan 23 23:32:17.377000 audit[3622]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:17.377000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162653238633264336339373563333561646561393436326338613130 Jan 23 23:32:17.377000 audit: BPF prog-id=166 op=UNLOAD Jan 23 23:32:17.377000 audit[3622]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:17.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162653238633264336339373563333561646561393436326338613130 Jan 23 23:32:17.377000 audit: BPF prog-id=168 op=LOAD Jan 23 23:32:17.377000 audit[3622]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3465 pid=3622 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:17.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3162653238633264336339373563333561646561393436326338613130 Jan 23 23:32:17.402669 containerd[1666]: time="2026-01-23T23:32:17.402562814Z" level=info msg="StartContainer for \"1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564\" returns successfully" Jan 23 23:32:17.413021 systemd[1]: cri-containerd-1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564.scope: Deactivated successfully. 
Jan 23 23:32:17.416637 containerd[1666]: time="2026-01-23T23:32:17.416369896Z" level=info msg="received container exit event container_id:\"1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564\" id:\"1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564\" pid:3634 exited_at:{seconds:1769211137 nanos:415949414}" Jan 23 23:32:17.417000 audit: BPF prog-id=168 op=UNLOAD Jan 23 23:32:17.419651 kernel: kauditd_printk_skb: 89 callbacks suppressed Jan 23 23:32:17.419717 kernel: audit: type=1334 audit(1769211137.417:563): prog-id=168 op=UNLOAD Jan 23 23:32:17.437844 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1be28c2d3c975c35adea9462c8a101d4f64b4834b296985d25d165eb3ea4b564-rootfs.mount: Deactivated successfully. Jan 23 23:32:18.636649 kubelet[2913]: E0123 23:32:18.636561 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:20.636540 kubelet[2913]: E0123 23:32:20.636497 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:21.709242 containerd[1666]: time="2026-01-23T23:32:21.709116748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 23:32:22.637001 kubelet[2913]: E0123 23:32:22.636856 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:24.637103 kubelet[2913]: E0123 23:32:24.637033 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:25.198685 containerd[1666]: time="2026-01-23T23:32:25.198623430Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:25.199707 containerd[1666]: time="2026-01-23T23:32:25.199638113Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 23 23:32:25.200299 containerd[1666]: time="2026-01-23T23:32:25.200262995Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:25.206404 containerd[1666]: time="2026-01-23T23:32:25.206349653Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:25.207192 containerd[1666]: time="2026-01-23T23:32:25.207163616Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.497964668s" Jan 23 23:32:25.207240 containerd[1666]: time="2026-01-23T23:32:25.207195616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" 
returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 23 23:32:25.211371 containerd[1666]: time="2026-01-23T23:32:25.211334469Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 23 23:32:25.222855 containerd[1666]: time="2026-01-23T23:32:25.221617740Z" level=info msg="Container bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:25.236735 containerd[1666]: time="2026-01-23T23:32:25.236654986Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913\"" Jan 23 23:32:25.238209 containerd[1666]: time="2026-01-23T23:32:25.237391068Z" level=info msg="StartContainer for \"bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913\"" Jan 23 23:32:25.240144 containerd[1666]: time="2026-01-23T23:32:25.240113196Z" level=info msg="connecting to shim bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913" address="unix:///run/containerd/s/d86ff896aaf873dbdaea0b6ec6220bcb1944e47bea37bb86656bb1e316470e93" protocol=ttrpc version=3 Jan 23 23:32:25.262427 systemd[1]: Started cri-containerd-bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913.scope - libcontainer container bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913. 
Jan 23 23:32:25.323000 audit: BPF prog-id=169 op=LOAD Jan 23 23:32:25.323000 audit[3684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.328855 kernel: audit: type=1334 audit(1769211145.323:564): prog-id=169 op=LOAD Jan 23 23:32:25.328997 kernel: audit: type=1300 audit(1769211145.323:564): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.329022 kernel: audit: type=1327 audit(1769211145.323:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.323000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.323000 audit: BPF prog-id=170 op=LOAD Jan 23 23:32:25.332699 kernel: audit: type=1334 audit(1769211145.323:565): prog-id=170 op=LOAD Jan 23 23:32:25.323000 audit[3684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.323000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.339612 kernel: audit: type=1300 audit(1769211145.323:565): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.339722 kernel: audit: type=1327 audit(1769211145.323:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.339758 kernel: audit: type=1334 audit(1769211145.324:566): prog-id=170 op=UNLOAD Jan 23 23:32:25.324000 audit: BPF prog-id=170 op=UNLOAD Jan 23 23:32:25.324000 audit[3684]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.343839 kernel: audit: type=1300 audit(1769211145.324:566): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.343908 kernel: audit: type=1327 audit(1769211145.324:566): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.324000 audit: BPF prog-id=169 op=UNLOAD Jan 23 23:32:25.348004 kernel: audit: type=1334 audit(1769211145.324:567): prog-id=169 op=UNLOAD Jan 23 23:32:25.324000 audit[3684]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.324000 audit: BPF prog-id=171 op=LOAD Jan 23 23:32:25.324000 audit[3684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3465 pid=3684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:25.324000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264656432626665303763313033663363636133653864626132626538 Jan 23 23:32:25.365555 containerd[1666]: time="2026-01-23T23:32:25.365503419Z" level=info msg="StartContainer for \"bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913\" returns successfully" Jan 23 23:32:26.624417 systemd[1]: cri-containerd-bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913.scope: Deactivated successfully. Jan 23 23:32:26.625133 systemd[1]: cri-containerd-bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913.scope: Consumed 463ms CPU time, 192.6M memory peak, 165.9M written to disk. Jan 23 23:32:26.626976 containerd[1666]: time="2026-01-23T23:32:26.626841866Z" level=info msg="received container exit event container_id:\"bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913\" id:\"bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913\" pid:3696 exited_at:{seconds:1769211146 nanos:626639265}" Jan 23 23:32:26.628000 audit: BPF prog-id=171 op=UNLOAD Jan 23 23:32:26.636978 kubelet[2913]: E0123 23:32:26.636824 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:26.647751 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bded2bfe07c103f3cca3e8dba2be8ea8e76681b3181d0cb3f83483ea8682d913-rootfs.mount: Deactivated successfully. 
Jan 23 23:32:26.650922 kubelet[2913]: I0123 23:32:26.650892 2913 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 23 23:32:27.748250 kubelet[2913]: I0123 23:32:27.748160 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31688273-3f62-4a06-a170-1e0f188f7bf7-config-volume\") pod \"coredns-674b8bbfcf-22wrh\" (UID: \"31688273-3f62-4a06-a170-1e0f188f7bf7\") " pod="kube-system/coredns-674b8bbfcf-22wrh" Jan 23 23:32:27.748250 kubelet[2913]: I0123 23:32:27.748199 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb42f\" (UniqueName: \"kubernetes.io/projected/31688273-3f62-4a06-a170-1e0f188f7bf7-kube-api-access-zb42f\") pod \"coredns-674b8bbfcf-22wrh\" (UID: \"31688273-3f62-4a06-a170-1e0f188f7bf7\") " pod="kube-system/coredns-674b8bbfcf-22wrh" Jan 23 23:32:27.748211 systemd[1]: Created slice kubepods-burstable-pod31688273_3f62_4a06_a170_1e0f188f7bf7.slice - libcontainer container kubepods-burstable-pod31688273_3f62_4a06_a170_1e0f188f7bf7.slice. Jan 23 23:32:27.930876 systemd[1]: Created slice kubepods-burstable-pod9a039bd1_1840_4663_bae3_65c063cc9185.slice - libcontainer container kubepods-burstable-pod9a039bd1_1840_4663_bae3_65c063cc9185.slice. 
Jan 23 23:32:27.949921 kubelet[2913]: I0123 23:32:27.949886 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a039bd1-1840-4663-bae3-65c063cc9185-config-volume\") pod \"coredns-674b8bbfcf-dqzmb\" (UID: \"9a039bd1-1840-4663-bae3-65c063cc9185\") " pod="kube-system/coredns-674b8bbfcf-dqzmb" Jan 23 23:32:27.949921 kubelet[2913]: I0123 23:32:27.949924 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glfmz\" (UniqueName: \"kubernetes.io/projected/9a039bd1-1840-4663-bae3-65c063cc9185-kube-api-access-glfmz\") pod \"coredns-674b8bbfcf-dqzmb\" (UID: \"9a039bd1-1840-4663-bae3-65c063cc9185\") " pod="kube-system/coredns-674b8bbfcf-dqzmb" Jan 23 23:32:27.985098 systemd[1]: Created slice kubepods-besteffort-pod1d648e36_1a1c_46cb_8b04_91769885543b.slice - libcontainer container kubepods-besteffort-pod1d648e36_1a1c_46cb_8b04_91769885543b.slice. Jan 23 23:32:27.993592 systemd[1]: Created slice kubepods-besteffort-poda69a1122_8e77_47a0_ac55_a81fea68c3e7.slice - libcontainer container kubepods-besteffort-poda69a1122_8e77_47a0_ac55_a81fea68c3e7.slice. Jan 23 23:32:27.997849 systemd[1]: Created slice kubepods-besteffort-pod62170b86_7ada_415b_964e_cbba9c446534.slice - libcontainer container kubepods-besteffort-pod62170b86_7ada_415b_964e_cbba9c446534.slice. 
Jan 23 23:32:27.998904 containerd[1666]: time="2026-01-23T23:32:27.998563809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfnpf,Uid:a69a1122-8e77-47a0-ac55-a81fea68c3e7,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:28.051053 kubelet[2913]: I0123 23:32:28.050988 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d648e36-1a1c-46cb-8b04-91769885543b-tigera-ca-bundle\") pod \"calico-kube-controllers-68f445db44-27ndm\" (UID: \"1d648e36-1a1c-46cb-8b04-91769885543b\") " pod="calico-system/calico-kube-controllers-68f445db44-27ndm" Jan 23 23:32:28.051053 kubelet[2913]: I0123 23:32:28.051040 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4rsv\" (UniqueName: \"kubernetes.io/projected/62170b86-7ada-415b-964e-cbba9c446534-kube-api-access-w4rsv\") pod \"whisker-5b4dc988f8-bj7zx\" (UID: \"62170b86-7ada-415b-964e-cbba9c446534\") " pod="calico-system/whisker-5b4dc988f8-bj7zx" Jan 23 23:32:28.059105 kubelet[2913]: I0123 23:32:28.051078 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-484f6\" (UniqueName: \"kubernetes.io/projected/1d648e36-1a1c-46cb-8b04-91769885543b-kube-api-access-484f6\") pod \"calico-kube-controllers-68f445db44-27ndm\" (UID: \"1d648e36-1a1c-46cb-8b04-91769885543b\") " pod="calico-system/calico-kube-controllers-68f445db44-27ndm" Jan 23 23:32:28.059105 kubelet[2913]: I0123 23:32:28.051095 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62170b86-7ada-415b-964e-cbba9c446534-whisker-ca-bundle\") pod \"whisker-5b4dc988f8-bj7zx\" (UID: \"62170b86-7ada-415b-964e-cbba9c446534\") " pod="calico-system/whisker-5b4dc988f8-bj7zx" Jan 23 23:32:28.059105 kubelet[2913]: I0123 
23:32:28.051123 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62170b86-7ada-415b-964e-cbba9c446534-whisker-backend-key-pair\") pod \"whisker-5b4dc988f8-bj7zx\" (UID: \"62170b86-7ada-415b-964e-cbba9c446534\") " pod="calico-system/whisker-5b4dc988f8-bj7zx" Jan 23 23:32:28.059397 containerd[1666]: time="2026-01-23T23:32:28.059367194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22wrh,Uid:31688273-3f62-4a06-a170-1e0f188f7bf7,Namespace:kube-system,Attempt:0,}" Jan 23 23:32:28.111272 systemd[1]: Created slice kubepods-besteffort-pod8525930c_a129_42c2_8aaf_49aa89f532c7.slice - libcontainer container kubepods-besteffort-pod8525930c_a129_42c2_8aaf_49aa89f532c7.slice. Jan 23 23:32:28.121473 systemd[1]: Created slice kubepods-besteffort-podaece08d9_f39f_4025_afe7_4d9ae33375fe.slice - libcontainer container kubepods-besteffort-podaece08d9_f39f_4025_afe7_4d9ae33375fe.slice. Jan 23 23:32:28.130364 systemd[1]: Created slice kubepods-besteffort-pod97acd778_d005_41f9_8db5_25f87a68c090.slice - libcontainer container kubepods-besteffort-pod97acd778_d005_41f9_8db5_25f87a68c090.slice. 
Jan 23 23:32:28.152644 kubelet[2913]: I0123 23:32:28.152586 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/aece08d9-f39f-4025-afe7-4d9ae33375fe-calico-apiserver-certs\") pod \"calico-apiserver-579f6fb948-2l9kd\" (UID: \"aece08d9-f39f-4025-afe7-4d9ae33375fe\") " pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" Jan 23 23:32:28.152805 kubelet[2913]: I0123 23:32:28.152789 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/97acd778-d005-41f9-8db5-25f87a68c090-calico-apiserver-certs\") pod \"calico-apiserver-579f6fb948-m6qjq\" (UID: \"97acd778-d005-41f9-8db5-25f87a68c090\") " pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" Jan 23 23:32:28.152906 kubelet[2913]: I0123 23:32:28.152893 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8525930c-a129-42c2-8aaf-49aa89f532c7-goldmane-ca-bundle\") pod \"goldmane-666569f655-52tqh\" (UID: \"8525930c-a129-42c2-8aaf-49aa89f532c7\") " pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.153011 kubelet[2913]: I0123 23:32:28.152997 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8525930c-a129-42c2-8aaf-49aa89f532c7-goldmane-key-pair\") pod \"goldmane-666569f655-52tqh\" (UID: \"8525930c-a129-42c2-8aaf-49aa89f532c7\") " pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.153143 kubelet[2913]: I0123 23:32:28.153111 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdrj5\" (UniqueName: \"kubernetes.io/projected/8525930c-a129-42c2-8aaf-49aa89f532c7-kube-api-access-fdrj5\") pod 
\"goldmane-666569f655-52tqh\" (UID: \"8525930c-a129-42c2-8aaf-49aa89f532c7\") " pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.153180 kubelet[2913]: I0123 23:32:28.153156 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrkn\" (UniqueName: \"kubernetes.io/projected/aece08d9-f39f-4025-afe7-4d9ae33375fe-kube-api-access-mbrkn\") pod \"calico-apiserver-579f6fb948-2l9kd\" (UID: \"aece08d9-f39f-4025-afe7-4d9ae33375fe\") " pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" Jan 23 23:32:28.153255 kubelet[2913]: I0123 23:32:28.153238 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gkz2\" (UniqueName: \"kubernetes.io/projected/97acd778-d005-41f9-8db5-25f87a68c090-kube-api-access-8gkz2\") pod \"calico-apiserver-579f6fb948-m6qjq\" (UID: \"97acd778-d005-41f9-8db5-25f87a68c090\") " pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" Jan 23 23:32:28.153375 kubelet[2913]: I0123 23:32:28.153342 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8525930c-a129-42c2-8aaf-49aa89f532c7-config\") pod \"goldmane-666569f655-52tqh\" (UID: \"8525930c-a129-42c2-8aaf-49aa89f532c7\") " pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.178006 containerd[1666]: time="2026-01-23T23:32:28.177929756Z" level=error msg="Failed to destroy network for sandbox \"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.180125 containerd[1666]: time="2026-01-23T23:32:28.180084043Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-rfnpf,Uid:a69a1122-8e77-47a0-ac55-a81fea68c3e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.180502 kubelet[2913]: E0123 23:32:28.180449 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.180585 kubelet[2913]: E0123 23:32:28.180531 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:28.180585 kubelet[2913]: E0123 23:32:28.180552 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfnpf" Jan 23 23:32:28.180664 kubelet[2913]: E0123 23:32:28.180606 2913 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec7b7bd5d39026ded0d2dc020075ac2aa78cc67eaa824ea99dec91f92307816b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:28.187851 containerd[1666]: time="2026-01-23T23:32:28.187784266Z" level=error msg="Failed to destroy network for sandbox \"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.190145 containerd[1666]: time="2026-01-23T23:32:28.190097833Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22wrh,Uid:31688273-3f62-4a06-a170-1e0f188f7bf7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.190831 kubelet[2913]: E0123 23:32:28.190331 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.190831 kubelet[2913]: E0123 23:32:28.190384 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-22wrh" Jan 23 23:32:28.190831 kubelet[2913]: E0123 23:32:28.190403 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-22wrh" Jan 23 23:32:28.190975 kubelet[2913]: E0123 23:32:28.190455 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-22wrh_kube-system(31688273-3f62-4a06-a170-1e0f188f7bf7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-22wrh_kube-system(31688273-3f62-4a06-a170-1e0f188f7bf7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"54dc99f1534cf141379ba1b189196ab9ef20046148f129f27c51738d99bb44d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-22wrh" podUID="31688273-3f62-4a06-a170-1e0f188f7bf7" Jan 23 23:32:28.258081 containerd[1666]: time="2026-01-23T23:32:28.257973960Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-dqzmb,Uid:9a039bd1-1840-4663-bae3-65c063cc9185,Namespace:kube-system,Attempt:0,}" Jan 23 23:32:28.291660 containerd[1666]: time="2026-01-23T23:32:28.291607343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68f445db44-27ndm,Uid:1d648e36-1a1c-46cb-8b04-91769885543b,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:28.301987 containerd[1666]: time="2026-01-23T23:32:28.301551333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b4dc988f8-bj7zx,Uid:62170b86-7ada-415b-964e-cbba9c446534,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:28.313316 containerd[1666]: time="2026-01-23T23:32:28.313095208Z" level=error msg="Failed to destroy network for sandbox \"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.326497 containerd[1666]: time="2026-01-23T23:32:28.326434609Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dqzmb,Uid:9a039bd1-1840-4663-bae3-65c063cc9185,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.328147 kubelet[2913]: E0123 23:32:28.328082 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Jan 23 23:32:28.328220 kubelet[2913]: E0123 23:32:28.328157 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dqzmb" Jan 23 23:32:28.328220 kubelet[2913]: E0123 23:32:28.328179 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dqzmb" Jan 23 23:32:28.328267 kubelet[2913]: E0123 23:32:28.328236 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dqzmb_kube-system(9a039bd1-1840-4663-bae3-65c063cc9185)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dqzmb_kube-system(9a039bd1-1840-4663-bae3-65c063cc9185)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e6d4ab522dc204f25bd78714f2639b57649e465e484af117607bac4911492fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dqzmb" podUID="9a039bd1-1840-4663-bae3-65c063cc9185" Jan 23 23:32:28.340980 containerd[1666]: time="2026-01-23T23:32:28.340911293Z" level=error msg="Failed to destroy network for sandbox 
\"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.344184 containerd[1666]: time="2026-01-23T23:32:28.343494981Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68f445db44-27ndm,Uid:1d648e36-1a1c-46cb-8b04-91769885543b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.344358 kubelet[2913]: E0123 23:32:28.343722 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.344358 kubelet[2913]: E0123 23:32:28.343804 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" Jan 23 23:32:28.344358 kubelet[2913]: E0123 23:32:28.343824 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" Jan 23 23:32:28.344445 kubelet[2913]: E0123 23:32:28.343884 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8cb8644c3ba03e54b51cbf95bcb53d2392b8d456ae746fbb90a0621af0c4a69b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:32:28.356565 containerd[1666]: time="2026-01-23T23:32:28.356399620Z" level=error msg="Failed to destroy network for sandbox \"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.358856 containerd[1666]: time="2026-01-23T23:32:28.358817148Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b4dc988f8-bj7zx,Uid:62170b86-7ada-415b-964e-cbba9c446534,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.359216 kubelet[2913]: E0123 23:32:28.359185 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.359362 kubelet[2913]: E0123 23:32:28.359339 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b4dc988f8-bj7zx" Jan 23 23:32:28.359427 kubelet[2913]: E0123 23:32:28.359413 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b4dc988f8-bj7zx" Jan 23 23:32:28.359570 kubelet[2913]: E0123 23:32:28.359522 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b4dc988f8-bj7zx_calico-system(62170b86-7ada-415b-964e-cbba9c446534)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b4dc988f8-bj7zx_calico-system(62170b86-7ada-415b-964e-cbba9c446534)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"5366a39f60c1ef553274bade8c6bb2087223003631e629c92ef6d2800157dd99\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b4dc988f8-bj7zx" podUID="62170b86-7ada-415b-964e-cbba9c446534" Jan 23 23:32:28.417004 containerd[1666]: time="2026-01-23T23:32:28.416930125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-52tqh,Uid:8525930c-a129-42c2-8aaf-49aa89f532c7,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:28.429295 containerd[1666]: time="2026-01-23T23:32:28.429257123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-2l9kd,Uid:aece08d9-f39f-4025-afe7-4d9ae33375fe,Namespace:calico-apiserver,Attempt:0,}" Jan 23 23:32:28.435200 containerd[1666]: time="2026-01-23T23:32:28.435116780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-m6qjq,Uid:97acd778-d005-41f9-8db5-25f87a68c090,Namespace:calico-apiserver,Attempt:0,}" Jan 23 23:32:28.474187 containerd[1666]: time="2026-01-23T23:32:28.474084939Z" level=error msg="Failed to destroy network for sandbox \"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.476331 containerd[1666]: time="2026-01-23T23:32:28.476281906Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-52tqh,Uid:8525930c-a129-42c2-8aaf-49aa89f532c7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.476613 kubelet[2913]: E0123 23:32:28.476512 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.476613 kubelet[2913]: E0123 23:32:28.476596 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.476704 kubelet[2913]: E0123 23:32:28.476615 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-52tqh" Jan 23 23:32:28.476704 kubelet[2913]: E0123 23:32:28.476678 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"9c3cab53f450b2e757fdd79ba4331517c1ceded7be08af903c92e017686c3697\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:32:28.488501 containerd[1666]: time="2026-01-23T23:32:28.488451583Z" level=error msg="Failed to destroy network for sandbox \"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.492581 containerd[1666]: time="2026-01-23T23:32:28.492460835Z" level=error msg="Failed to destroy network for sandbox \"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.493981 containerd[1666]: time="2026-01-23T23:32:28.493925960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-2l9kd,Uid:aece08d9-f39f-4025-afe7-4d9ae33375fe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.494177 kubelet[2913]: E0123 23:32:28.494134 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.494248 kubelet[2913]: E0123 23:32:28.494194 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" Jan 23 23:32:28.494248 kubelet[2913]: E0123 23:32:28.494214 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" Jan 23 23:32:28.494299 kubelet[2913]: E0123 23:32:28.494268 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0cc4a67ceed207ef0b3ecc2c53d6200555fd6f9e6479468af37c607559c088dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" 
podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:32:28.496863 containerd[1666]: time="2026-01-23T23:32:28.496804809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-m6qjq,Uid:97acd778-d005-41f9-8db5-25f87a68c090,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.497094 kubelet[2913]: E0123 23:32:28.497054 2913 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 23:32:28.497144 kubelet[2913]: E0123 23:32:28.497108 2913 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" Jan 23 23:32:28.497144 kubelet[2913]: E0123 23:32:28.497128 2913 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" Jan 23 23:32:28.497194 kubelet[2913]: E0123 23:32:28.497171 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c9d7604edb37ac0ead86a2dfb1737a6010698fb05ab5c6c69bd2cf66980bb9d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:32:28.729701 containerd[1666]: time="2026-01-23T23:32:28.729664519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 23:32:28.903066 systemd[1]: run-netns-cni\x2d0e12ded0\x2dfde0\x2dacea\x2d9e59\x2def756e0aaa69.mount: Deactivated successfully. Jan 23 23:32:28.903434 systemd[1]: run-netns-cni\x2d84aeb48f\x2d8fd9\x2df5c8\x2d3bfa\x2dcf7dc05dc64b.mount: Deactivated successfully. Jan 23 23:32:35.546297 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount756247783.mount: Deactivated successfully. 
Jan 23 23:32:35.571139 containerd[1666]: time="2026-01-23T23:32:35.571023423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:35.572156 containerd[1666]: time="2026-01-23T23:32:35.572109467Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 23 23:32:35.573246 containerd[1666]: time="2026-01-23T23:32:35.573219830Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:35.575049 containerd[1666]: time="2026-01-23T23:32:35.575012955Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 23:32:35.575557 containerd[1666]: time="2026-01-23T23:32:35.575531157Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.845819038s" Jan 23 23:32:35.575589 containerd[1666]: time="2026-01-23T23:32:35.575560237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 23 23:32:35.586529 containerd[1666]: time="2026-01-23T23:32:35.586485190Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 23:32:35.596839 containerd[1666]: time="2026-01-23T23:32:35.596503941Z" level=info msg="Container 
be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:35.608732 containerd[1666]: time="2026-01-23T23:32:35.608683098Z" level=info msg="CreateContainer within sandbox \"2c2e18829e70e1ae03048c14f270227f792275a021f944757711efd3f040868e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046\"" Jan 23 23:32:35.609315 containerd[1666]: time="2026-01-23T23:32:35.609275220Z" level=info msg="StartContainer for \"be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046\"" Jan 23 23:32:35.611010 containerd[1666]: time="2026-01-23T23:32:35.610977665Z" level=info msg="connecting to shim be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046" address="unix:///run/containerd/s/d86ff896aaf873dbdaea0b6ec6220bcb1944e47bea37bb86656bb1e316470e93" protocol=ttrpc version=3 Jan 23 23:32:35.638335 systemd[1]: Started cri-containerd-be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046.scope - libcontainer container be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046. 
Jan 23 23:32:35.696649 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 23 23:32:35.696750 kernel: audit: type=1334 audit(1769211155.693:570): prog-id=172 op=LOAD Jan 23 23:32:35.693000 audit: BPF prog-id=172 op=LOAD Jan 23 23:32:35.693000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.704598 kernel: audit: type=1300 audit(1769211155.693:570): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.704697 kernel: audit: type=1327 audit(1769211155.693:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.705557 kernel: audit: type=1334 audit(1769211155.693:571): prog-id=173 op=LOAD Jan 23 23:32:35.693000 audit: BPF prog-id=173 op=LOAD Jan 23 23:32:35.709324 kernel: audit: type=1300 audit(1769211155.693:571): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.693000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.712721 kernel: audit: type=1327 audit(1769211155.693:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.694000 audit: BPF prog-id=173 op=UNLOAD Jan 23 23:32:35.715260 kernel: audit: type=1334 audit(1769211155.694:572): prog-id=173 op=UNLOAD Jan 23 23:32:35.715296 kernel: audit: type=1300 audit(1769211155.694:572): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.694000 audit[4013]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.694000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.721367 kernel: audit: type=1327 audit(1769211155.694:572): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.721425 kernel: audit: type=1334 audit(1769211155.694:573): prog-id=172 op=UNLOAD Jan 23 23:32:35.694000 audit: BPF prog-id=172 op=UNLOAD Jan 23 23:32:35.694000 audit[4013]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.694000 audit: BPF prog-id=174 op=LOAD Jan 23 23:32:35.694000 audit[4013]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3465 pid=4013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:35.694000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265386564366639633732653533336332346236636632613838326362 Jan 23 23:32:35.732379 containerd[1666]: time="2026-01-23T23:32:35.732335755Z" level=info msg="StartContainer for \"be8ed6f9c72e533c24b6cf2a882cbefd6a5271828afa02ce3df7e358152c6046\" returns successfully" Jan 23 23:32:35.867877 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 23:32:35.867993 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 23 23:32:35.979999 kubelet[2913]: I0123 23:32:35.978924 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-zhpk5" podStartSLOduration=0.982839263 podStartE2EDuration="22.978908187s" podCreationTimestamp="2026-01-23 23:32:13 +0000 UTC" firstStartedPulling="2026-01-23 23:32:13.580308836 +0000 UTC m=+20.340951351" lastFinishedPulling="2026-01-23 23:32:35.57637772 +0000 UTC m=+42.337020275" observedRunningTime="2026-01-23 23:32:35.759195317 +0000 UTC m=+42.519837872" watchObservedRunningTime="2026-01-23 23:32:35.978908187 +0000 UTC m=+42.739550742" Jan 23 23:32:36.006768 kubelet[2913]: I0123 23:32:36.006714 2913 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4rsv\" (UniqueName: \"kubernetes.io/projected/62170b86-7ada-415b-964e-cbba9c446534-kube-api-access-w4rsv\") pod \"62170b86-7ada-415b-964e-cbba9c446534\" (UID: \"62170b86-7ada-415b-964e-cbba9c446534\") " Jan 23 23:32:36.007035 kubelet[2913]: I0123 23:32:36.006931 2913 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62170b86-7ada-415b-964e-cbba9c446534-whisker-ca-bundle\") pod \"62170b86-7ada-415b-964e-cbba9c446534\" (UID: 
\"62170b86-7ada-415b-964e-cbba9c446534\") " Jan 23 23:32:36.007399 kubelet[2913]: I0123 23:32:36.007097 2913 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62170b86-7ada-415b-964e-cbba9c446534-whisker-backend-key-pair\") pod \"62170b86-7ada-415b-964e-cbba9c446534\" (UID: \"62170b86-7ada-415b-964e-cbba9c446534\") " Jan 23 23:32:36.007792 kubelet[2913]: I0123 23:32:36.007757 2913 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62170b86-7ada-415b-964e-cbba9c446534-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "62170b86-7ada-415b-964e-cbba9c446534" (UID: "62170b86-7ada-415b-964e-cbba9c446534"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 23:32:36.011550 kubelet[2913]: I0123 23:32:36.011475 2913 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62170b86-7ada-415b-964e-cbba9c446534-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "62170b86-7ada-415b-964e-cbba9c446534" (UID: "62170b86-7ada-415b-964e-cbba9c446534"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 23:32:36.013001 kubelet[2913]: I0123 23:32:36.012684 2913 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62170b86-7ada-415b-964e-cbba9c446534-kube-api-access-w4rsv" (OuterVolumeSpecName: "kube-api-access-w4rsv") pod "62170b86-7ada-415b-964e-cbba9c446534" (UID: "62170b86-7ada-415b-964e-cbba9c446534"). InnerVolumeSpecName "kube-api-access-w4rsv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 23:32:36.108391 kubelet[2913]: I0123 23:32:36.108305 2913 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4rsv\" (UniqueName: \"kubernetes.io/projected/62170b86-7ada-415b-964e-cbba9c446534-kube-api-access-w4rsv\") on node \"ci-4593-0-0-1-266c03b17e\" DevicePath \"\"" Jan 23 23:32:36.108391 kubelet[2913]: I0123 23:32:36.108358 2913 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62170b86-7ada-415b-964e-cbba9c446534-whisker-ca-bundle\") on node \"ci-4593-0-0-1-266c03b17e\" DevicePath \"\"" Jan 23 23:32:36.108391 kubelet[2913]: I0123 23:32:36.108370 2913 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62170b86-7ada-415b-964e-cbba9c446534-whisker-backend-key-pair\") on node \"ci-4593-0-0-1-266c03b17e\" DevicePath \"\"" Jan 23 23:32:36.547663 systemd[1]: var-lib-kubelet-pods-62170b86\x2d7ada\x2d415b\x2d964e\x2dcbba9c446534-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dw4rsv.mount: Deactivated successfully. Jan 23 23:32:36.547762 systemd[1]: var-lib-kubelet-pods-62170b86\x2d7ada\x2d415b\x2d964e\x2dcbba9c446534-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 23 23:32:36.750988 systemd[1]: Removed slice kubepods-besteffort-pod62170b86_7ada_415b_964e_cbba9c446534.slice - libcontainer container kubepods-besteffort-pod62170b86_7ada_415b_964e_cbba9c446534.slice. Jan 23 23:32:36.806826 systemd[1]: Created slice kubepods-besteffort-podba9f6f29_6ccf_4464_bff0_93a0f3e2b483.slice - libcontainer container kubepods-besteffort-podba9f6f29_6ccf_4464_bff0_93a0f3e2b483.slice. 
Jan 23 23:32:36.813787 kubelet[2913]: I0123 23:32:36.813739 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98wx\" (UniqueName: \"kubernetes.io/projected/ba9f6f29-6ccf-4464-bff0-93a0f3e2b483-kube-api-access-x98wx\") pod \"whisker-5c66cc49b-5j6h9\" (UID: \"ba9f6f29-6ccf-4464-bff0-93a0f3e2b483\") " pod="calico-system/whisker-5c66cc49b-5j6h9" Jan 23 23:32:36.813787 kubelet[2913]: I0123 23:32:36.813792 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ba9f6f29-6ccf-4464-bff0-93a0f3e2b483-whisker-backend-key-pair\") pod \"whisker-5c66cc49b-5j6h9\" (UID: \"ba9f6f29-6ccf-4464-bff0-93a0f3e2b483\") " pod="calico-system/whisker-5c66cc49b-5j6h9" Jan 23 23:32:36.813942 kubelet[2913]: I0123 23:32:36.813822 2913 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9f6f29-6ccf-4464-bff0-93a0f3e2b483-whisker-ca-bundle\") pod \"whisker-5c66cc49b-5j6h9\" (UID: \"ba9f6f29-6ccf-4464-bff0-93a0f3e2b483\") " pod="calico-system/whisker-5c66cc49b-5j6h9" Jan 23 23:32:37.111683 containerd[1666]: time="2026-01-23T23:32:37.111611682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c66cc49b-5j6h9,Uid:ba9f6f29-6ccf-4464-bff0-93a0f3e2b483,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:37.292681 systemd-networkd[1577]: calif3b1f5ed54f: Link UP Jan 23 23:32:37.292834 systemd-networkd[1577]: calif3b1f5ed54f: Gained carrier Jan 23 23:32:37.316468 containerd[1666]: 2026-01-23 23:32:37.133 [INFO][4078] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 23:32:37.316468 containerd[1666]: 2026-01-23 23:32:37.156 [INFO][4078] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0 whisker-5c66cc49b- calico-system ba9f6f29-6ccf-4464-bff0-93a0f3e2b483 897 0 2026-01-23 23:32:36 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5c66cc49b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e whisker-5c66cc49b-5j6h9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif3b1f5ed54f [] [] }} ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-" Jan 23 23:32:37.316468 containerd[1666]: 2026-01-23 23:32:37.156 [INFO][4078] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.316468 containerd[1666]: 2026-01-23 23:32:37.227 [INFO][4130] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" HandleID="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Workload="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.227 [INFO][4130] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" HandleID="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Workload="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003c0ac0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4593-0-0-1-266c03b17e", "pod":"whisker-5c66cc49b-5j6h9", "timestamp":"2026-01-23 23:32:37.227277435 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.227 [INFO][4130] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.227 [INFO][4130] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.227 [INFO][4130] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.239 [INFO][4130] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.248 [INFO][4130] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.253 [INFO][4130] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.257 [INFO][4130] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316698 containerd[1666]: 2026-01-23 23:32:37.259 [INFO][4130] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.260 [INFO][4130] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 
handle="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.262 [INFO][4130] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8 Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.270 [INFO][4130] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.277 [INFO][4130] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.129/26] block=192.168.49.128/26 handle="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.277 [INFO][4130] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.129/26] handle="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.277 [INFO][4130] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 23:32:37.316892 containerd[1666]: 2026-01-23 23:32:37.277 [INFO][4130] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.129/26] IPv6=[] ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" HandleID="k8s-pod-network.f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Workload="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.317099 containerd[1666]: 2026-01-23 23:32:37.282 [INFO][4078] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0", GenerateName:"whisker-5c66cc49b-", Namespace:"calico-system", SelfLink:"", UID:"ba9f6f29-6ccf-4464-bff0-93a0f3e2b483", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c66cc49b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"whisker-5c66cc49b-5j6h9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.49.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calif3b1f5ed54f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:37.317099 containerd[1666]: 2026-01-23 23:32:37.282 [INFO][4078] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.129/32] ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.317177 containerd[1666]: 2026-01-23 23:32:37.282 [INFO][4078] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif3b1f5ed54f ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.317177 containerd[1666]: 2026-01-23 23:32:37.293 [INFO][4078] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.317406 containerd[1666]: 2026-01-23 23:32:37.294 [INFO][4078] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0", GenerateName:"whisker-5c66cc49b-", Namespace:"calico-system", SelfLink:"", 
UID:"ba9f6f29-6ccf-4464-bff0-93a0f3e2b483", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c66cc49b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8", Pod:"whisker-5c66cc49b-5j6h9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.49.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif3b1f5ed54f", MAC:"82:44:3b:31:74:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:37.317463 containerd[1666]: 2026-01-23 23:32:37.313 [INFO][4078] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" Namespace="calico-system" Pod="whisker-5c66cc49b-5j6h9" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-whisker--5c66cc49b--5j6h9-eth0" Jan 23 23:32:37.354462 containerd[1666]: time="2026-01-23T23:32:37.354410862Z" level=info msg="connecting to shim f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8" address="unix:///run/containerd/s/7ce349bcce9a703b4df14fcd62ad547ff4f793c30891cea94824252617fe12f2" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:37.395257 systemd[1]: Started 
cri-containerd-f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8.scope - libcontainer container f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8. Jan 23 23:32:37.407000 audit: BPF prog-id=175 op=LOAD Jan 23 23:32:37.407000 audit[4277]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd1b00588 a2=98 a3=ffffd1b00578 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.407000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.407000 audit: BPF prog-id=175 op=UNLOAD Jan 23 23:32:37.407000 audit[4277]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd1b00558 a3=0 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.407000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.407000 audit: BPF prog-id=176 op=LOAD Jan 23 23:32:37.407000 audit[4277]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd1b00438 a2=74 a3=95 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.407000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.407000 audit: BPF prog-id=176 op=UNLOAD Jan 23 23:32:37.407000 audit[4277]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.407000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.408000 audit: BPF prog-id=177 op=LOAD Jan 23 23:32:37.408000 audit[4277]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd1b00468 a2=40 a3=ffffd1b00498 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.408000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.408000 audit: BPF prog-id=177 op=UNLOAD Jan 23 23:32:37.408000 audit[4277]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd1b00498 items=0 ppid=4112 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.408000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 23:32:37.409000 audit: BPF prog-id=178 op=LOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffe713378 a2=98 a3=fffffe713368 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.409000 audit: BPF prog-id=178 op=UNLOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffffe713348 a3=0 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.409000 audit: BPF prog-id=179 op=LOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe713008 a2=74 a3=95 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.409000 audit: BPF prog-id=179 op=UNLOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4112 
pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.409000 audit: BPF prog-id=180 op=LOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe713068 a2=94 a3=2 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.409000 audit: BPF prog-id=180 op=UNLOAD Jan 23 23:32:37.409000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.410000 audit: BPF prog-id=181 op=LOAD Jan 23 23:32:37.410000 audit: BPF prog-id=182 op=LOAD Jan 23 23:32:37.410000 audit[4249]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 
23:32:37.410000 audit: BPF prog-id=182 op=UNLOAD Jan 23 23:32:37.410000 audit[4249]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.410000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.411000 audit: BPF prog-id=183 op=LOAD Jan 23 23:32:37.411000 audit[4249]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.411000 audit: BPF prog-id=184 op=LOAD Jan 23 23:32:37.411000 audit[4249]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.411000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.411000 audit: BPF prog-id=184 op=UNLOAD Jan 23 23:32:37.411000 audit[4249]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.411000 audit: BPF prog-id=183 op=UNLOAD Jan 23 23:32:37.411000 audit[4249]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.412000 audit: BPF prog-id=185 op=LOAD Jan 23 23:32:37.412000 audit[4249]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4237 pid=4249 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:37.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637313033656136316535623031666261306438303464613333326432 Jan 23 23:32:37.442991 containerd[1666]: time="2026-01-23T23:32:37.442930292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c66cc49b-5j6h9,Uid:ba9f6f29-6ccf-4464-bff0-93a0f3e2b483,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7103ea61e5b01fba0d804da332d216f068e9008ee20af97b486cbf3f76933a8\"" Jan 23 23:32:37.444514 containerd[1666]: time="2026-01-23T23:32:37.444291176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 23:32:37.512000 audit: BPF prog-id=186 op=LOAD Jan 23 23:32:37.512000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffe713028 a2=40 a3=fffffe713058 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.512000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.512000 audit: BPF prog-id=186 op=UNLOAD Jan 23 23:32:37.512000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffffe713058 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.512000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.521000 audit: BPF prog-id=187 op=LOAD Jan 23 23:32:37.521000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe713038 a2=94 a3=4 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.521000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=187 op=UNLOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=188 op=LOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffffe712e78 a2=94 a3=5 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=188 op=UNLOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=189 op=LOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe7130a8 a2=94 a3=6 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=189 op=UNLOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=190 op=LOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffe712878 a2=94 a3=83 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=191 op=LOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffffe712638 a2=94 a3=2 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.522000 audit: BPF prog-id=191 op=UNLOAD Jan 23 23:32:37.522000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
23:32:37.522000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.523000 audit: BPF prog-id=190 op=UNLOAD Jan 23 23:32:37.523000 audit[4278]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1db3f620 a3=1db32b00 items=0 ppid=4112 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.523000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 23:32:37.532000 audit: BPF prog-id=192 op=LOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff0f3b3f8 a2=98 a3=fffff0f3b3e8 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.532000 audit: BPF prog-id=192 op=UNLOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff0f3b3c8 a3=0 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.532000 
audit: BPF prog-id=193 op=LOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff0f3b2a8 a2=74 a3=95 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.532000 audit: BPF prog-id=193 op=UNLOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.532000 audit: BPF prog-id=194 op=LOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff0f3b2d8 a2=40 a3=fffff0f3b308 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.532000 audit: BPF prog-id=194 op=UNLOAD Jan 23 23:32:37.532000 audit[4287]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=fffff0f3b308 items=0 ppid=4112 pid=4287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.532000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 23:32:37.618283 systemd-networkd[1577]: vxlan.calico: Link UP Jan 23 23:32:37.618334 systemd-networkd[1577]: vxlan.calico: Gained carrier Jan 23 23:32:37.635000 audit: BPF prog-id=195 op=LOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcac705b8 a2=98 a3=ffffcac705a8 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=195 op=UNLOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcac70588 a3=0 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=196 op=LOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcac70298 a2=74 a3=95 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=196 op=UNLOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=197 op=LOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcac702f8 a2=94 a3=2 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=197 op=UNLOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=198 op=LOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcac70178 a2=40 a3=ffffcac701a8 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=198 op=UNLOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffcac701a8 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=199 op=LOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcac702c8 a2=94 a3=b7 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.635000 audit: BPF prog-id=199 op=UNLOAD Jan 23 23:32:37.635000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.635000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.636000 audit: BPF prog-id=200 op=LOAD Jan 23 23:32:37.636000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcac6f978 a2=94 a3=2 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.636000 
audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.636000 audit: BPF prog-id=200 op=UNLOAD Jan 23 23:32:37.636000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.636000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.636000 audit: BPF prog-id=201 op=LOAD Jan 23 23:32:37.636000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcac6fb08 a2=94 a3=30 items=0 ppid=4112 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.636000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 23:32:37.638000 audit: BPF prog-id=202 op=LOAD Jan 23 23:32:37.638000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe068b458 a2=98 a3=ffffe068b448 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.638000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.638000 audit: BPF prog-id=202 op=UNLOAD Jan 23 23:32:37.638000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe068b428 a3=0 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.638000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.638000 audit: BPF prog-id=203 op=LOAD Jan 23 23:32:37.638000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe068b0e8 a2=74 a3=95 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.638000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.638000 audit: BPF prog-id=203 op=UNLOAD Jan 23 23:32:37.638000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.638000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.638000 audit: BPF prog-id=204 op=LOAD Jan 23 23:32:37.638000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe068b148 a2=94 a3=2 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.638000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.639000 audit: BPF prog-id=204 op=UNLOAD Jan 23 23:32:37.639000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.639000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.646645 kubelet[2913]: I0123 23:32:37.646552 2913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62170b86-7ada-415b-964e-cbba9c446534" path="/var/lib/kubelet/pods/62170b86-7ada-415b-964e-cbba9c446534/volumes" Jan 23 23:32:37.750000 audit: BPF prog-id=205 op=LOAD Jan 23 23:32:37.750000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe068b108 a2=40 a3=ffffe068b138 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.750000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.750000 audit: BPF prog-id=205 op=UNLOAD Jan 23 23:32:37.750000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffe068b138 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.750000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.759000 audit: BPF prog-id=206 op=LOAD Jan 23 23:32:37.759000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe068b118 a2=94 a3=4 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.759000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=206 op=UNLOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=207 op=LOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe068af58 a2=94 a3=5 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=207 op=UNLOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=208 op=LOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe068b188 a2=94 a3=6 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=208 op=UNLOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=209 op=LOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe068a958 a2=94 a3=83 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=210 op=LOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffe068a718 a2=94 a3=2 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.760000 audit: BPF prog-id=210 op=UNLOAD Jan 23 23:32:37.760000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.760000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.761000 audit: BPF prog-id=209 op=UNLOAD Jan 23 23:32:37.761000 audit[4315]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1da28620 a3=1da1bb00 items=0 ppid=4112 pid=4315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.761000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 23:32:37.763917 containerd[1666]: time="2026-01-23T23:32:37.763881711Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:37.765752 containerd[1666]: time="2026-01-23T23:32:37.765708957Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 23:32:37.765833 containerd[1666]: time="2026-01-23T23:32:37.765794757Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:37.766065 kubelet[2913]: E0123 23:32:37.766013 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:32:37.766158 kubelet[2913]: E0123 23:32:37.766082 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:32:37.766986 kubelet[2913]: E0123 23:32:37.766751 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:89d4259d6b224a32924e4c2e731fe8b2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFil
esystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:37.767000 audit: BPF prog-id=201 op=UNLOAD Jan 23 23:32:37.767000 audit[4112]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=400018c080 a2=0 a3=0 items=0 ppid=4091 pid=4112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.767000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 23 23:32:37.768971 containerd[1666]: time="2026-01-23T23:32:37.768907206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 23:32:37.807000 audit[4344]: NETFILTER_CFG table=mangle:119 family=2 entries=16 op=nft_register_chain pid=4344 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:37.807000 audit[4344]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffe5babd00 a2=0 a3=ffff952f5fa8 items=0 ppid=4112 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.807000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:37.811000 audit[4347]: NETFILTER_CFG table=nat:120 family=2 entries=15 op=nft_register_chain pid=4347 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:37.811000 audit[4347]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=fffffb8d6940 a2=0 a3=ffffb78dffa8 items=0 ppid=4112 pid=4347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.811000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:37.814000 audit[4343]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4343 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:37.814000 audit[4343]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=fffff6e96df0 a2=0 a3=ffff856dffa8 items=0 ppid=4112 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.814000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:37.821000 audit[4345]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4345 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:37.821000 audit[4345]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffdd4ac7b0 a2=0 a3=ffffad535fa8 items=0 ppid=4112 pid=4345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:37.821000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:38.102851 containerd[1666]: time="2026-01-23T23:32:38.102614944Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:38.104051 containerd[1666]: time="2026-01-23T23:32:38.103991708Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 23:32:38.104097 containerd[1666]: time="2026-01-23T23:32:38.104069949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:38.104292 kubelet[2913]: E0123 23:32:38.104233 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:32:38.104292 kubelet[2913]: E0123 23:32:38.104286 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:32:38.104481 kubelet[2913]: E0123 23:32:38.104404 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:38.105638 kubelet[2913]: E0123 23:32:38.105583 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:32:38.752551 kubelet[2913]: E0123 23:32:38.752403 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:32:38.771000 audit[4358]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4358 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:38.771000 audit[4358]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff5c36900 a2=0 a3=1 items=0 ppid=3074 pid=4358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:38.771000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:38.778000 audit[4358]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4358 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:38.778000 audit[4358]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffff5c36900 a2=0 a3=1 items=0 ppid=3074 pid=4358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:38.778000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:38.897952 systemd-networkd[1577]: calif3b1f5ed54f: Gained IPv6LL Jan 23 23:32:39.665201 systemd-networkd[1577]: vxlan.calico: Gained IPv6LL Jan 23 23:32:40.637465 containerd[1666]: time="2026-01-23T23:32:40.637267674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-m6qjq,Uid:97acd778-d005-41f9-8db5-25f87a68c090,Namespace:calico-apiserver,Attempt:0,}" Jan 23 23:32:40.740260 systemd-networkd[1577]: cali599b1d40b43: Link UP Jan 23 23:32:40.741103 systemd-networkd[1577]: cali599b1d40b43: Gained carrier Jan 23 23:32:40.756276 containerd[1666]: 2026-01-23 23:32:40.674 [INFO][4362] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0 calico-apiserver-579f6fb948- calico-apiserver 97acd778-d005-41f9-8db5-25f87a68c090 832 0 2026-01-23 23:32:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:579f6fb948 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e calico-apiserver-579f6fb948-m6qjq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali599b1d40b43 [] [] }} ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-" Jan 23 23:32:40.756276 containerd[1666]: 2026-01-23 23:32:40.675 [INFO][4362] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756276 containerd[1666]: 2026-01-23 23:32:40.696 [INFO][4376] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" HandleID="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.696 [INFO][4376] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" HandleID="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" 
Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c7d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-1-266c03b17e", "pod":"calico-apiserver-579f6fb948-m6qjq", "timestamp":"2026-01-23 23:32:40.696242454 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.696 [INFO][4376] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.696 [INFO][4376] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.696 [INFO][4376] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.707 [INFO][4376] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.712 [INFO][4376] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.717 [INFO][4376] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.718 [INFO][4376] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756476 containerd[1666]: 2026-01-23 23:32:40.720 [INFO][4376] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 
host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.720 [INFO][4376] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.724 [INFO][4376] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76 Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.729 [INFO][4376] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.735 [INFO][4376] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.130/26] block=192.168.49.128/26 handle="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.736 [INFO][4376] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.130/26] handle="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.736 [INFO][4376] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 23:32:40.756683 containerd[1666]: 2026-01-23 23:32:40.736 [INFO][4376] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.130/26] IPv6=[] ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" HandleID="k8s-pod-network.eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756809 containerd[1666]: 2026-01-23 23:32:40.737 [INFO][4362] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0", GenerateName:"calico-apiserver-579f6fb948-", Namespace:"calico-apiserver", SelfLink:"", UID:"97acd778-d005-41f9-8db5-25f87a68c090", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"579f6fb948", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"calico-apiserver-579f6fb948-m6qjq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.49.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali599b1d40b43", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:40.756858 containerd[1666]: 2026-01-23 23:32:40.737 [INFO][4362] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.130/32] ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756858 containerd[1666]: 2026-01-23 23:32:40.737 [INFO][4362] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali599b1d40b43 ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756858 containerd[1666]: 2026-01-23 23:32:40.741 [INFO][4362] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.756914 containerd[1666]: 2026-01-23 23:32:40.741 [INFO][4362] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0", GenerateName:"calico-apiserver-579f6fb948-", Namespace:"calico-apiserver", SelfLink:"", UID:"97acd778-d005-41f9-8db5-25f87a68c090", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"579f6fb948", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76", Pod:"calico-apiserver-579f6fb948-m6qjq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.49.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali599b1d40b43", MAC:"9a:8d:99:15:fb:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:40.757039 containerd[1666]: 2026-01-23 23:32:40.752 [INFO][4362] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-m6qjq" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--m6qjq-eth0" Jan 23 23:32:40.763000 audit[4392]: NETFILTER_CFG table=filter:125 family=2 entries=50 
op=nft_register_chain pid=4392 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:40.765054 kernel: kauditd_printk_skb: 231 callbacks suppressed Jan 23 23:32:40.765105 kernel: audit: type=1325 audit(1769211160.763:651): table=filter:125 family=2 entries=50 op=nft_register_chain pid=4392 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:40.763000 audit[4392]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffd61e8310 a2=0 a3=ffff88112fa8 items=0 ppid=4112 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.770833 kernel: audit: type=1300 audit(1769211160.763:651): arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffd61e8310 a2=0 a3=ffff88112fa8 items=0 ppid=4112 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.763000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:40.773082 kernel: audit: type=1327 audit(1769211160.763:651): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:40.778268 containerd[1666]: time="2026-01-23T23:32:40.778222304Z" level=info msg="connecting to shim eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76" address="unix:///run/containerd/s/8b886b840d93f25251a67f2957680c308c522432c1b7339f680d126c8949fcd2" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:40.807270 systemd[1]: Started cri-containerd-eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76.scope 
- libcontainer container eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76. Jan 23 23:32:40.825000 audit: BPF prog-id=211 op=LOAD Jan 23 23:32:40.827000 audit: BPF prog-id=212 op=LOAD Jan 23 23:32:40.829304 kernel: audit: type=1334 audit(1769211160.825:652): prog-id=211 op=LOAD Jan 23 23:32:40.829357 kernel: audit: type=1334 audit(1769211160.827:653): prog-id=212 op=LOAD Jan 23 23:32:40.829379 kernel: audit: type=1300 audit(1769211160.827:653): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.827000 audit[4413]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.835583 kernel: audit: type=1327 audit(1769211160.827:653): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.827000 audit: BPF prog-id=212 op=UNLOAD Jan 23 23:32:40.837143 kernel: audit: type=1334 audit(1769211160.827:654): prog-id=212 op=UNLOAD Jan 23 23:32:40.837299 kernel: audit: type=1300 audit(1769211160.827:654): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 
a2=0 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.827000 audit[4413]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.843718 kernel: audit: type=1327 audit(1769211160.827:654): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.827000 audit: BPF prog-id=213 op=LOAD Jan 23 23:32:40.827000 audit[4413]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.828000 audit: BPF prog-id=214 op=LOAD Jan 23 23:32:40.828000 audit[4413]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.828000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.831000 audit: BPF prog-id=214 op=UNLOAD Jan 23 23:32:40.831000 audit[4413]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.831000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.831000 audit: BPF prog-id=213 op=UNLOAD Jan 23 23:32:40.831000 audit[4413]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.831000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.831000 audit: BPF prog-id=215 op=LOAD Jan 23 23:32:40.831000 audit[4413]: SYSCALL 
arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4402 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:40.831000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562376335663233613035646134356162396366363730653632323838 Jan 23 23:32:40.871361 containerd[1666]: time="2026-01-23T23:32:40.871298628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-m6qjq,Uid:97acd778-d005-41f9-8db5-25f87a68c090,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"eb7c5f23a05da45ab9cf670e622887c6d345c85c8141d1839fb7af9206ae1a76\"" Jan 23 23:32:40.873479 containerd[1666]: time="2026-01-23T23:32:40.873434714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:32:41.209488 containerd[1666]: time="2026-01-23T23:32:41.209404259Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:41.210874 containerd[1666]: time="2026-01-23T23:32:41.210825383Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:32:41.210940 containerd[1666]: time="2026-01-23T23:32:41.210885024Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:41.211093 kubelet[2913]: E0123 23:32:41.211053 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:41.211360 kubelet[2913]: E0123 23:32:41.211108 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:41.211360 kubelet[2913]: E0123 23:32:41.211256 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gkz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:41.212436 kubelet[2913]: E0123 23:32:41.212389 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:32:41.638159 containerd[1666]: time="2026-01-23T23:32:41.637852886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22wrh,Uid:31688273-3f62-4a06-a170-1e0f188f7bf7,Namespace:kube-system,Attempt:0,}" Jan 23 23:32:41.638159 containerd[1666]: 
time="2026-01-23T23:32:41.637919806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-2l9kd,Uid:aece08d9-f39f-4025-afe7-4d9ae33375fe,Namespace:calico-apiserver,Attempt:0,}" Jan 23 23:32:41.760635 kubelet[2913]: E0123 23:32:41.760582 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:32:41.776211 systemd-networkd[1577]: cali94e6f5a3de5: Link UP Jan 23 23:32:41.777535 systemd-networkd[1577]: cali94e6f5a3de5: Gained carrier Jan 23 23:32:41.788000 audit[4487]: NETFILTER_CFG table=filter:126 family=2 entries=20 op=nft_register_rule pid=4487 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:41.788000 audit[4487]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc8403ae0 a2=0 a3=1 items=0 ppid=3074 pid=4487 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.788000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:41.792000 audit[4487]: NETFILTER_CFG table=nat:127 family=2 entries=14 op=nft_register_rule pid=4487 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:41.792000 audit[4487]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc8403ae0 a2=0 a3=1 items=0 ppid=3074 pid=4487 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.792000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:41.794090 containerd[1666]: 2026-01-23 23:32:41.691 [INFO][4439] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0 coredns-674b8bbfcf- kube-system 31688273-3f62-4a06-a170-1e0f188f7bf7 826 0 2026-01-23 23:31:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e coredns-674b8bbfcf-22wrh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali94e6f5a3de5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-" Jan 23 23:32:41.794090 containerd[1666]: 2026-01-23 23:32:41.691 [INFO][4439] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794090 containerd[1666]: 2026-01-23 23:32:41.726 [INFO][4468] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" HandleID="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" 
Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.726 [INFO][4468] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" HandleID="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137e70), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-1-266c03b17e", "pod":"coredns-674b8bbfcf-22wrh", "timestamp":"2026-01-23 23:32:41.726571716 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.726 [INFO][4468] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.726 [INFO][4468] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.726 [INFO][4468] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.737 [INFO][4468] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.742 [INFO][4468] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.747 [INFO][4468] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.749 [INFO][4468] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794235 containerd[1666]: 2026-01-23 23:32:41.752 [INFO][4468] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.752 [INFO][4468] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.753 [INFO][4468] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0 Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.757 [INFO][4468] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4468] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.49.131/26] block=192.168.49.128/26 handle="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4468] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.131/26] handle="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4468] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 23:32:41.794424 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4468] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.131/26] IPv6=[] ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" HandleID="k8s-pod-network.fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.771 [INFO][4439] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"31688273-3f62-4a06-a170-1e0f188f7bf7", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 31, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"coredns-674b8bbfcf-22wrh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.49.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali94e6f5a3de5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.771 [INFO][4439] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.131/32] ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.771 [INFO][4439] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali94e6f5a3de5 ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.779 [INFO][4439] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.781 [INFO][4439] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"31688273-3f62-4a06-a170-1e0f188f7bf7", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 31, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0", Pod:"coredns-674b8bbfcf-22wrh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.49.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali94e6f5a3de5", 
MAC:"1a:e2:a5:dd:48:5b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:41.794551 containerd[1666]: 2026-01-23 23:32:41.792 [INFO][4439] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-22wrh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--22wrh-eth0" Jan 23 23:32:41.814000 audit[4500]: NETFILTER_CFG table=filter:128 family=2 entries=46 op=nft_register_chain pid=4500 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:41.814000 audit[4500]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23740 a0=3 a1=ffffeec71bc0 a2=0 a3=ffff9d349fa8 items=0 ppid=4112 pid=4500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.814000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:41.816259 containerd[1666]: time="2026-01-23T23:32:41.816145510Z" level=info msg="connecting to shim fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0" address="unix:///run/containerd/s/95d84411feae6f090ab489ac79bb5fad13a5ca3c4e8229d542257213ebea575c" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:41.841185 
systemd[1]: Started cri-containerd-fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0.scope - libcontainer container fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0. Jan 23 23:32:41.852000 audit: BPF prog-id=216 op=LOAD Jan 23 23:32:41.852000 audit: BPF prog-id=217 op=LOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=217 op=UNLOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=218 op=LOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=219 op=LOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=219 op=UNLOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=218 op=UNLOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.852000 audit: BPF prog-id=220 op=LOAD Jan 23 23:32:41.852000 audit[4517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4505 pid=4517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662323862373966373964636261346362376439363435353266313161 Jan 23 23:32:41.875898 systemd-networkd[1577]: califf704bd1609: Link UP Jan 23 23:32:41.876471 systemd-networkd[1577]: califf704bd1609: Gained carrier Jan 23 23:32:41.892167 containerd[1666]: time="2026-01-23T23:32:41.891932021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22wrh,Uid:31688273-3f62-4a06-a170-1e0f188f7bf7,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0\"" Jan 23 23:32:41.901558 containerd[1666]: time="2026-01-23T23:32:41.901522530Z" level=info msg="CreateContainer within sandbox \"fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.702 [INFO][4446] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0 calico-apiserver-579f6fb948- calico-apiserver aece08d9-f39f-4025-afe7-4d9ae33375fe 831 0 2026-01-23 23:32:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:579f6fb948 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e calico-apiserver-579f6fb948-2l9kd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califf704bd1609 [] [] }} ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.702 [INFO][4446] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.731 [INFO][4474] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" HandleID="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.732 [INFO][4474] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" HandleID="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" 
Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-1-266c03b17e", "pod":"calico-apiserver-579f6fb948-2l9kd", "timestamp":"2026-01-23 23:32:41.731852612 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.732 [INFO][4474] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4474] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.767 [INFO][4474] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.837 [INFO][4474] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.844 [INFO][4474] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.849 [INFO][4474] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.851 [INFO][4474] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.854 [INFO][4474] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 
host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.854 [INFO][4474] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.856 [INFO][4474] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6 Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.861 [INFO][4474] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.870 [INFO][4474] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.132/26] block=192.168.49.128/26 handle="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.870 [INFO][4474] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.132/26] handle="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.870 [INFO][4474] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 23:32:41.902708 containerd[1666]: 2026-01-23 23:32:41.870 [INFO][4474] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.132/26] IPv6=[] ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" HandleID="k8s-pod-network.14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.872 [INFO][4446] cni-plugin/k8s.go 418: Populated endpoint ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0", GenerateName:"calico-apiserver-579f6fb948-", Namespace:"calico-apiserver", SelfLink:"", UID:"aece08d9-f39f-4025-afe7-4d9ae33375fe", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"579f6fb948", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"calico-apiserver-579f6fb948-2l9kd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.49.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califf704bd1609", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.872 [INFO][4446] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.132/32] ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.872 [INFO][4446] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califf704bd1609 ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.878 [INFO][4446] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.880 [INFO][4446] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0", GenerateName:"calico-apiserver-579f6fb948-", Namespace:"calico-apiserver", SelfLink:"", UID:"aece08d9-f39f-4025-afe7-4d9ae33375fe", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"579f6fb948", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6", Pod:"calico-apiserver-579f6fb948-2l9kd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.49.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califf704bd1609", MAC:"1a:32:ee:26:93:50", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:41.903366 containerd[1666]: 2026-01-23 23:32:41.899 [INFO][4446] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" Namespace="calico-apiserver" Pod="calico-apiserver-579f6fb948-2l9kd" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--apiserver--579f6fb948--2l9kd-eth0" Jan 23 23:32:41.911000 audit[4553]: NETFILTER_CFG table=filter:129 family=2 entries=51 
op=nft_register_chain pid=4553 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:41.911000 audit[4553]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27116 a0=3 a1=ffffd5b0aa90 a2=0 a3=ffff9239afa8 items=0 ppid=4112 pid=4553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.911000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:41.920001 containerd[1666]: time="2026-01-23T23:32:41.919828306Z" level=info msg="Container 52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:41.927131 containerd[1666]: time="2026-01-23T23:32:41.927080528Z" level=info msg="CreateContainer within sandbox \"fb28b79f79dcba4cb7d964552f11a60b663a5915791efd7f55b900ecaf5156f0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8\"" Jan 23 23:32:41.927840 containerd[1666]: time="2026-01-23T23:32:41.927786370Z" level=info msg="StartContainer for \"52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8\"" Jan 23 23:32:41.928750 containerd[1666]: time="2026-01-23T23:32:41.928638093Z" level=info msg="connecting to shim 52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8" address="unix:///run/containerd/s/95d84411feae6f090ab489ac79bb5fad13a5ca3c4e8229d542257213ebea575c" protocol=ttrpc version=3 Jan 23 23:32:41.936689 containerd[1666]: time="2026-01-23T23:32:41.936587837Z" level=info msg="connecting to shim 14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6" address="unix:///run/containerd/s/ca9a3f9cd375c2a4d4cfc6804b7be720ccd351aa3583c1fa7acdf338cac23e80" namespace=k8s.io protocol=ttrpc 
version=3 Jan 23 23:32:41.951215 systemd[1]: Started cri-containerd-52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8.scope - libcontainer container 52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8. Jan 23 23:32:41.963231 systemd[1]: Started cri-containerd-14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6.scope - libcontainer container 14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6. Jan 23 23:32:41.969000 audit: BPF prog-id=221 op=LOAD Jan 23 23:32:41.969000 audit: BPF prog-id=222 op=LOAD Jan 23 23:32:41.969000 audit[4554]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.969000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.970000 audit: BPF prog-id=222 op=UNLOAD Jan 23 23:32:41.970000 audit[4554]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.970000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.971000 audit: BPF prog-id=223 op=LOAD Jan 23 23:32:41.971000 audit[4554]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.971000 audit: BPF prog-id=224 op=LOAD Jan 23 23:32:41.971000 audit[4554]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.971000 audit: BPF prog-id=224 op=UNLOAD Jan 23 23:32:41.971000 audit[4554]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.971000 audit: BPF prog-id=223 op=UNLOAD Jan 23 23:32:41.971000 audit[4554]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.971000 audit: BPF prog-id=225 op=LOAD Jan 23 23:32:41.971000 audit[4554]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4505 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532623733643064643235623333393462623832656635623265353862 Jan 23 23:32:41.978000 audit: BPF prog-id=226 op=LOAD Jan 23 23:32:41.978000 audit: BPF prog-id=227 op=LOAD Jan 23 23:32:41.978000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.978000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.978000 audit: BPF prog-id=227 op=UNLOAD Jan 23 23:32:41.978000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.979000 audit: BPF prog-id=228 op=LOAD Jan 23 23:32:41.979000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.979000 audit: BPF prog-id=229 op=LOAD Jan 23 23:32:41.979000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:32:41.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.979000 audit: BPF prog-id=229 op=UNLOAD Jan 23 23:32:41.979000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.980000 audit: BPF prog-id=228 op=UNLOAD Jan 23 23:32:41.980000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.980000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:41.980000 audit: BPF prog-id=230 op=LOAD Jan 23 23:32:41.980000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4568 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:41.980000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134626433356239623562653333363263613837393064663735306535 Jan 23 23:32:42.001587 containerd[1666]: time="2026-01-23T23:32:42.001549995Z" level=info msg="StartContainer for \"52b73d0dd25b3394bb82ef5b2e58bc2d1dcb185a6a5eb3eadfba7aedb4dfb8e8\" returns successfully" Jan 23 23:32:42.009703 containerd[1666]: time="2026-01-23T23:32:42.009661180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-579f6fb948-2l9kd,Uid:aece08d9-f39f-4025-afe7-4d9ae33375fe,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"14bd35b9b5be3362ca8790df750e563cbec5ac76f401b61cb7753f1f75b7e5e6\"" Jan 23 23:32:42.011599 containerd[1666]: time="2026-01-23T23:32:42.011561266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:32:42.337063 containerd[1666]: time="2026-01-23T23:32:42.336791977Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:42.338747 containerd[1666]: time="2026-01-23T23:32:42.338656583Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:32:42.338836 containerd[1666]: time="2026-01-23T23:32:42.338737863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:42.339061 kubelet[2913]: E0123 23:32:42.339002 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:42.339061 kubelet[2913]: E0123 23:32:42.339046 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:42.339881 kubelet[2913]: E0123 23:32:42.339835 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:42.341784 kubelet[2913]: E0123 23:32:42.341732 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:32:42.435218 kubelet[2913]: I0123 23:32:42.435054 2913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 23:32:42.637347 containerd[1666]: time="2026-01-23T23:32:42.637310134Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-dqzmb,Uid:9a039bd1-1840-4663-bae3-65c063cc9185,Namespace:kube-system,Attempt:0,}" Jan 23 23:32:42.637470 containerd[1666]: time="2026-01-23T23:32:42.637382214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-52tqh,Uid:8525930c-a129-42c2-8aaf-49aa89f532c7,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:42.637806 containerd[1666]: time="2026-01-23T23:32:42.637713975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfnpf,Uid:a69a1122-8e77-47a0-ac55-a81fea68c3e7,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:42.674500 systemd-networkd[1577]: cali599b1d40b43: Gained IPv6LL Jan 23 23:32:42.767994 kubelet[2913]: E0123 23:32:42.767229 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:32:42.770173 kubelet[2913]: E0123 23:32:42.770127 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:32:42.782690 systemd-networkd[1577]: calic5f97266cbb: Link UP Jan 23 23:32:42.785489 systemd-networkd[1577]: calic5f97266cbb: Gained 
carrier Jan 23 23:32:42.793000 audit[4757]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4757 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:42.793000 audit[4757]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff40ce900 a2=0 a3=1 items=0 ppid=3074 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.793000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:42.801000 audit[4757]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4757 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:42.801000 audit[4757]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffff40ce900 a2=0 a3=1 items=0 ppid=3074 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.801000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.704 [INFO][4693] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0 goldmane-666569f655- calico-system 8525930c-a129-42c2-8aaf-49aa89f532c7 830 0 2026-01-23 23:32:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e goldmane-666569f655-52tqh 
eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic5f97266cbb [] [] }} ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.704 [INFO][4693] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.733 [INFO][4731] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" HandleID="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Workload="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.733 [INFO][4731] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" HandleID="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Workload="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-1-266c03b17e", "pod":"goldmane-666569f655-52tqh", "timestamp":"2026-01-23 23:32:42.733636948 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.733 
[INFO][4731] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.733 [INFO][4731] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.734 [INFO][4731] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.743 [INFO][4731] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.748 [INFO][4731] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.752 [INFO][4731] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.754 [INFO][4731] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.756 [INFO][4731] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.756 [INFO][4731] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.757 [INFO][4731] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9 Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.765 [INFO][4731] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 
handle="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.773 [INFO][4731] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.133/26] block=192.168.49.128/26 handle="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.773 [INFO][4731] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.133/26] handle="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.774 [INFO][4731] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 23:32:42.804006 containerd[1666]: 2026-01-23 23:32:42.774 [INFO][4731] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.133/26] IPv6=[] ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" HandleID="k8s-pod-network.0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Workload="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.776 [INFO][4693] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"8525930c-a129-42c2-8aaf-49aa89f532c7", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 
23, 32, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"goldmane-666569f655-52tqh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.49.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic5f97266cbb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.778 [INFO][4693] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.133/32] ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.778 [INFO][4693] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic5f97266cbb ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.786 [INFO][4693] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" 
Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.787 [INFO][4693] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"8525930c-a129-42c2-8aaf-49aa89f532c7", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9", Pod:"goldmane-666569f655-52tqh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.49.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic5f97266cbb", MAC:"6a:7b:5f:d5:0c:10", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:42.804815 containerd[1666]: 2026-01-23 23:32:42.801 [INFO][4693] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" Namespace="calico-system" Pod="goldmane-666569f655-52tqh" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-goldmane--666569f655--52tqh-eth0" Jan 23 23:32:42.813040 kubelet[2913]: I0123 23:32:42.812725 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-22wrh" podStartSLOduration=43.812707189 podStartE2EDuration="43.812707189s" podCreationTimestamp="2026-01-23 23:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:32:42.811569705 +0000 UTC m=+49.572212260" watchObservedRunningTime="2026-01-23 23:32:42.812707189 +0000 UTC m=+49.573349744" Jan 23 23:32:42.830000 audit[4773]: NETFILTER_CFG table=filter:132 family=2 entries=17 op=nft_register_rule pid=4773 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:42.830000 audit[4773]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc92a6410 a2=0 a3=1 items=0 ppid=3074 pid=4773 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.830000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:42.834426 containerd[1666]: time="2026-01-23T23:32:42.834354055Z" level=info msg="connecting to shim 0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9" address="unix:///run/containerd/s/7bcdb3e940e4874157fe07da0831254ba17377e182070f9b314823276a8f61e8" namespace=k8s.io protocol=ttrpc 
version=3 Jan 23 23:32:42.836000 audit[4773]: NETFILTER_CFG table=nat:133 family=2 entries=35 op=nft_register_chain pid=4773 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:42.836000 audit[4773]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffc92a6410 a2=0 a3=1 items=0 ppid=3074 pid=4773 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.836000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:42.845000 audit[4785]: NETFILTER_CFG table=filter:134 family=2 entries=52 op=nft_register_chain pid=4785 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:42.845000 audit[4785]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27540 a0=3 a1=ffffc4e314f0 a2=0 a3=ffff91238fa8 items=0 ppid=4112 pid=4785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.845000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:42.862503 systemd[1]: Started cri-containerd-0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9.scope - libcontainer container 0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9. 
Jan 23 23:32:42.878000 audit: BPF prog-id=231 op=LOAD Jan 23 23:32:42.880000 audit: BPF prog-id=232 op=LOAD Jan 23 23:32:42.880000 audit[4791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186180 a2=98 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.880000 audit: BPF prog-id=232 op=UNLOAD Jan 23 23:32:42.880000 audit[4791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.880000 audit: BPF prog-id=233 op=LOAD Jan 23 23:32:42.880000 audit[4791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001863e8 a2=98 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.880000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.881000 audit: BPF prog-id=234 op=LOAD Jan 23 23:32:42.881000 audit[4791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000186168 a2=98 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.881000 audit: BPF prog-id=234 op=UNLOAD Jan 23 23:32:42.881000 audit[4791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.882000 audit: BPF prog-id=233 op=UNLOAD Jan 23 23:32:42.882000 audit[4791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:42.882000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.882000 audit: BPF prog-id=235 op=LOAD Jan 23 23:32:42.882000 audit[4791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186648 a2=98 a3=0 items=0 ppid=4779 pid=4791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.882000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062396261343536386537363363663861313261393038373264323232 Jan 23 23:32:42.890361 systemd-networkd[1577]: cali7a0d9e657b1: Link UP Jan 23 23:32:42.891762 systemd-networkd[1577]: cali7a0d9e657b1: Gained carrier Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.705 [INFO][4686] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0 coredns-674b8bbfcf- kube-system 9a039bd1-1840-4663-bae3-65c063cc9185 827 0 2026-01-23 23:31:59 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e coredns-674b8bbfcf-dqzmb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7a0d9e657b1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.705 [INFO][4686] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.733 [INFO][4737] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" HandleID="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.734 [INFO][4737] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" HandleID="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd0e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-1-266c03b17e", "pod":"coredns-674b8bbfcf-dqzmb", "timestamp":"2026-01-23 23:32:42.733985829 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.734 [INFO][4737] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.774 [INFO][4737] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.774 [INFO][4737] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.847 [INFO][4737] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.853 [INFO][4737] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.861 [INFO][4737] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.863 [INFO][4737] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.866 [INFO][4737] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.866 [INFO][4737] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.868 [INFO][4737] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1 Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.873 [INFO][4737] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.880 [INFO][4737] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.49.134/26] block=192.168.49.128/26 handle="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.880 [INFO][4737] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.134/26] handle="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.881 [INFO][4737] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 23:32:42.913115 containerd[1666]: 2026-01-23 23:32:42.881 [INFO][4737] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.134/26] IPv6=[] ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" HandleID="k8s-pod-network.d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Workload="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.884 [INFO][4686] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a039bd1-1840-4663-bae3-65c063cc9185", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 31, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"coredns-674b8bbfcf-dqzmb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.49.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7a0d9e657b1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.884 [INFO][4686] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.134/32] ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.884 [INFO][4686] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a0d9e657b1 ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.891 [INFO][4686] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.892 [INFO][4686] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a039bd1-1840-4663-bae3-65c063cc9185", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 31, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1", Pod:"coredns-674b8bbfcf-dqzmb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.49.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7a0d9e657b1", 
MAC:"3e:89:66:45:b0:a0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:42.913607 containerd[1666]: 2026-01-23 23:32:42.910 [INFO][4686] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dqzmb" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-coredns--674b8bbfcf--dqzmb-eth0" Jan 23 23:32:42.928156 containerd[1666]: time="2026-01-23T23:32:42.928049061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-52tqh,Uid:8525930c-a129-42c2-8aaf-49aa89f532c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b9ba4568e763cf8a12a90872d22250d337f3e2e7a171b468bdd95fafe1f48b9\"" Jan 23 23:32:42.930057 containerd[1666]: time="2026-01-23T23:32:42.930016667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 23:32:42.940000 audit[4829]: NETFILTER_CFG table=filter:135 family=2 entries=44 op=nft_register_chain pid=4829 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:42.940000 audit[4829]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21516 a0=3 a1=ffffe063bdc0 a2=0 a3=ffffa82bbfa8 items=0 ppid=4112 pid=4829 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:42.940000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:42.953552 containerd[1666]: time="2026-01-23T23:32:42.953496458Z" level=info msg="connecting to shim d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1" address="unix:///run/containerd/s/1c730a6ffb136ba20129726b2f08399ba4852f15fb47c44cd7b4965b494ff1aa" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:42.989173 systemd[1]: Started cri-containerd-d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1.scope - libcontainer container d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1. Jan 23 23:32:43.003459 systemd-networkd[1577]: cali7f200a9e274: Link UP Jan 23 23:32:43.003748 systemd-networkd[1577]: cali7f200a9e274: Gained carrier Jan 23 23:32:43.005000 audit: BPF prog-id=236 op=LOAD Jan 23 23:32:43.005000 audit: BPF prog-id=237 op=LOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=237 op=UNLOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=238 op=LOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=239 op=LOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=239 op=UNLOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=238 op=UNLOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.005000 audit: BPF prog-id=240 op=LOAD Jan 23 23:32:43.005000 audit[4850]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4839 pid=4850 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.005000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437386562396561356533653139666166376437313136343130313730 Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.714 [INFO][4704] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0 csi-node-driver- calico-system a69a1122-8e77-47a0-ac55-a81fea68c3e7 703 0 
2026-01-23 23:32:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e csi-node-driver-rfnpf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7f200a9e274 [] [] }} ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.714 [INFO][4704] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.740 [INFO][4744] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" HandleID="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Workload="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.740 [INFO][4744] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" HandleID="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Workload="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3290), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-1-266c03b17e", 
"pod":"csi-node-driver-rfnpf", "timestamp":"2026-01-23 23:32:42.740777369 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.740 [INFO][4744] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.881 [INFO][4744] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.881 [INFO][4744] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.945 [INFO][4744] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.957 [INFO][4744] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.968 [INFO][4744] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.972 [INFO][4744] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.976 [INFO][4744] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.976 [INFO][4744] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" 
host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.978 [INFO][4744] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201 Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.986 [INFO][4744] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.995 [INFO][4744] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.135/26] block=192.168.49.128/26 handle="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.995 [INFO][4744] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.135/26] handle="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.995 [INFO][4744] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 23:32:43.024317 containerd[1666]: 2026-01-23 23:32:42.995 [INFO][4744] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.135/26] IPv6=[] ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" HandleID="k8s-pod-network.53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Workload="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:42.998 [INFO][4704] cni-plugin/k8s.go 418: Populated endpoint ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a69a1122-8e77-47a0-ac55-a81fea68c3e7", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"csi-node-driver-rfnpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.49.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7f200a9e274", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:42.998 [INFO][4704] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.135/32] ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:42.998 [INFO][4704] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f200a9e274 ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:43.004 [INFO][4704] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:43.004 [INFO][4704] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a69a1122-8e77-47a0-ac55-a81fea68c3e7", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201", Pod:"csi-node-driver-rfnpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.49.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7f200a9e274", MAC:"b2:0a:80:94:b7:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:43.025631 containerd[1666]: 2026-01-23 23:32:43.020 [INFO][4704] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" Namespace="calico-system" Pod="csi-node-driver-rfnpf" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-csi--node--driver--rfnpf-eth0" Jan 23 23:32:43.043416 containerd[1666]: time="2026-01-23T23:32:43.043219132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dqzmb,Uid:9a039bd1-1840-4663-bae3-65c063cc9185,Namespace:kube-system,Attempt:0,} returns sandbox 
id \"d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1\"" Jan 23 23:32:43.045000 audit[4886]: NETFILTER_CFG table=filter:136 family=2 entries=58 op=nft_register_chain pid=4886 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:43.045000 audit[4886]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27164 a0=3 a1=ffffc6944940 a2=0 a3=ffffb58befa8 items=0 ppid=4112 pid=4886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.045000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:43.054523 containerd[1666]: time="2026-01-23T23:32:43.054486726Z" level=info msg="CreateContainer within sandbox \"d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 23:32:43.056793 containerd[1666]: time="2026-01-23T23:32:43.056608333Z" level=info msg="connecting to shim 53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201" address="unix:///run/containerd/s/d0945bfd4a38af8610c4183ed362bb2288c44bb332ad78ed545eec0dd707f2aa" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:43.057148 systemd-networkd[1577]: califf704bd1609: Gained IPv6LL Jan 23 23:32:43.082701 containerd[1666]: time="2026-01-23T23:32:43.082559052Z" level=info msg="Container 921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:32:43.088276 systemd[1]: Started cri-containerd-53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201.scope - libcontainer container 53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201. 
Jan 23 23:32:43.089064 containerd[1666]: time="2026-01-23T23:32:43.089025712Z" level=info msg="CreateContainer within sandbox \"d78eb9ea5e3e19faf7d7116410170b2a4acfbc040aaeab34aa21e134bc2fd1c1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b\"" Jan 23 23:32:43.090662 containerd[1666]: time="2026-01-23T23:32:43.089883154Z" level=info msg="StartContainer for \"921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b\"" Jan 23 23:32:43.090828 containerd[1666]: time="2026-01-23T23:32:43.090803837Z" level=info msg="connecting to shim 921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b" address="unix:///run/containerd/s/1c730a6ffb136ba20129726b2f08399ba4852f15fb47c44cd7b4965b494ff1aa" protocol=ttrpc version=3 Jan 23 23:32:43.100000 audit: BPF prog-id=241 op=LOAD Jan 23 23:32:43.101000 audit: BPF prog-id=242 op=LOAD Jan 23 23:32:43.101000 audit[4907]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.101000 audit: BPF prog-id=242 op=UNLOAD Jan 23 23:32:43.101000 audit[4907]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.101000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.101000 audit: BPF prog-id=243 op=LOAD Jan 23 23:32:43.101000 audit[4907]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.101000 audit: BPF prog-id=244 op=LOAD Jan 23 23:32:43.101000 audit[4907]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.101000 audit: BPF prog-id=244 op=UNLOAD Jan 23 23:32:43.101000 audit[4907]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 23:32:43.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.102000 audit: BPF prog-id=243 op=UNLOAD Jan 23 23:32:43.102000 audit[4907]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.102000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.102000 audit: BPF prog-id=245 op=LOAD Jan 23 23:32:43.102000 audit[4907]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4896 pid=4907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.102000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533616332393566666337643632633465363230633761663863653333 Jan 23 23:32:43.113203 systemd[1]: Started cri-containerd-921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b.scope - libcontainer container 921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b. 
Jan 23 23:32:43.120836 containerd[1666]: time="2026-01-23T23:32:43.120793488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfnpf,Uid:a69a1122-8e77-47a0-ac55-a81fea68c3e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"53ac295ffc7d62c4e620c7af8ce33f47fe29fba1f80e6ea46e5ba82f6d3f4201\"" Jan 23 23:32:43.122184 systemd-networkd[1577]: cali94e6f5a3de5: Gained IPv6LL Jan 23 23:32:43.124000 audit: BPF prog-id=246 op=LOAD Jan 23 23:32:43.125000 audit: BPF prog-id=247 op=LOAD Jan 23 23:32:43.125000 audit[4922]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.125000 audit: BPF prog-id=247 op=UNLOAD Jan 23 23:32:43.125000 audit[4922]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.125000 audit: BPF prog-id=248 op=LOAD Jan 23 23:32:43.125000 audit[4922]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 
ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.126000 audit: BPF prog-id=249 op=LOAD Jan 23 23:32:43.126000 audit[4922]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.126000 audit: BPF prog-id=249 op=UNLOAD Jan 23 23:32:43.126000 audit[4922]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.126000 audit: BPF prog-id=248 op=UNLOAD Jan 23 23:32:43.126000 audit[4922]: SYSCALL arch=c00000b7 syscall=57 success=yes 
exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.126000 audit: BPF prog-id=250 op=LOAD Jan 23 23:32:43.126000 audit[4922]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4839 pid=4922 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932313538386663356666343763366534396637646438623235353563 Jan 23 23:32:43.144496 containerd[1666]: time="2026-01-23T23:32:43.144388320Z" level=info msg="StartContainer for \"921588fc5ff47c6e49f7dd8b2555ce9e5fdcfcf22254bf4ae01df32bcb58d56b\" returns successfully" Jan 23 23:32:43.278407 containerd[1666]: time="2026-01-23T23:32:43.278214129Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:43.281938 containerd[1666]: time="2026-01-23T23:32:43.281901020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 23:32:43.282127 containerd[1666]: 
time="2026-01-23T23:32:43.281981860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:43.282289 kubelet[2913]: E0123 23:32:43.282247 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:32:43.282362 kubelet[2913]: E0123 23:32:43.282299 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:32:43.282652 containerd[1666]: time="2026-01-23T23:32:43.282574422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 23:32:43.282885 kubelet[2913]: E0123 23:32:43.282765 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:43.283972 kubelet[2913]: E0123 23:32:43.283932 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:32:43.605147 containerd[1666]: time="2026-01-23T23:32:43.605100645Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:43.608218 containerd[1666]: time="2026-01-23T23:32:43.608164255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 23:32:43.608289 containerd[1666]: time="2026-01-23T23:32:43.608228255Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:43.608431 kubelet[2913]: E0123 23:32:43.608396 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:32:43.608844 kubelet[2913]: E0123 23:32:43.608443 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:32:43.608844 kubelet[2913]: E0123 23:32:43.608568 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 23 23:32:43.610834 containerd[1666]: time="2026-01-23T23:32:43.610797783Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 23:32:43.639185 containerd[1666]: time="2026-01-23T23:32:43.639144389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68f445db44-27ndm,Uid:1d648e36-1a1c-46cb-8b04-91769885543b,Namespace:calico-system,Attempt:0,}" Jan 23 23:32:43.765845 systemd-networkd[1577]: calia60713d1967: Link UP Jan 23 23:32:43.766447 systemd-networkd[1577]: calia60713d1967: Gained carrier Jan 23 23:32:43.780429 kubelet[2913]: E0123 23:32:43.780369 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:32:43.782389 kubelet[2913]: E0123 23:32:43.782348 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.686 [INFO][4967] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0 calico-kube-controllers-68f445db44- calico-system 1d648e36-1a1c-46cb-8b04-91769885543b 828 0 2026-01-23 23:32:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68f445db44 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593-0-0-1-266c03b17e calico-kube-controllers-68f445db44-27ndm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calia60713d1967 [] [] }} ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.686 [INFO][4967] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.714 [INFO][4981] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" HandleID="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.714 [INFO][4981] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" 
HandleID="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a1da0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-1-266c03b17e", "pod":"calico-kube-controllers-68f445db44-27ndm", "timestamp":"2026-01-23 23:32:43.714331899 +0000 UTC"}, Hostname:"ci-4593-0-0-1-266c03b17e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.714 [INFO][4981] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.714 [INFO][4981] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.714 [INFO][4981] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-1-266c03b17e' Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.724 [INFO][4981] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.729 [INFO][4981] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.734 [INFO][4981] ipam/ipam.go 511: Trying affinity for 192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.736 [INFO][4981] ipam/ipam.go 158: Attempting to load block cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.738 [INFO][4981] ipam/ipam.go 
235: Affinity is confirmed and block has been loaded cidr=192.168.49.128/26 host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.738 [INFO][4981] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.49.128/26 handle="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.740 [INFO][4981] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521 Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.745 [INFO][4981] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.49.128/26 handle="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.757 [INFO][4981] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.49.136/26] block=192.168.49.128/26 handle="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.757 [INFO][4981] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.49.136/26] handle="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" host="ci-4593-0-0-1-266c03b17e" Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.757 [INFO][4981] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 23:32:43.790220 containerd[1666]: 2026-01-23 23:32:43.757 [INFO][4981] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.49.136/26] IPv6=[] ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" HandleID="k8s-pod-network.968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Workload="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.760 [INFO][4967] cni-plugin/k8s.go 418: Populated endpoint ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0", GenerateName:"calico-kube-controllers-68f445db44-", Namespace:"calico-system", SelfLink:"", UID:"1d648e36-1a1c-46cb-8b04-91769885543b", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68f445db44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"", Pod:"calico-kube-controllers-68f445db44-27ndm", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.49.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia60713d1967", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.760 [INFO][4967] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.49.136/32] ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.760 [INFO][4967] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia60713d1967 ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.767 [INFO][4967] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.767 [INFO][4967] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" Pod="calico-kube-controllers-68f445db44-27ndm" 
WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0", GenerateName:"calico-kube-controllers-68f445db44-", Namespace:"calico-system", SelfLink:"", UID:"1d648e36-1a1c-46cb-8b04-91769885543b", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 23, 32, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68f445db44", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-1-266c03b17e", ContainerID:"968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521", Pod:"calico-kube-controllers-68f445db44-27ndm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.49.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia60713d1967", MAC:"32:0f:f3:ec:b8:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 23:32:43.791278 containerd[1666]: 2026-01-23 23:32:43.782 [INFO][4967] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" Namespace="calico-system" 
Pod="calico-kube-controllers-68f445db44-27ndm" WorkloadEndpoint="ci--4593--0--0--1--266c03b17e-k8s-calico--kube--controllers--68f445db44--27ndm-eth0" Jan 23 23:32:43.814284 kubelet[2913]: I0123 23:32:43.814176 2913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dqzmb" podStartSLOduration=44.814159323 podStartE2EDuration="44.814159323s" podCreationTimestamp="2026-01-23 23:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 23:32:43.799303238 +0000 UTC m=+50.559945753" watchObservedRunningTime="2026-01-23 23:32:43.814159323 +0000 UTC m=+50.574801878" Jan 23 23:32:43.820000 audit[4997]: NETFILTER_CFG table=filter:137 family=2 entries=52 op=nft_register_chain pid=4997 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 23:32:43.820000 audit[4997]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24296 a0=3 a1=fffff501b670 a2=0 a3=ffffa707bfa8 items=0 ppid=4112 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.820000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 23:32:43.843000 audit[5002]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=5002 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:43.843000 audit[5002]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff5804e30 a2=0 a3=1 items=0 ppid=3074 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.843000 
audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:43.845969 containerd[1666]: time="2026-01-23T23:32:43.845618499Z" level=info msg="connecting to shim 968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521" address="unix:///run/containerd/s/b815020e41afe0f963f9cc6715e3641ea7c6aa1578d37ee6cd7625536c470795" namespace=k8s.io protocol=ttrpc version=3 Jan 23 23:32:43.849000 audit[5002]: NETFILTER_CFG table=nat:139 family=2 entries=44 op=nft_register_rule pid=5002 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:43.849000 audit[5002]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffff5804e30 a2=0 a3=1 items=0 ppid=3074 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.849000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:43.874448 systemd[1]: Started cri-containerd-968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521.scope - libcontainer container 968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521. 
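The audit PROCTITLE fields above encode the audited process's command line as hex, with NUL bytes separating the arguments. A minimal sketch of decoding one (using the `iptables-restore` proctitle value from the records above):

```python
# Decode an audit PROCTITLE hex string into its NUL-separated argv.
hex_proctitle = (
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
)
argv = [a.decode() for a in bytes.fromhex(hex_proctitle).split(b"\x00")]
print(argv)
# → ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']
```

The same decoding applies to the `runc` proctitle values in the BPF audit records that follow.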
Jan 23 23:32:43.884000 audit: BPF prog-id=251 op=LOAD Jan 23 23:32:43.884000 audit: BPF prog-id=252 op=LOAD Jan 23 23:32:43.884000 audit[5019]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.884000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.885000 audit: BPF prog-id=252 op=UNLOAD Jan 23 23:32:43.885000 audit[5019]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.885000 audit: BPF prog-id=253 op=LOAD Jan 23 23:32:43.885000 audit[5019]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.885000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.885000 audit: BPF prog-id=254 op=LOAD Jan 23 23:32:43.885000 audit[5019]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.885000 audit: BPF prog-id=254 op=UNLOAD Jan 23 23:32:43.885000 audit[5019]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.885000 audit: BPF prog-id=253 op=UNLOAD Jan 23 23:32:43.885000 audit[5019]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:32:43.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.886000 audit: BPF prog-id=255 op=LOAD Jan 23 23:32:43.886000 audit[5019]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5008 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:43.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936386663373938343966623265303639373231626265303133333163 Jan 23 23:32:43.909533 containerd[1666]: time="2026-01-23T23:32:43.909491094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68f445db44-27ndm,Uid:1d648e36-1a1c-46cb-8b04-91769885543b,Namespace:calico-system,Attempt:0,} returns sandbox id \"968fc79849fb2e069721bbe01331cd5fa3ee9fecf757d4a15f2e476ffbca7521\"" Jan 23 23:32:43.928762 containerd[1666]: time="2026-01-23T23:32:43.928543672Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:43.930801 containerd[1666]: time="2026-01-23T23:32:43.930750039Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 23:32:43.930898 containerd[1666]: time="2026-01-23T23:32:43.930817479Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:43.931032 kubelet[2913]: E0123 23:32:43.930953 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:32:43.931093 kubelet[2913]: E0123 23:32:43.931044 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:32:43.931367 kubelet[2913]: E0123 23:32:43.931262 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:43.931603 containerd[1666]: time="2026-01-23T23:32:43.931447521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 23:32:43.932704 kubelet[2913]: E0123 23:32:43.932648 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:43.953444 systemd-networkd[1577]: calic5f97266cbb: Gained IPv6LL Jan 23 23:32:44.268657 containerd[1666]: time="2026-01-23T23:32:44.268524869Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:44.270518 containerd[1666]: time="2026-01-23T23:32:44.270426835Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 23:32:44.270597 containerd[1666]: time="2026-01-23T23:32:44.270520515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:44.270699 kubelet[2913]: E0123 23:32:44.270653 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:32:44.270738 kubelet[2913]: E0123 23:32:44.270709 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:32:44.270896 kubelet[2913]: E0123 23:32:44.270851 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-484f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Pr
obeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:44.272303 kubelet[2913]: E0123 23:32:44.272039 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" 
podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:32:44.721294 systemd-networkd[1577]: cali7f200a9e274: Gained IPv6LL Jan 23 23:32:44.784929 kubelet[2913]: E0123 23:32:44.784879 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:32:44.785682 kubelet[2913]: E0123 23:32:44.785548 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:44.786136 systemd-networkd[1577]: cali7a0d9e657b1: Gained IPv6LL Jan 23 23:32:44.787628 kubelet[2913]: E0123 23:32:44.787591 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:32:44.872000 audit[5046]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=5046 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:44.872000 audit[5046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff3178ec0 a2=0 a3=1 items=0 ppid=3074 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:44.872000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:44.891000 audit[5046]: NETFILTER_CFG table=nat:141 family=2 entries=56 op=nft_register_chain pid=5046 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:32:44.891000 audit[5046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=fffff3178ec0 a2=0 a3=1 items=0 ppid=3074 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:32:44.891000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:32:45.426122 systemd-networkd[1577]: calia60713d1967: Gained IPv6LL Jan 23 23:32:45.788402 kubelet[2913]: E0123 23:32:45.788013 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:32:52.638194 containerd[1666]: time="2026-01-23T23:32:52.638157274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 23:32:52.971604 containerd[1666]: time="2026-01-23T23:32:52.971372571Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:52.974048 containerd[1666]: time="2026-01-23T23:32:52.973923298Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 23:32:52.974048 containerd[1666]: time="2026-01-23T23:32:52.973940338Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:52.974158 kubelet[2913]: E0123 23:32:52.974124 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:32:52.974415 kubelet[2913]: E0123 23:32:52.974164 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:32:52.974415 kubelet[2913]: E0123 
23:32:52.974276 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:89d4259d6b224a32924e4c2e731fe8b2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:52.976988 containerd[1666]: time="2026-01-23T23:32:52.976779387Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 23:32:53.310942 containerd[1666]: time="2026-01-23T23:32:53.310798286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:53.312668 containerd[1666]: time="2026-01-23T23:32:53.312530131Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 23:32:53.312668 containerd[1666]: time="2026-01-23T23:32:53.312635971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:53.312936 kubelet[2913]: E0123 23:32:53.312820 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:32:53.312936 kubelet[2913]: E0123 23:32:53.312927 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:32:53.313484 kubelet[2913]: E0123 23:32:53.313409 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:53.314667 kubelet[2913]: E0123 23:32:53.314582 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:32:54.638637 containerd[1666]: time="2026-01-23T23:32:54.638429055Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:32:54.974320 containerd[1666]: time="2026-01-23T23:32:54.974186839Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:54.975883 containerd[1666]: time="2026-01-23T23:32:54.975837324Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:32:54.975986 containerd[1666]: time="2026-01-23T23:32:54.975877084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:54.976089 kubelet[2913]: E0123 23:32:54.976054 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:54.976564 kubelet[2913]: E0123 23:32:54.976096 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:54.976564 kubelet[2913]: E0123 23:32:54.976306 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gkz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:54.976697 containerd[1666]: time="2026-01-23T23:32:54.976513686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:32:54.977666 kubelet[2913]: E0123 23:32:54.977621 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:32:55.306411 containerd[1666]: time="2026-01-23T23:32:55.306277571Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
23:32:55.307633 containerd[1666]: time="2026-01-23T23:32:55.307534935Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:32:55.307633 containerd[1666]: time="2026-01-23T23:32:55.307577455Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:55.307797 kubelet[2913]: E0123 23:32:55.307748 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:55.307797 kubelet[2913]: E0123 23:32:55.307794 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:32:55.309135 kubelet[2913]: E0123 23:32:55.307924 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:55.309135 kubelet[2913]: E0123 23:32:55.309098 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:32:56.637431 containerd[1666]: time="2026-01-23T23:32:56.637387671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 23:32:56.975543 containerd[1666]: time="2026-01-23T23:32:56.975303182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:56.978513 containerd[1666]: time="2026-01-23T23:32:56.977095547Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 23:32:56.978513 containerd[1666]: time="2026-01-23T23:32:56.977170667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:56.978683 kubelet[2913]: E0123 23:32:56.977356 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:32:56.978683 kubelet[2913]: E0123 23:32:56.977431 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:32:56.978683 kubelet[2913]: E0123 23:32:56.977571 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:56.979697 containerd[1666]: time="2026-01-23T23:32:56.979662515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 23:32:57.317192 containerd[1666]: time="2026-01-23T23:32:57.317052504Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:57.319554 containerd[1666]: time="2026-01-23T23:32:57.319496631Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 23:32:57.319608 containerd[1666]: time="2026-01-23T23:32:57.319549551Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:57.319849 kubelet[2913]: E0123 23:32:57.319760 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:32:57.319849 kubelet[2913]: E0123 23:32:57.319808 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:32:57.320089 kubelet[2913]: E0123 23:32:57.319924 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:57.321215 kubelet[2913]: E0123 23:32:57.321080 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:32:58.637723 containerd[1666]: time="2026-01-23T23:32:58.637671211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 23:32:58.969517 containerd[1666]: time="2026-01-23T23:32:58.969391783Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:58.970729 containerd[1666]: time="2026-01-23T23:32:58.970658907Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 23:32:58.970893 containerd[1666]: time="2026-01-23T23:32:58.970735627Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:58.971010 kubelet[2913]: E0123 23:32:58.970934 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:32:58.971273 kubelet[2913]: E0123 23:32:58.971011 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:32:58.971580 kubelet[2913]: E0123 23:32:58.971521 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:58.972898 kubelet[2913]: E0123 23:32:58.972850 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:32:59.638847 containerd[1666]: time="2026-01-23T23:32:59.638626744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 23:32:59.971217 containerd[1666]: time="2026-01-23T23:32:59.971080758Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:32:59.974136 containerd[1666]: time="2026-01-23T23:32:59.973977647Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 23:32:59.974136 containerd[1666]: time="2026-01-23T23:32:59.974024967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 23:32:59.974361 kubelet[2913]: E0123 23:32:59.974291 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:32:59.975307 kubelet[2913]: E0123 23:32:59.974352 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:32:59.975307 kubelet[2913]: 
E0123 23:32:59.974507 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-484f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 23:32:59.976974 kubelet[2913]: E0123 23:32:59.976075 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:33:04.638193 kubelet[2913]: E0123 23:33:04.638120 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:33:07.641048 kubelet[2913]: E0123 23:33:07.640991 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:33:08.638275 kubelet[2913]: E0123 23:33:08.638233 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:33:10.638822 kubelet[2913]: E0123 23:33:10.638139 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:33:14.637591 kubelet[2913]: E0123 23:33:14.637537 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:33:14.637591 kubelet[2913]: E0123 23:33:14.637541 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:33:18.638467 containerd[1666]: 
time="2026-01-23T23:33:18.638388089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 23:33:18.974819 containerd[1666]: time="2026-01-23T23:33:18.974701955Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:18.976107 containerd[1666]: time="2026-01-23T23:33:18.976058919Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 23:33:18.976182 containerd[1666]: time="2026-01-23T23:33:18.976089719Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:18.976281 kubelet[2913]: E0123 23:33:18.976244 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:33:18.976640 kubelet[2913]: E0123 23:33:18.976290 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:33:18.976640 kubelet[2913]: E0123 23:33:18.976400 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 23 23:33:18.978409 containerd[1666]: time="2026-01-23T23:33:18.978377446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 23:33:19.300660 containerd[1666]: time="2026-01-23T23:33:19.300439068Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:19.301899 containerd[1666]: time="2026-01-23T23:33:19.301844952Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 23:33:19.301995 containerd[1666]: time="2026-01-23T23:33:19.301935313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:19.302173 kubelet[2913]: E0123 23:33:19.302107 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:33:19.302173 kubelet[2913]: E0123 23:33:19.302157 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:33:19.302336 kubelet[2913]: E0123 23:33:19.302286 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:19.303732 kubelet[2913]: E0123 23:33:19.303689 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:33:19.638993 containerd[1666]: time="2026-01-23T23:33:19.638807140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 23:33:19.973265 containerd[1666]: time="2026-01-23T23:33:19.973141640Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:19.975133 containerd[1666]: time="2026-01-23T23:33:19.975092526Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 23:33:19.975480 containerd[1666]: time="2026-01-23T23:33:19.975126286Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:19.975543 kubelet[2913]: E0123 23:33:19.975292 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:33:19.975543 kubelet[2913]: E0123 23:33:19.975348 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:33:19.975725 kubelet[2913]: E0123 23:33:19.975678 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:89d4259d6b224a32924e4c2e731fe8b2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:19.977939 containerd[1666]: time="2026-01-23T23:33:19.977809934Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 23:33:20.303838 containerd[1666]: time="2026-01-23T23:33:20.303709168Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:20.308824 containerd[1666]: time="2026-01-23T23:33:20.308758503Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 23:33:20.309154 containerd[1666]: time="2026-01-23T23:33:20.308852383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:20.311027 kubelet[2913]: E0123 23:33:20.309147 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:33:20.311027 kubelet[2913]: E0123 23:33:20.309190 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:33:20.311027 kubelet[2913]: E0123 23:33:20.309306 2913 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:20.311027 kubelet[2913]: E0123 23:33:20.310527 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:33:22.637813 containerd[1666]: time="2026-01-23T23:33:22.637726366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:33:22.958790 containerd[1666]: time="2026-01-23T23:33:22.958525664Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:22.960915 containerd[1666]: time="2026-01-23T23:33:22.960791751Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:33:22.960915 containerd[1666]: time="2026-01-23T23:33:22.960873472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:22.961115 kubelet[2913]: E0123 23:33:22.961064 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:33:22.961645 kubelet[2913]: E0123 23:33:22.961124 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:33:22.961645 kubelet[2913]: E0123 23:33:22.961361 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gkz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:22.961751 containerd[1666]: time="2026-01-23T23:33:22.961471953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:33:22.962914 kubelet[2913]: E0123 23:33:22.962883 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:33:23.297092 containerd[1666]: time="2026-01-23T23:33:23.296930216Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
23:33:23.299887 containerd[1666]: time="2026-01-23T23:33:23.299806945Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:33:23.300012 containerd[1666]: time="2026-01-23T23:33:23.299903825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:23.300195 kubelet[2913]: E0123 23:33:23.300120 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:33:23.300195 kubelet[2913]: E0123 23:33:23.300187 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:33:23.300745 kubelet[2913]: E0123 23:33:23.300361 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:23.302032 kubelet[2913]: E0123 23:33:23.302002 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:33:25.637385 containerd[1666]: time="2026-01-23T23:33:25.637332194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 23:33:26.121080 containerd[1666]: time="2026-01-23T23:33:26.121029549Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:26.125835 containerd[1666]: time="2026-01-23T23:33:26.125767644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 23:33:26.125968 containerd[1666]: time="2026-01-23T23:33:26.125850884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:26.128063 kubelet[2913]: E0123 23:33:26.128001 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:33:26.128994 kubelet[2913]: E0123 23:33:26.128458 2913 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:33:26.129170 kubelet[2913]: E0123 23:33:26.129115 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-484f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:26.131359 kubelet[2913]: E0123 23:33:26.130422 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:33:27.640184 
containerd[1666]: time="2026-01-23T23:33:27.640136862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 23:33:27.960062 containerd[1666]: time="2026-01-23T23:33:27.959802877Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:33:27.964165 containerd[1666]: time="2026-01-23T23:33:27.964088770Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 23:33:27.964428 containerd[1666]: time="2026-01-23T23:33:27.964138970Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 23:33:27.964595 kubelet[2913]: E0123 23:33:27.964530 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:33:27.964595 kubelet[2913]: E0123 23:33:27.964582 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:33:27.964907 kubelet[2913]: E0123 23:33:27.964716 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 23:33:27.965952 kubelet[2913]: E0123 23:33:27.965918 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:33:30.637670 kubelet[2913]: E0123 23:33:30.637614 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:33:30.638704 kubelet[2913]: E0123 23:33:30.638355 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:33:35.639482 kubelet[2913]: E0123 23:33:35.639339 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:33:37.638122 kubelet[2913]: E0123 23:33:37.638059 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:33:40.637587 kubelet[2913]: E0123 23:33:40.637526 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:33:41.639559 kubelet[2913]: E0123 23:33:41.639208 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:33:42.639890 kubelet[2913]: E0123 23:33:42.639288 2913 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:33:45.639206 kubelet[2913]: E0123 23:33:45.639151 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:33:48.638209 kubelet[2913]: E0123 23:33:48.638069 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:33:50.638350 kubelet[2913]: E0123 23:33:50.638295 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:33:53.638651 kubelet[2913]: E0123 23:33:53.638600 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:33:54.637788 
kubelet[2913]: E0123 23:33:54.637729 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:33:56.638228 kubelet[2913]: E0123 23:33:56.638165 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:33:57.639778 kubelet[2913]: E0123 23:33:57.639722 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:34:01.638024 kubelet[2913]: E0123 23:34:01.637703 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:34:03.638099 containerd[1666]: time="2026-01-23T23:34:03.638052048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:34:03.972907 containerd[1666]: time="2026-01-23T23:34:03.972779309Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:03.974660 containerd[1666]: time="2026-01-23T23:34:03.974617074Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:34:03.974793 containerd[1666]: time="2026-01-23T23:34:03.974734715Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:03.974903 kubelet[2913]: E0123 23:34:03.974871 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:34:03.975182 kubelet[2913]: E0123 23:34:03.974915 2913 kuberuntime_image.go:42] "Failed to pull 
image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:34:03.975552 kubelet[2913]: E0123 23:34:03.975446 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:03.976628 kubelet[2913]: E0123 23:34:03.976585 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:34:04.637483 containerd[1666]: time="2026-01-23T23:34:04.637440336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 23:34:04.978714 containerd[1666]: time="2026-01-23T23:34:04.978550616Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 
23:34:04.980706 containerd[1666]: time="2026-01-23T23:34:04.980641862Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 23:34:04.980809 containerd[1666]: time="2026-01-23T23:34:04.980737703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:04.980916 kubelet[2913]: E0123 23:34:04.980849 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:34:04.980916 kubelet[2913]: E0123 23:34:04.980899 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:34:04.981381 kubelet[2913]: E0123 23:34:04.981177 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 23 23:34:04.983194 containerd[1666]: time="2026-01-23T23:34:04.983152870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 23:34:05.315050 containerd[1666]: time="2026-01-23T23:34:05.314080919Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:05.316463 containerd[1666]: time="2026-01-23T23:34:05.316413406Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 23:34:05.316532 containerd[1666]: time="2026-01-23T23:34:05.316500527Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:05.316715 kubelet[2913]: E0123 23:34:05.316669 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:34:05.316766 kubelet[2913]: E0123 23:34:05.316720 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:34:05.316875 kubelet[2913]: E0123 23:34:05.316836 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:05.318182 kubelet[2913]: E0123 23:34:05.318130 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:34:08.640346 containerd[1666]: time="2026-01-23T23:34:08.640300383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 23:34:08.968014 containerd[1666]: time="2026-01-23T23:34:08.967565822Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:08.969090 containerd[1666]: time="2026-01-23T23:34:08.968971866Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 23:34:08.969090 containerd[1666]: time="2026-01-23T23:34:08.968977626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:08.969241 kubelet[2913]: E0123 23:34:08.969169 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:34:08.969241 kubelet[2913]: E0123 23:34:08.969220 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:34:08.969590 kubelet[2913]: E0123 23:34:08.969322 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:89d4259d6b224a32924e4c2e731fe8b2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:08.972069 containerd[1666]: time="2026-01-23T23:34:08.972045035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 23:34:09.291868 containerd[1666]: time="2026-01-23T23:34:09.291654690Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:09.293709 containerd[1666]: time="2026-01-23T23:34:09.293601896Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 23:34:09.293709 containerd[1666]: time="2026-01-23T23:34:09.293653696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:09.295149 kubelet[2913]: E0123 23:34:09.293849 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:34:09.295149 kubelet[2913]: E0123 23:34:09.293912 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:34:09.295149 kubelet[2913]: E0123 23:34:09.294064 2913 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:09.295737 kubelet[2913]: E0123 23:34:09.295686 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:34:09.638971 containerd[1666]: time="2026-01-23T23:34:09.638608748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 23:34:09.972392 containerd[1666]: time="2026-01-23T23:34:09.972188365Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:09.974127 containerd[1666]: time="2026-01-23T23:34:09.974068091Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 23:34:09.974221 containerd[1666]: time="2026-01-23T23:34:09.974148211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:09.974271 kubelet[2913]: E0123 23:34:09.974239 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:34:09.974500 kubelet[2913]: E0123 23:34:09.974282 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:34:09.974537 containerd[1666]: time="2026-01-23T23:34:09.974519093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 23:34:09.974992 kubelet[2913]: E0123 23:34:09.974862 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pai
r,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:09.976184 kubelet[2913]: E0123 23:34:09.975971 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:34:10.480179 containerd[1666]: time="2026-01-23T23:34:10.480073234Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:10.482345 containerd[1666]: time="2026-01-23T23:34:10.482293601Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 23:34:10.482425 containerd[1666]: time="2026-01-23T23:34:10.482341841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:10.482812 kubelet[2913]: E0123 23:34:10.482546 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:34:10.482812 kubelet[2913]: E0123 23:34:10.482595 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 23:34:10.482812 kubelet[2913]: E0123 23:34:10.482738 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-484f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-68f445db44-27ndm_calico-system(1d648e36-1a1c-46cb-8b04-91769885543b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:10.483931 kubelet[2913]: E0123 23:34:10.483893 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:34:15.640846 containerd[1666]: time="2026-01-23T23:34:15.640572973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:34:15.899748 systemd[1]: Started sshd@9-10.0.10.88:22-68.220.241.50:44928.service - OpenSSH per-connection server daemon (68.220.241.50:44928). 
Jan 23 23:34:15.904231 kernel: kauditd_printk_skb: 239 callbacks suppressed Jan 23 23:34:15.904278 kernel: audit: type=1130 audit(1769211255.898:740): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.10.88:22-68.220.241.50:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:15.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.10.88:22-68.220.241.50:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:15.969077 containerd[1666]: time="2026-01-23T23:34:15.969026854Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:34:15.971126 containerd[1666]: time="2026-01-23T23:34:15.971066781Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:34:15.971188 containerd[1666]: time="2026-01-23T23:34:15.971127261Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:34:15.971346 kubelet[2913]: E0123 23:34:15.971292 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:34:15.971346 kubelet[2913]: E0123 23:34:15.971342 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:34:15.971669 kubelet[2913]: E0123 23:34:15.971515 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gkz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:34:15.973151 kubelet[2913]: E0123 23:34:15.973002 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:34:16.429000 audit[5201]: USER_ACCT pid=5201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh 
res=success' Jan 23 23:34:16.431132 sshd[5201]: Accepted publickey for core from 68.220.241.50 port 44928 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:16.435055 kernel: audit: type=1101 audit(1769211256.429:741): pid=5201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.434000 audit[5201]: CRED_ACQ pid=5201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.435979 sshd-session[5201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:16.442210 kernel: audit: type=1103 audit(1769211256.434:742): pid=5201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.442497 kernel: audit: type=1006 audit(1769211256.434:743): pid=5201 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 23 23:34:16.446825 kernel: audit: type=1300 audit(1769211256.434:743): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffed38010 a2=3 a3=0 items=0 ppid=1 pid=5201 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:16.434000 audit[5201]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffed38010 a2=3 a3=0 items=0 ppid=1 pid=5201 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:16.445536 systemd-logind[1646]: New session 11 of user core. Jan 23 23:34:16.434000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:16.449248 kernel: audit: type=1327 audit(1769211256.434:743): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:16.456352 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 23 23:34:16.458000 audit[5201]: USER_START pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.463976 kernel: audit: type=1105 audit(1769211256.458:744): pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.464000 audit[5205]: CRED_ACQ pid=5205 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.468974 kernel: audit: type=1103 audit(1769211256.464:745): pid=5205 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.639087 kubelet[2913]: E0123 23:34:16.639031 2913 pod_workers.go:1301] "Error syncing 
pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:34:16.799190 sshd[5205]: Connection closed by 68.220.241.50 port 44928 Jan 23 23:34:16.799189 sshd-session[5201]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:16.800000 audit[5201]: USER_END pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.806542 systemd[1]: sshd@9-10.0.10.88:22-68.220.241.50:44928.service: Deactivated successfully. Jan 23 23:34:16.808367 systemd[1]: session-11.scope: Deactivated successfully. 
Jan 23 23:34:16.800000 audit[5201]: CRED_DISP pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.812146 kernel: audit: type=1106 audit(1769211256.800:746): pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.812258 kernel: audit: type=1104 audit(1769211256.800:747): pid=5201 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:16.812191 systemd-logind[1646]: Session 11 logged out. Waiting for processes to exit. Jan 23 23:34:16.805000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.10.88:22-68.220.241.50:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:16.813537 systemd-logind[1646]: Removed session 11. 
Jan 23 23:34:17.637495 kubelet[2913]: E0123 23:34:17.637431 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:34:21.905233 systemd[1]: Started sshd@10-10.0.10.88:22-68.220.241.50:44942.service - OpenSSH per-connection server daemon (68.220.241.50:44942). Jan 23 23:34:21.906179 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:34:21.906249 kernel: audit: type=1130 audit(1769211261.904:749): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.10.88:22-68.220.241.50:44942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:21.904000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.10.88:22-68.220.241.50:44942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:34:22.436000 audit[5221]: USER_ACCT pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.438051 sshd[5221]: Accepted publickey for core from 68.220.241.50 port 44942 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:22.440000 audit[5221]: CRED_ACQ pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.442355 sshd-session[5221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:22.444394 kernel: audit: type=1101 audit(1769211262.436:750): pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.444473 kernel: audit: type=1103 audit(1769211262.440:751): pid=5221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.446286 kernel: audit: type=1006 audit(1769211262.440:752): pid=5221 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 23 23:34:22.440000 audit[5221]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe48f1040 a2=3 a3=0 items=0 ppid=1 pid=5221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:22.449699 kernel: audit: type=1300 audit(1769211262.440:752): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe48f1040 a2=3 a3=0 items=0 ppid=1 pid=5221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:22.440000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:22.451066 kernel: audit: type=1327 audit(1769211262.440:752): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:22.451804 systemd-logind[1646]: New session 12 of user core. Jan 23 23:34:22.467245 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 23 23:34:22.468000 audit[5221]: USER_START pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.470000 audit[5225]: CRED_ACQ pid=5225 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.476225 kernel: audit: type=1105 audit(1769211262.468:753): pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.476326 kernel: audit: type=1103 audit(1769211262.470:754): 
pid=5225 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.788153 sshd[5225]: Connection closed by 68.220.241.50 port 44942 Jan 23 23:34:22.788449 sshd-session[5221]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:22.789000 audit[5221]: USER_END pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.793682 systemd[1]: sshd@10-10.0.10.88:22-68.220.241.50:44942.service: Deactivated successfully. Jan 23 23:34:22.789000 audit[5221]: CRED_DISP pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.796576 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 23:34:22.797761 systemd-logind[1646]: Session 12 logged out. Waiting for processes to exit. 
Jan 23 23:34:22.797898 kernel: audit: type=1106 audit(1769211262.789:755): pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.797941 kernel: audit: type=1104 audit(1769211262.789:756): pid=5221 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:22.792000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.10.88:22-68.220.241.50:44942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:22.799310 systemd-logind[1646]: Removed session 12. 
Jan 23 23:34:23.638979 kubelet[2913]: E0123 23:34:23.638900 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:34:24.637380 kubelet[2913]: E0123 23:34:24.637092 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:34:25.638910 kubelet[2913]: E0123 23:34:25.638850 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:34:27.898624 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:34:27.898734 kernel: audit: type=1130 audit(1769211267.895:758): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.10.88:22-68.220.241.50:47118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:27.895000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.10.88:22-68.220.241.50:47118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:27.896723 systemd[1]: Started sshd@11-10.0.10.88:22-68.220.241.50:47118.service - OpenSSH per-connection server daemon (68.220.241.50:47118). Jan 23 23:34:28.400000 audit[5243]: USER_ACCT pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.401730 sshd[5243]: Accepted publickey for core from 68.220.241.50 port 47118 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:28.405020 kernel: audit: type=1101 audit(1769211268.400:759): pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.404000 audit[5243]: CRED_ACQ pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 
addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.406245 sshd-session[5243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:28.410301 kernel: audit: type=1103 audit(1769211268.404:760): pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.410386 kernel: audit: type=1006 audit(1769211268.404:761): pid=5243 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 23 23:34:28.404000 audit[5243]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffca58ed20 a2=3 a3=0 items=0 ppid=1 pid=5243 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:28.413719 kernel: audit: type=1300 audit(1769211268.404:761): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffca58ed20 a2=3 a3=0 items=0 ppid=1 pid=5243 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:28.414004 kernel: audit: type=1327 audit(1769211268.404:761): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:28.404000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:28.415638 systemd-logind[1646]: New session 13 of user core. Jan 23 23:34:28.427218 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 23 23:34:28.429000 audit[5243]: USER_START pid=5243 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.431000 audit[5247]: CRED_ACQ pid=5247 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.436709 kernel: audit: type=1105 audit(1769211268.429:762): pid=5243 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.436763 kernel: audit: type=1103 audit(1769211268.431:763): pid=5247 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.746047 sshd[5247]: Connection closed by 68.220.241.50 port 47118 Jan 23 23:34:28.746143 sshd-session[5243]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:28.747000 audit[5243]: USER_END pid=5243 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.747000 audit[5243]: CRED_DISP pid=5243 uid=0 auid=500 
ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.751242 systemd[1]: sshd@11-10.0.10.88:22-68.220.241.50:47118.service: Deactivated successfully. Jan 23 23:34:28.753507 systemd[1]: session-13.scope: Deactivated successfully. Jan 23 23:34:28.755102 kernel: audit: type=1106 audit(1769211268.747:764): pid=5243 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.755173 kernel: audit: type=1104 audit(1769211268.747:765): pid=5243 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:28.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.10.88:22-68.220.241.50:47118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:28.755762 systemd-logind[1646]: Session 13 logged out. Waiting for processes to exit. Jan 23 23:34:28.758016 systemd-logind[1646]: Removed session 13. Jan 23 23:34:28.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.10.88:22-68.220.241.50:47132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:28.855244 systemd[1]: Started sshd@12-10.0.10.88:22-68.220.241.50:47132.service - OpenSSH per-connection server daemon (68.220.241.50:47132). 
Jan 23 23:34:29.379839 sshd[5262]: Accepted publickey for core from 68.220.241.50 port 47132 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:29.378000 audit[5262]: USER_ACCT pid=5262 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.379000 audit[5262]: CRED_ACQ pid=5262 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.379000 audit[5262]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd1f6390 a2=3 a3=0 items=0 ppid=1 pid=5262 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:29.379000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:29.381634 sshd-session[5262]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:29.386137 systemd-logind[1646]: New session 14 of user core. Jan 23 23:34:29.396220 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 23 23:34:29.398000 audit[5262]: USER_START pid=5262 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.400000 audit[5266]: CRED_ACQ pid=5266 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.641938 kubelet[2913]: E0123 23:34:29.640564 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:34:29.786824 sshd[5266]: Connection closed by 68.220.241.50 port 47132 Jan 23 23:34:29.787131 sshd-session[5262]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:29.787000 audit[5262]: USER_END pid=5262 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.787000 audit[5262]: CRED_DISP pid=5262 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:29.791368 systemd[1]: sshd@12-10.0.10.88:22-68.220.241.50:47132.service: Deactivated successfully. Jan 23 23:34:29.792000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.10.88:22-68.220.241.50:47132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:29.795040 systemd[1]: session-14.scope: Deactivated successfully. Jan 23 23:34:29.796532 systemd-logind[1646]: Session 14 logged out. Waiting for processes to exit. Jan 23 23:34:29.798140 systemd-logind[1646]: Removed session 14. Jan 23 23:34:29.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.10.88:22-68.220.241.50:47136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:29.891115 systemd[1]: Started sshd@13-10.0.10.88:22-68.220.241.50:47136.service - OpenSSH per-connection server daemon (68.220.241.50:47136). 
Jan 23 23:34:30.410000 audit[5278]: USER_ACCT pid=5278 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.412005 sshd[5278]: Accepted publickey for core from 68.220.241.50 port 47136 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:30.412000 audit[5278]: CRED_ACQ pid=5278 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.412000 audit[5278]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc30bb660 a2=3 a3=0 items=0 ppid=1 pid=5278 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:30.412000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:30.413919 sshd-session[5278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:30.421175 systemd-logind[1646]: New session 15 of user core. Jan 23 23:34:30.431302 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 23 23:34:30.433000 audit[5278]: USER_START pid=5278 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.434000 audit[5285]: CRED_ACQ pid=5285 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.637556 kubelet[2913]: E0123 23:34:30.637478 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:34:30.767628 sshd[5285]: Connection closed by 68.220.241.50 port 47136 Jan 23 23:34:30.768058 sshd-session[5278]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:30.768000 audit[5278]: USER_END pid=5278 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.768000 audit[5278]: CRED_DISP pid=5278 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:30.772866 systemd[1]: sshd@13-10.0.10.88:22-68.220.241.50:47136.service: Deactivated successfully. Jan 23 23:34:30.772000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.10.88:22-68.220.241.50:47136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:30.775519 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 23:34:30.776431 systemd-logind[1646]: Session 15 logged out. Waiting for processes to exit. Jan 23 23:34:30.777456 systemd-logind[1646]: Removed session 15. Jan 23 23:34:32.637296 kubelet[2913]: E0123 23:34:32.637252 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:34:34.638387 kubelet[2913]: E0123 23:34:34.638306 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:34:35.867433 systemd[1]: Started sshd@14-10.0.10.88:22-68.220.241.50:52924.service - OpenSSH per-connection server daemon (68.220.241.50:52924). Jan 23 23:34:35.871025 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 23 23:34:35.871111 kernel: audit: type=1130 audit(1769211275.866:785): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.10.88:22-68.220.241.50:52924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:35.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.10.88:22-68.220.241.50:52924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:34:36.381000 audit[5298]: USER_ACCT pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.386229 sshd[5298]: Accepted publickey for core from 68.220.241.50 port 52924 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:36.386991 kernel: audit: type=1101 audit(1769211276.381:786): pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.385000 audit[5298]: CRED_ACQ pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.390401 sshd-session[5298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:36.392516 kernel: audit: type=1103 audit(1769211276.385:787): pid=5298 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.392568 kernel: audit: type=1006 audit(1769211276.385:788): pid=5298 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 23 23:34:36.385000 audit[5298]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0894300 a2=3 a3=0 items=0 ppid=1 pid=5298 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:36.396299 kernel: audit: type=1300 audit(1769211276.385:788): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0894300 a2=3 a3=0 items=0 ppid=1 pid=5298 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:36.396431 kernel: audit: type=1327 audit(1769211276.385:788): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:36.385000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:36.400057 systemd-logind[1646]: New session 16 of user core. Jan 23 23:34:36.407211 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 23 23:34:36.410000 audit[5298]: USER_START pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.412000 audit[5302]: CRED_ACQ pid=5302 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.418282 kernel: audit: type=1105 audit(1769211276.410:789): pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.418361 kernel: audit: type=1103 audit(1769211276.412:790): 
pid=5302 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.638181 kubelet[2913]: E0123 23:34:36.638007 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:34:36.638945 kubelet[2913]: E0123 23:34:36.638572 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:34:36.731322 sshd[5302]: Connection closed by 68.220.241.50 port 52924 Jan 23 23:34:36.731661 sshd-session[5298]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:36.731000 audit[5298]: USER_END pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.736097 
systemd[1]: sshd@14-10.0.10.88:22-68.220.241.50:52924.service: Deactivated successfully. Jan 23 23:34:36.731000 audit[5298]: CRED_DISP pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.739576 kernel: audit: type=1106 audit(1769211276.731:791): pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.739645 kernel: audit: type=1104 audit(1769211276.731:792): pid=5298 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:36.737000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.10.88:22-68.220.241.50:52924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:36.739630 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 23:34:36.741354 systemd-logind[1646]: Session 16 logged out. Waiting for processes to exit. Jan 23 23:34:36.742666 systemd-logind[1646]: Removed session 16. Jan 23 23:34:36.837279 systemd[1]: Started sshd@15-10.0.10.88:22-68.220.241.50:52936.service - OpenSSH per-connection server daemon (68.220.241.50:52936). Jan 23 23:34:36.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.10.88:22-68.220.241.50:52936 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:34:37.350000 audit[5316]: USER_ACCT pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.352707 sshd[5316]: Accepted publickey for core from 68.220.241.50 port 52936 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:37.352000 audit[5316]: CRED_ACQ pid=5316 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.352000 audit[5316]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd49d160 a2=3 a3=0 items=0 ppid=1 pid=5316 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:37.352000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:37.354139 sshd-session[5316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:37.360023 systemd-logind[1646]: New session 17 of user core. Jan 23 23:34:37.370233 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 23 23:34:37.371000 audit[5316]: USER_START pid=5316 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.373000 audit[5321]: CRED_ACQ pid=5321 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.767041 sshd[5321]: Connection closed by 68.220.241.50 port 52936 Jan 23 23:34:37.767660 sshd-session[5316]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:37.768000 audit[5316]: USER_END pid=5316 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.768000 audit[5316]: CRED_DISP pid=5316 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:37.773412 systemd[1]: sshd@15-10.0.10.88:22-68.220.241.50:52936.service: Deactivated successfully. Jan 23 23:34:37.772000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.10.88:22-68.220.241.50:52936 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:37.775900 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 23 23:34:37.776649 systemd-logind[1646]: Session 17 logged out. Waiting for processes to exit. Jan 23 23:34:37.778123 systemd-logind[1646]: Removed session 17. Jan 23 23:34:37.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.10.88:22-68.220.241.50:52952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:37.875394 systemd[1]: Started sshd@16-10.0.10.88:22-68.220.241.50:52952.service - OpenSSH per-connection server daemon (68.220.241.50:52952). Jan 23 23:34:38.388000 audit[5332]: USER_ACCT pid=5332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:38.389938 sshd[5332]: Accepted publickey for core from 68.220.241.50 port 52952 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:38.390000 audit[5332]: CRED_ACQ pid=5332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:38.390000 audit[5332]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed3a5f40 a2=3 a3=0 items=0 ppid=1 pid=5332 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:38.390000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:38.391753 sshd-session[5332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:38.395913 systemd-logind[1646]: New session 18 of user core. 
Jan 23 23:34:38.405299 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 23 23:34:38.406000 audit[5332]: USER_START pid=5332 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:38.408000 audit[5336]: CRED_ACQ pid=5336 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.196000 audit[5347]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5347 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:39.196000 audit[5347]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffffde22db0 a2=0 a3=1 items=0 ppid=3074 pid=5347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:39.196000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:39.205000 audit[5347]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5347 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:39.205000 audit[5347]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffffde22db0 a2=0 a3=1 items=0 ppid=3074 pid=5347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:39.205000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:39.224000 audit[5349]: NETFILTER_CFG table=filter:144 family=2 entries=38 op=nft_register_rule pid=5349 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:39.224000 audit[5349]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff4ffeff0 a2=0 a3=1 items=0 ppid=3074 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:39.224000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:39.229000 audit[5349]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5349 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:39.229000 audit[5349]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff4ffeff0 a2=0 a3=1 items=0 ppid=3074 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:39.229000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:39.293374 sshd[5336]: Connection closed by 68.220.241.50 port 52952 Jan 23 23:34:39.293663 sshd-session[5332]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:39.296000 audit[5332]: USER_END pid=5332 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 
addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.296000 audit[5332]: CRED_DISP pid=5332 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.301580 systemd-logind[1646]: Session 18 logged out. Waiting for processes to exit. Jan 23 23:34:39.301784 systemd[1]: sshd@16-10.0.10.88:22-68.220.241.50:52952.service: Deactivated successfully. Jan 23 23:34:39.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.10.88:22-68.220.241.50:52952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:39.305762 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 23:34:39.312658 systemd-logind[1646]: Removed session 18. Jan 23 23:34:39.399000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.10.88:22-68.220.241.50:52966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:39.399859 systemd[1]: Started sshd@17-10.0.10.88:22-68.220.241.50:52966.service - OpenSSH per-connection server daemon (68.220.241.50:52966). 
Jan 23 23:34:39.919000 audit[5354]: USER_ACCT pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.920989 sshd[5354]: Accepted publickey for core from 68.220.241.50 port 52966 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:39.920000 audit[5354]: CRED_ACQ pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.920000 audit[5354]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcda9e2d0 a2=3 a3=0 items=0 ppid=1 pid=5354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:39.920000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:39.922571 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:39.927666 systemd-logind[1646]: New session 19 of user core. Jan 23 23:34:39.936230 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 23 23:34:39.937000 audit[5354]: USER_START pid=5354 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:39.939000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:40.375885 sshd[5358]: Connection closed by 68.220.241.50 port 52966 Jan 23 23:34:40.376298 sshd-session[5354]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:40.376000 audit[5354]: USER_END pid=5354 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:40.376000 audit[5354]: CRED_DISP pid=5354 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:40.380552 systemd[1]: sshd@17-10.0.10.88:22-68.220.241.50:52966.service: Deactivated successfully. Jan 23 23:34:40.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.10.88:22-68.220.241.50:52966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:40.382648 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 23 23:34:40.383835 systemd-logind[1646]: Session 19 logged out. Waiting for processes to exit. Jan 23 23:34:40.385486 systemd-logind[1646]: Removed session 19. Jan 23 23:34:40.490000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.10.88:22-68.220.241.50:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:40.490892 systemd[1]: Started sshd@18-10.0.10.88:22-68.220.241.50:52970.service - OpenSSH per-connection server daemon (68.220.241.50:52970). Jan 23 23:34:41.004000 audit[5370]: USER_ACCT pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.005888 sshd[5370]: Accepted publickey for core from 68.220.241.50 port 52970 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:41.008758 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 23 23:34:41.008845 kernel: audit: type=1101 audit(1769211281.004:826): pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.008867 kernel: audit: type=1103 audit(1769211281.005:827): pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.005000 audit[5370]: CRED_ACQ pid=5370 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.009607 sshd-session[5370]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:41.014133 kernel: audit: type=1006 audit(1769211281.005:828): pid=5370 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 23 23:34:41.016080 kernel: audit: type=1300 audit(1769211281.005:828): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff51e9930 a2=3 a3=0 items=0 ppid=1 pid=5370 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:41.005000 audit[5370]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff51e9930 a2=3 a3=0 items=0 ppid=1 pid=5370 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:41.005000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:41.021647 kernel: audit: type=1327 audit(1769211281.005:828): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:41.021935 systemd-logind[1646]: New session 20 of user core. Jan 23 23:34:41.029153 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 23 23:34:41.030000 audit[5370]: USER_START pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.035987 kernel: audit: type=1105 audit(1769211281.030:829): pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.035000 audit[5374]: CRED_ACQ pid=5374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.040026 kernel: audit: type=1103 audit(1769211281.035:830): pid=5374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.353952 sshd[5374]: Connection closed by 68.220.241.50 port 52970 Jan 23 23:34:41.355690 sshd-session[5370]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:41.355000 audit[5370]: USER_END pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.359711 systemd-logind[1646]: Session 20 logged out. 
Waiting for processes to exit. Jan 23 23:34:41.359912 systemd[1]: sshd@18-10.0.10.88:22-68.220.241.50:52970.service: Deactivated successfully. Jan 23 23:34:41.356000 audit[5370]: CRED_DISP pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.362399 systemd[1]: session-20.scope: Deactivated successfully. Jan 23 23:34:41.363980 kernel: audit: type=1106 audit(1769211281.355:831): pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.364058 kernel: audit: type=1104 audit(1769211281.356:832): pid=5370 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:41.364087 kernel: audit: type=1131 audit(1769211281.360:833): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.10.88:22-68.220.241.50:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:41.360000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.10.88:22-68.220.241.50:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:41.366166 systemd-logind[1646]: Removed session 20. 
Jan 23 23:34:41.639397 kubelet[2913]: E0123 23:34:41.639279 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:34:43.638907 kubelet[2913]: E0123 23:34:43.638856 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:34:43.715000 audit[5414]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5414 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:43.715000 audit[5414]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff94fe260 a2=0 a3=1 items=0 ppid=3074 pid=5414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:43.715000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:43.722000 audit[5414]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5414 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 23:34:43.722000 audit[5414]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=fffff94fe260 a2=0 a3=1 items=0 ppid=3074 pid=5414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:43.722000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 23:34:44.637232 kubelet[2913]: E0123 23:34:44.637184 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:34:46.461000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.10.88:22-68.220.241.50:49032 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:46.462095 systemd[1]: Started sshd@19-10.0.10.88:22-68.220.241.50:49032.service - OpenSSH per-connection server daemon (68.220.241.50:49032). 
Jan 23 23:34:46.463517 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 23 23:34:46.463550 kernel: audit: type=1130 audit(1769211286.461:836): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.10.88:22-68.220.241.50:49032 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:46.982000 audit[5416]: USER_ACCT pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:46.983989 sshd[5416]: Accepted publickey for core from 68.220.241.50 port 49032 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:46.986000 audit[5416]: CRED_ACQ pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:46.988281 sshd-session[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:46.989980 kernel: audit: type=1101 audit(1769211286.982:837): pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:46.990028 kernel: audit: type=1103 audit(1769211286.986:838): pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:46.992043 kernel: audit: type=1006 audit(1769211286.986:839): pid=5416 
uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 23 23:34:46.993019 kernel: audit: type=1300 audit(1769211286.986:839): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd356a580 a2=3 a3=0 items=0 ppid=1 pid=5416 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:46.986000 audit[5416]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd356a580 a2=3 a3=0 items=0 ppid=1 pid=5416 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:46.986000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:46.996731 kernel: audit: type=1327 audit(1769211286.986:839): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:47.000416 systemd-logind[1646]: New session 21 of user core. Jan 23 23:34:47.010321 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 23 23:34:47.012000 audit[5416]: USER_START pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.016000 audit[5420]: CRED_ACQ pid=5420 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.020091 kernel: audit: type=1105 audit(1769211287.012:840): pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.020162 kernel: audit: type=1103 audit(1769211287.016:841): pid=5420 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.338007 sshd[5420]: Connection closed by 68.220.241.50 port 49032 Jan 23 23:34:47.338528 sshd-session[5416]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:47.339000 audit[5416]: USER_END pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.343352 systemd[1]: 
sshd@19-10.0.10.88:22-68.220.241.50:49032.service: Deactivated successfully. Jan 23 23:34:47.339000 audit[5416]: CRED_DISP pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.345206 systemd[1]: session-21.scope: Deactivated successfully. Jan 23 23:34:47.346622 systemd-logind[1646]: Session 21 logged out. Waiting for processes to exit. Jan 23 23:34:47.346887 kernel: audit: type=1106 audit(1769211287.339:842): pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.347054 kernel: audit: type=1104 audit(1769211287.339:843): pid=5416 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:47.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.10.88:22-68.220.241.50:49032 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:47.348342 systemd-logind[1646]: Removed session 21. 
Jan 23 23:34:48.638569 kubelet[2913]: E0123 23:34:48.638504 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:34:49.638586 kubelet[2913]: E0123 23:34:49.638366 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:34:50.637559 kubelet[2913]: E0123 23:34:50.637515 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:34:52.449000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.10.88:22-68.220.241.50:49042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:52.452692 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:34:52.452741 kernel: audit: type=1130 audit(1769211292.449:845): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.10.88:22-68.220.241.50:49042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:52.451739 systemd[1]: Started sshd@20-10.0.10.88:22-68.220.241.50:49042.service - OpenSSH per-connection server daemon (68.220.241.50:49042). Jan 23 23:34:52.638283 kubelet[2913]: E0123 23:34:52.638213 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:34:52.990000 audit[5434]: USER_ACCT pid=5434 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:52.992159 sshd[5434]: Accepted publickey for core from 68.220.241.50 port 49042 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:52.994000 audit[5434]: CRED_ACQ pid=5434 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:52.996231 sshd-session[5434]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:52.999941 kernel: audit: type=1101 audit(1769211292.990:846): pid=5434 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.000045 kernel: audit: type=1103 audit(1769211292.994:847): pid=5434 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.000102 kernel: audit: type=1006 audit(1769211292.994:848): pid=5434 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 23 23:34:52.994000 audit[5434]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe0180ca0 a2=3 a3=0 items=0 ppid=1 pid=5434 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 23:34:53.006336 kernel: audit: type=1300 audit(1769211292.994:848): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe0180ca0 a2=3 a3=0 items=0 ppid=1 pid=5434 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:52.994000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:53.007726 kernel: audit: type=1327 audit(1769211292.994:848): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:53.010867 systemd-logind[1646]: New session 22 of user core. Jan 23 23:34:53.021166 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 23 23:34:53.021000 audit[5434]: USER_START pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.025000 audit[5438]: CRED_ACQ pid=5438 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.030626 kernel: audit: type=1105 audit(1769211293.021:849): pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.030711 kernel: audit: type=1103 audit(1769211293.025:850): pid=5438 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.352749 sshd[5438]: Connection closed by 68.220.241.50 port 49042 Jan 23 23:34:53.353155 sshd-session[5434]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:53.352000 audit[5434]: USER_END pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.359487 systemd[1]: sshd@20-10.0.10.88:22-68.220.241.50:49042.service: Deactivated successfully. Jan 23 23:34:53.362622 systemd[1]: session-22.scope: Deactivated successfully. Jan 23 23:34:53.352000 audit[5434]: CRED_DISP pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.366656 kernel: audit: type=1106 audit(1769211293.352:851): pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.366771 kernel: audit: type=1104 audit(1769211293.352:852): pid=5434 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:53.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.10.88:22-68.220.241.50:49042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:53.372144 systemd-logind[1646]: Session 22 logged out. Waiting for processes to exit. Jan 23 23:34:53.374419 systemd-logind[1646]: Removed session 22. Jan 23 23:34:56.637542 kubelet[2913]: E0123 23:34:56.637475 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:34:57.638618 kubelet[2913]: E0123 23:34:57.638533 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:34:58.464588 systemd[1]: Started sshd@21-10.0.10.88:22-68.220.241.50:38932.service - OpenSSH per-connection server daemon (68.220.241.50:38932). Jan 23 23:34:58.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.10.88:22-68.220.241.50:38932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 23:34:58.468133 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:34:58.468187 kernel: audit: type=1130 audit(1769211298.463:854): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.10.88:22-68.220.241.50:38932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:59.005000 audit[5453]: USER_ACCT pid=5453 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.008165 sshd[5453]: Accepted publickey for core from 68.220.241.50 port 38932 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:34:59.010853 sshd-session[5453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:34:59.008000 audit[5453]: CRED_ACQ pid=5453 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.014027 kernel: audit: type=1101 audit(1769211299.005:855): pid=5453 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.014091 kernel: audit: type=1103 audit(1769211299.008:856): pid=5453 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.014112 kernel: audit: type=1006 
audit(1769211299.008:857): pid=5453 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 23 23:34:59.008000 audit[5453]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe83f10d0 a2=3 a3=0 items=0 ppid=1 pid=5453 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:59.019208 kernel: audit: type=1300 audit(1769211299.008:857): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe83f10d0 a2=3 a3=0 items=0 ppid=1 pid=5453 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:34:59.008000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:59.020655 kernel: audit: type=1327 audit(1769211299.008:857): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:34:59.021239 systemd-logind[1646]: New session 23 of user core. Jan 23 23:34:59.038154 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 23 23:34:59.038000 audit[5453]: USER_START pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.039000 audit[5457]: CRED_ACQ pid=5457 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.046532 kernel: audit: type=1105 audit(1769211299.038:858): pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.046595 kernel: audit: type=1103 audit(1769211299.039:859): pid=5457 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.367060 sshd[5457]: Connection closed by 68.220.241.50 port 38932 Jan 23 23:34:59.367813 sshd-session[5453]: pam_unix(sshd:session): session closed for user core Jan 23 23:34:59.367000 audit[5453]: USER_END pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.372820 systemd[1]: 
sshd@21-10.0.10.88:22-68.220.241.50:38932.service: Deactivated successfully. Jan 23 23:34:59.368000 audit[5453]: CRED_DISP pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.375022 systemd[1]: session-23.scope: Deactivated successfully. Jan 23 23:34:59.376437 kernel: audit: type=1106 audit(1769211299.367:860): pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.376548 kernel: audit: type=1104 audit(1769211299.368:861): pid=5453 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:34:59.377586 systemd-logind[1646]: Session 23 logged out. Waiting for processes to exit. Jan 23 23:34:59.371000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.10.88:22-68.220.241.50:38932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:34:59.378708 systemd-logind[1646]: Removed session 23. 
Jan 23 23:34:59.639594 kubelet[2913]: E0123 23:34:59.639459 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:35:02.637421 kubelet[2913]: E0123 23:35:02.637376 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:35:02.637848 kubelet[2913]: E0123 23:35:02.637414 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:35:04.475000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.10.88:22-68.220.241.50:53098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:35:04.476459 systemd[1]: Started sshd@22-10.0.10.88:22-68.220.241.50:53098.service - OpenSSH per-connection server daemon (68.220.241.50:53098). Jan 23 23:35:04.482077 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:35:04.482190 kernel: audit: type=1130 audit(1769211304.475:863): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.10.88:22-68.220.241.50:53098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:35:05.018000 audit[5473]: USER_ACCT pid=5473 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.020355 sshd[5473]: Accepted publickey for core from 68.220.241.50 port 53098 ssh2: RSA SHA256:uAdbPsXXjav5Vt41mNhqY3SeXhQmGg8xPzi0M2sqAvw Jan 23 23:35:05.024006 kernel: audit: type=1101 audit(1769211305.018:864): pid=5473 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.023000 audit[5473]: CRED_ACQ pid=5473 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.026686 sshd-session[5473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 23:35:05.029851 kernel: audit: type=1103 audit(1769211305.023:865): pid=5473 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.029942 kernel: audit: type=1006 audit(1769211305.024:866): pid=5473 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 23 23:35:05.024000 audit[5473]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd5610f0 a2=3 a3=0 items=0 ppid=1 pid=5473 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:05.033314 kernel: audit: type=1300 audit(1769211305.024:866): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd5610f0 a2=3 a3=0 items=0 ppid=1 pid=5473 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:05.033508 kernel: audit: type=1327 audit(1769211305.024:866): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:35:05.024000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 23:35:05.037019 systemd-logind[1646]: New session 24 of user core. Jan 23 23:35:05.044186 systemd[1]: Started session-24.scope - Session 24 of User core. 
Jan 23 23:35:05.046000 audit[5473]: USER_START pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.047000 audit[5477]: CRED_ACQ pid=5477 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.053371 kernel: audit: type=1105 audit(1769211305.046:867): pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.053468 kernel: audit: type=1103 audit(1769211305.047:868): pid=5477 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.369019 sshd[5477]: Connection closed by 68.220.241.50 port 53098 Jan 23 23:35:05.370140 sshd-session[5473]: pam_unix(sshd:session): session closed for user core Jan 23 23:35:05.370000 audit[5473]: USER_END pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.376001 systemd[1]: 
sshd@22-10.0.10.88:22-68.220.241.50:53098.service: Deactivated successfully. Jan 23 23:35:05.370000 audit[5473]: CRED_DISP pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.378913 kernel: audit: type=1106 audit(1769211305.370:869): pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.378997 kernel: audit: type=1104 audit(1769211305.370:870): pid=5473 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 23 23:35:05.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.10.88:22-68.220.241.50:53098 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 23:35:05.380464 systemd[1]: session-24.scope: Deactivated successfully. Jan 23 23:35:05.381943 systemd-logind[1646]: Session 24 logged out. Waiting for processes to exit. Jan 23 23:35:05.382946 systemd-logind[1646]: Removed session 24. 
Jan 23 23:35:07.638482 kubelet[2913]: E0123 23:35:07.638369 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:35:09.639115 kubelet[2913]: E0123 23:35:09.639066 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:35:10.638070 kubelet[2913]: E0123 23:35:10.638019 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:35:13.638220 kubelet[2913]: E0123 23:35:13.638151 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:35:13.638590 kubelet[2913]: E0123 23:35:13.638540 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:35:13.639086 kubelet[2913]: E0123 23:35:13.638946 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:35:19.638749 kubelet[2913]: E0123 23:35:19.638660 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:35:22.637674 kubelet[2913]: E0123 23:35:22.637621 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:35:23.637426 kubelet[2913]: E0123 23:35:23.637383 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:35:26.637829 kubelet[2913]: E0123 23:35:26.637738 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7" Jan 23 23:35:27.637443 kubelet[2913]: E0123 23:35:27.637387 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68f445db44-27ndm" podUID="1d648e36-1a1c-46cb-8b04-91769885543b" Jan 23 23:35:28.637925 kubelet[2913]: E0123 23:35:28.637692 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:35:33.638434 containerd[1666]: time="2026-01-23T23:35:33.638391209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 23:35:34.174714 containerd[1666]: time="2026-01-23T23:35:34.174607164Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:34.175951 containerd[1666]: time="2026-01-23T23:35:34.175903328Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 23:35:34.176034 containerd[1666]: time="2026-01-23T23:35:34.175988968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:34.176200 kubelet[2913]: E0123 23:35:34.176162 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:35:34.176477 kubelet[2913]: E0123 23:35:34.176213 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 23:35:34.176477 kubelet[2913]: E0123 23:35:34.176327 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:34.178402 containerd[1666]: time="2026-01-23T23:35:34.178212655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 23:35:34.505984 containerd[1666]: time="2026-01-23T23:35:34.505825454Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:34.507307 containerd[1666]: time="2026-01-23T23:35:34.507263739Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 23:35:34.507510 containerd[1666]: time="2026-01-23T23:35:34.507353859Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:34.507620 kubelet[2913]: E0123 23:35:34.507578 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 23:35:34.507717 kubelet[2913]: E0123 23:35:34.507702 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 
23:35:34.508194 kubelet[2913]: E0123 23:35:34.507917 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6g8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-rfnpf_calico-system(a69a1122-8e77-47a0-ac55-a81fea68c3e7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:34.509612 kubelet[2913]: E0123 23:35:34.509574 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rfnpf" podUID="a69a1122-8e77-47a0-ac55-a81fea68c3e7" Jan 23 23:35:37.075610 kubelet[2913]: E0123 23:35:37.075172 2913 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.88:35734->10.0.10.71:2379: read: connection timed out" event="&Event{ObjectMeta:{goldmane-666569f655-52tqh.188d802e05b20247 calico-system 1739 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:goldmane-666569f655-52tqh,UID:8525930c-a129-42c2-8aaf-49aa89f532c7,APIVersion:v1,ResourceVersion:814,FieldPath:spec.containers{goldmane},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4593-0-0-1-266c03b17e,},FirstTimestamp:2026-01-23 23:32:43 +0000 UTC,LastTimestamp:2026-01-23 23:35:26.637684458 +0000 UTC 
m=+213.398327013,Count:11,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-1-266c03b17e,}" Jan 23 23:35:37.135154 systemd[1]: cri-containerd-a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a.scope: Deactivated successfully. Jan 23 23:35:37.135512 systemd[1]: cri-containerd-a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a.scope: Consumed 38.107s CPU time, 116.1M memory peak. Jan 23 23:35:37.137000 audit: BPF prog-id=256 op=LOAD Jan 23 23:35:37.137402 systemd[1]: cri-containerd-804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106.scope: Deactivated successfully. Jan 23 23:35:37.138690 containerd[1666]: time="2026-01-23T23:35:37.138093642Z" level=info msg="received container exit event container_id:\"a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a\" id:\"a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a\" pid:3245 exit_status:1 exited_at:{seconds:1769211337 nanos:137461880}" Jan 23 23:35:37.137696 systemd[1]: cri-containerd-804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106.scope: Consumed 3.532s CPU time, 63.4M memory peak. 
Jan 23 23:35:37.139111 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 23:35:37.139174 kernel: audit: type=1334 audit(1769211337.137:872): prog-id=256 op=LOAD Jan 23 23:35:37.137000 audit: BPF prog-id=88 op=UNLOAD Jan 23 23:35:37.140573 containerd[1666]: time="2026-01-23T23:35:37.140541290Z" level=info msg="received container exit event container_id:\"804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106\" id:\"804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106\" pid:2747 exit_status:1 exited_at:{seconds:1769211337 nanos:139640927}" Jan 23 23:35:37.141404 kernel: audit: type=1334 audit(1769211337.137:873): prog-id=88 op=UNLOAD Jan 23 23:35:37.142366 kernel: audit: type=1334 audit(1769211337.141:874): prog-id=146 op=UNLOAD Jan 23 23:35:37.141000 audit: BPF prog-id=146 op=UNLOAD Jan 23 23:35:37.141000 audit: BPF prog-id=150 op=UNLOAD Jan 23 23:35:37.144454 kernel: audit: type=1334 audit(1769211337.141:875): prog-id=150 op=UNLOAD Jan 23 23:35:37.145000 audit: BPF prog-id=103 op=UNLOAD Jan 23 23:35:37.148979 kernel: audit: type=1334 audit(1769211337.145:876): prog-id=103 op=UNLOAD Jan 23 23:35:37.149043 kernel: audit: type=1334 audit(1769211337.145:877): prog-id=107 op=UNLOAD Jan 23 23:35:37.145000 audit: BPF prog-id=107 op=UNLOAD Jan 23 23:35:37.163658 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a-rootfs.mount: Deactivated successfully. Jan 23 23:35:37.169774 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106-rootfs.mount: Deactivated successfully. 
Jan 23 23:35:37.267484 kubelet[2913]: E0123 23:35:37.267439 2913 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.88:35922->10.0.10.71:2379: read: connection timed out" Jan 23 23:35:37.271000 audit: BPF prog-id=257 op=LOAD Jan 23 23:35:37.271490 systemd[1]: cri-containerd-feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4.scope: Deactivated successfully. Jan 23 23:35:37.271808 systemd[1]: cri-containerd-feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4.scope: Consumed 3.954s CPU time, 23.9M memory peak. Jan 23 23:35:37.271000 audit: BPF prog-id=83 op=UNLOAD Jan 23 23:35:37.274048 kernel: audit: type=1334 audit(1769211337.271:878): prog-id=257 op=LOAD Jan 23 23:35:37.274121 kernel: audit: type=1334 audit(1769211337.271:879): prog-id=83 op=UNLOAD Jan 23 23:35:37.274254 containerd[1666]: time="2026-01-23T23:35:37.274207617Z" level=info msg="received container exit event container_id:\"feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4\" id:\"feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4\" pid:2740 exit_status:1 exited_at:{seconds:1769211337 nanos:273778656}" Jan 23 23:35:37.278000 audit: BPF prog-id=98 op=UNLOAD Jan 23 23:35:37.278000 audit: BPF prog-id=102 op=UNLOAD Jan 23 23:35:37.280639 kernel: audit: type=1334 audit(1769211337.278:880): prog-id=98 op=UNLOAD Jan 23 23:35:37.280707 kernel: audit: type=1334 audit(1769211337.278:881): prog-id=102 op=UNLOAD Jan 23 23:35:37.295408 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4-rootfs.mount: Deactivated successfully. 
Jan 23 23:35:37.638647 containerd[1666]: time="2026-01-23T23:35:37.638611329Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:35:37.981313 containerd[1666]: time="2026-01-23T23:35:37.981083333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:37.982550 containerd[1666]: time="2026-01-23T23:35:37.982485457Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:35:37.982659 containerd[1666]: time="2026-01-23T23:35:37.982556858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:37.982711 kubelet[2913]: E0123 23:35:37.982672 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:35:37.982806 kubelet[2913]: E0123 23:35:37.982717 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:35:37.982917 kubelet[2913]: E0123 23:35:37.982856 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gkz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-m6qjq_calico-apiserver(97acd778-d005-41f9-8db5-25f87a68c090): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:37.984221 kubelet[2913]: E0123 23:35:37.984173 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-m6qjq" podUID="97acd778-d005-41f9-8db5-25f87a68c090" Jan 23 23:35:38.162809 kubelet[2913]: I0123 23:35:38.162779 2913 scope.go:117] "RemoveContainer" containerID="feca142022b9794d85f573172829aea203b459bdfdb1aaa1502b70e4ebb093f4" Jan 23 23:35:38.164910 kubelet[2913]: I0123 23:35:38.164882 2913 scope.go:117] "RemoveContainer" containerID="804b5ac76c084cd31742d4ea14804bbce8fb7a65afb9157a268209fa5684d106" Jan 23 23:35:38.165872 containerd[1666]: time="2026-01-23T23:35:38.165775096Z" level=info msg="CreateContainer within sandbox \"f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 23 23:35:38.166551 kubelet[2913]: I0123 23:35:38.166530 2913 scope.go:117] "RemoveContainer" containerID="a71d2325e808acb5b9bad868d62afed0bcacfbf57f5e8abd173755c13de0536a" Jan 23 23:35:38.167457 containerd[1666]: time="2026-01-23T23:35:38.167403741Z" level=info msg="CreateContainer within sandbox \"eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 23:35:38.168314 containerd[1666]: time="2026-01-23T23:35:38.168286184Z" level=info msg="CreateContainer within sandbox \"32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 
23 23:35:38.178563 containerd[1666]: time="2026-01-23T23:35:38.178511215Z" level=info msg="Container e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:35:38.182178 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2558442968.mount: Deactivated successfully. Jan 23 23:35:38.188171 containerd[1666]: time="2026-01-23T23:35:38.188133965Z" level=info msg="Container 1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:35:38.188685 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3995725832.mount: Deactivated successfully. Jan 23 23:35:38.190931 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1866356372.mount: Deactivated successfully. Jan 23 23:35:38.196843 containerd[1666]: time="2026-01-23T23:35:38.196809511Z" level=info msg="Container 7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6: CDI devices from CRI Config.CDIDevices: []" Jan 23 23:35:38.197500 containerd[1666]: time="2026-01-23T23:35:38.197440593Z" level=info msg="CreateContainer within sandbox \"f164806d78fd53773c8a4b2a8bb4b970bd026fba7a9ecff2acf9e3e4aea26b95\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd\"" Jan 23 23:35:38.197938 containerd[1666]: time="2026-01-23T23:35:38.197908754Z" level=info msg="StartContainer for \"e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd\"" Jan 23 23:35:38.199178 containerd[1666]: time="2026-01-23T23:35:38.199142078Z" level=info msg="connecting to shim e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd" address="unix:///run/containerd/s/404ace6a350006bd41f85b8f4ed807af7d4637e839837a653813c40a77e9d9d1" protocol=ttrpc version=3 Jan 23 23:35:38.203454 containerd[1666]: time="2026-01-23T23:35:38.203394771Z" level=info msg="CreateContainer within sandbox 
\"eae44b8acca936e6dea9c87741fbe1da2112af6277d2d0f8f5938f2f649ed8f0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27\"" Jan 23 23:35:38.203881 containerd[1666]: time="2026-01-23T23:35:38.203826412Z" level=info msg="StartContainer for \"1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27\"" Jan 23 23:35:38.205285 containerd[1666]: time="2026-01-23T23:35:38.205230817Z" level=info msg="connecting to shim 1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27" address="unix:///run/containerd/s/ee96e29dca3a6a96fe9d8db47cc062d60aa944bcdc6c4ce5ffb2fb721b507d21" protocol=ttrpc version=3 Jan 23 23:35:38.208563 containerd[1666]: time="2026-01-23T23:35:38.208520107Z" level=info msg="CreateContainer within sandbox \"32e4cc316933ce76c03c9abf4eda68b0db2fa044494978941f1af1d65b70b9a4\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6\"" Jan 23 23:35:38.209403 containerd[1666]: time="2026-01-23T23:35:38.209163629Z" level=info msg="StartContainer for \"7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6\"" Jan 23 23:35:38.210141 containerd[1666]: time="2026-01-23T23:35:38.210109912Z" level=info msg="connecting to shim 7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6" address="unix:///run/containerd/s/37c65ef8244f7a2b524bd6a141501db3bb0fd76f4bff8ee4d3dcd7852fc475fa" protocol=ttrpc version=3 Jan 23 23:35:38.224131 systemd[1]: Started cri-containerd-1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27.scope - libcontainer container 1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27. Jan 23 23:35:38.225216 systemd[1]: Started cri-containerd-e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd.scope - libcontainer container e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd. 
Jan 23 23:35:38.228707 systemd[1]: Started cri-containerd-7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6.scope - libcontainer container 7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6. Jan 23 23:35:38.240000 audit: BPF prog-id=258 op=LOAD Jan 23 23:35:38.240000 audit: BPF prog-id=259 op=LOAD Jan 23 23:35:38.240000 audit[5573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.240000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.240000 audit: BPF prog-id=259 op=UNLOAD Jan 23 23:35:38.240000 audit[5573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.240000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.241000 audit: BPF prog-id=260 op=LOAD Jan 23 23:35:38.241000 audit: BPF prog-id=261 op=LOAD Jan 23 23:35:38.241000 audit[5573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.241000 audit: BPF prog-id=262 op=LOAD Jan 23 23:35:38.241000 audit[5573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.241000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.242000 audit: BPF prog-id=262 op=UNLOAD Jan 23 23:35:38.242000 audit[5573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.242000 audit: BPF prog-id=261 op=UNLOAD Jan 23 23:35:38.242000 audit: BPF prog-id=263 op=LOAD Jan 23 23:35:38.242000 audit[5567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2590 pid=5567 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.242000 audit: BPF prog-id=263 op=UNLOAD Jan 23 23:35:38.242000 audit[5567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.242000 audit[5573]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.242000 audit: BPF prog-id=264 op=LOAD Jan 23 23:35:38.242000 audit: BPF prog-id=265 op=LOAD Jan 23 23:35:38.242000 audit[5573]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 
a3=0 items=0 ppid=2592 pid=5573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.242000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161333636326135666434383231363633353731663063636130613933 Jan 23 23:35:38.243000 audit: BPF prog-id=266 op=LOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=266 op=UNLOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=267 op=LOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=268 op=LOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=268 op=UNLOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=267 op=UNLOAD Jan 23 23:35:38.243000 
audit[5581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.243000 audit: BPF prog-id=269 op=LOAD Jan 23 23:35:38.243000 audit[5581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2976 pid=5581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3764353636383033636136623836653034336266666163313633623165 Jan 23 23:35:38.244000 audit: BPF prog-id=270 op=LOAD Jan 23 23:35:38.244000 audit[5567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.244000 audit: BPF 
prog-id=271 op=LOAD Jan 23 23:35:38.244000 audit[5567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.244000 audit: BPF prog-id=271 op=UNLOAD Jan 23 23:35:38.244000 audit[5567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.244000 audit: BPF prog-id=270 op=UNLOAD Jan 23 23:35:38.244000 audit[5567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 
Jan 23 23:35:38.244000 audit: BPF prog-id=272 op=LOAD Jan 23 23:35:38.244000 audit[5567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2590 pid=5567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 23:35:38.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538326661376362633839313831666333663764306437646132336265 Jan 23 23:35:38.278653 containerd[1666]: time="2026-01-23T23:35:38.278607361Z" level=info msg="StartContainer for \"7d566803ca6b86e043bffac163b1e858686e9fd4a9a82a4c26a34d544bd9a3f6\" returns successfully" Jan 23 23:35:38.283562 containerd[1666]: time="2026-01-23T23:35:38.283518576Z" level=info msg="StartContainer for \"e82fa7cbc89181fc3f7d0d7da23be0fa1352ece5c7bffc1a7e773af34203c0bd\" returns successfully" Jan 23 23:35:38.285289 containerd[1666]: time="2026-01-23T23:35:38.285259141Z" level=info msg="StartContainer for \"1a3662a5fd4821663571f0cca0a9311c0659990b993e7423bdabbbb194f75c27\" returns successfully" Jan 23 23:35:38.641016 containerd[1666]: time="2026-01-23T23:35:38.640768745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 23:35:38.968149 containerd[1666]: time="2026-01-23T23:35:38.967889863Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:38.970937 containerd[1666]: time="2026-01-23T23:35:38.970870792Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 23:35:38.971044 containerd[1666]: 
time="2026-01-23T23:35:38.970967032Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:38.971227 kubelet[2913]: E0123 23:35:38.971193 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:35:38.971332 kubelet[2913]: E0123 23:35:38.971316 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 23:35:38.971540 kubelet[2913]: E0123 23:35:38.971499 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-579f6fb948-2l9kd_calico-apiserver(aece08d9-f39f-4025-afe7-4d9ae33375fe): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:38.972835 kubelet[2913]: E0123 23:35:38.972795 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-579f6fb948-2l9kd" podUID="aece08d9-f39f-4025-afe7-4d9ae33375fe" Jan 23 23:35:39.638853 containerd[1666]: time="2026-01-23T23:35:39.638616268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 23:35:39.979336 containerd[1666]: time="2026-01-23T23:35:39.979004186Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:39.980986 containerd[1666]: time="2026-01-23T23:35:39.980892432Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 23:35:39.981167 containerd[1666]: time="2026-01-23T23:35:39.981124273Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:39.981337 kubelet[2913]: E0123 23:35:39.981300 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:35:39.981622 kubelet[2913]: E0123 23:35:39.981347 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 23:35:39.981622 kubelet[2913]: E0123 23:35:39.981449 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:89d4259d6b224a32924e4c2e731fe8b2,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:39.983661 containerd[1666]: time="2026-01-23T23:35:39.983641201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 23:35:40.301389 containerd[1666]: time="2026-01-23T23:35:40.300911728Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:40.303388 containerd[1666]: time="2026-01-23T23:35:40.303318415Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 23:35:40.303456 containerd[1666]: time="2026-01-23T23:35:40.303408896Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:40.303651 kubelet[2913]: E0123 23:35:40.303585 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:35:40.303735 kubelet[2913]: E0123 23:35:40.303654 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 23:35:40.303806 kubelet[2913]: E0123 23:35:40.303771 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x98wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5c66cc49b-5j6h9_calico-system(ba9f6f29-6ccf-4464-bff0-93a0f3e2b483): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:40.305158 kubelet[2913]: E0123 23:35:40.305057 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5c66cc49b-5j6h9" podUID="ba9f6f29-6ccf-4464-bff0-93a0f3e2b483" Jan 23 23:35:40.637738 containerd[1666]: time="2026-01-23T23:35:40.637688795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 23:35:40.961294 containerd[1666]: time="2026-01-23T23:35:40.961073261Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 23:35:40.962691 containerd[1666]: time="2026-01-23T23:35:40.962597666Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 23:35:40.962874 containerd[1666]: time="2026-01-23T23:35:40.962650906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 23:35:40.962915 kubelet[2913]: E0123 23:35:40.962861 2913 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:35:40.962915 kubelet[2913]: E0123 23:35:40.962906 2913 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 23:35:40.963151 kubelet[2913]: E0123 23:35:40.963101 2913 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-52tqh_calico-system(8525930c-a129-42c2-8aaf-49aa89f532c7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 23:35:40.964369 kubelet[2913]: E0123 23:35:40.964301 2913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-52tqh" podUID="8525930c-a129-42c2-8aaf-49aa89f532c7"