Jan 27 23:57:12.453554 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 27 23:57:12.453578 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 27 22:20:26 -00 2026 Jan 27 23:57:12.453588 kernel: KASLR enabled Jan 27 23:57:12.453593 kernel: efi: EFI v2.7 by EDK II Jan 27 23:57:12.453599 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218 Jan 27 23:57:12.453605 kernel: random: crng init done Jan 27 23:57:12.453612 kernel: secureboot: Secure boot disabled Jan 27 23:57:12.453619 kernel: ACPI: Early table checksum verification disabled Jan 27 23:57:12.453625 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS ) Jan 27 23:57:12.453633 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013) Jan 27 23:57:12.453641 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453649 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453656 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453662 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453671 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453678 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453684 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453691 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453698 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453704 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 23:57:12.453711 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013) Jan 27 23:57:12.453718 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 27 23:57:12.453739 kernel: ACPI: Use ACPI SPCR as default console: Yes Jan 27 23:57:12.453748 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff] Jan 27 23:57:12.453755 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff] Jan 27 23:57:12.453762 kernel: Zone ranges: Jan 27 23:57:12.453768 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 27 23:57:12.453774 kernel: DMA32 empty Jan 27 23:57:12.453781 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff] Jan 27 23:57:12.453787 kernel: Device empty Jan 27 23:57:12.453794 kernel: Movable zone start for each node Jan 27 23:57:12.453800 kernel: Early memory node ranges Jan 27 23:57:12.453807 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff] Jan 27 23:57:12.453814 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff] Jan 27 23:57:12.453820 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff] Jan 27 23:57:12.453828 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff] Jan 27 23:57:12.453835 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff] Jan 27 23:57:12.453841 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff] Jan 27 23:57:12.453848 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Jan 27 23:57:12.453855 kernel: psci: probing for conduit method from ACPI. 
Jan 27 23:57:12.453864 kernel: psci: PSCIv1.3 detected in firmware. Jan 27 23:57:12.453872 kernel: psci: Using standard PSCI v0.2 function IDs Jan 27 23:57:12.453879 kernel: psci: Trusted OS migration not required Jan 27 23:57:12.453886 kernel: psci: SMC Calling Convention v1.1 Jan 27 23:57:12.453893 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 27 23:57:12.453900 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Jan 27 23:57:12.453907 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Jan 27 23:57:12.453913 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0 Jan 27 23:57:12.453920 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0 Jan 27 23:57:12.453928 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jan 27 23:57:12.453935 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jan 27 23:57:12.453942 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Jan 27 23:57:12.453949 kernel: Detected PIPT I-cache on CPU0 Jan 27 23:57:12.453956 kernel: CPU features: detected: GIC system register CPU interface Jan 27 23:57:12.453963 kernel: CPU features: detected: Spectre-v4 Jan 27 23:57:12.453970 kernel: CPU features: detected: Spectre-BHB Jan 27 23:57:12.453977 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 27 23:57:12.453984 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 27 23:57:12.453991 kernel: CPU features: detected: ARM erratum 1418040 Jan 27 23:57:12.453998 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 27 23:57:12.454006 kernel: alternatives: applying boot alternatives Jan 27 23:57:12.454014 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=880c7a57ca1a4cf41361128ef304e12abcda0ba85f8697ad932e9820a1865169 Jan 27 23:57:12.454021 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Jan 27 23:57:12.454028 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Jan 27 23:57:12.454035 kernel: Fallback order for Node 0: 0 Jan 27 23:57:12.454042 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304 Jan 27 23:57:12.454048 kernel: Policy zone: Normal Jan 27 23:57:12.454055 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 27 23:57:12.454062 kernel: software IO TLB: area num 4. Jan 27 23:57:12.454069 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Jan 27 23:57:12.454077 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Jan 27 23:57:12.454084 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 27 23:57:12.454092 kernel: rcu: RCU event tracing is enabled. Jan 27 23:57:12.454099 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Jan 27 23:57:12.454106 kernel: Trampoline variant of Tasks RCU enabled. Jan 27 23:57:12.454113 kernel: Tracing variant of Tasks RCU enabled. Jan 27 23:57:12.454120 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 27 23:57:12.454127 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Jan 27 23:57:12.454134 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Jan 27 23:57:12.454141 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Jan 27 23:57:12.454148 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 27 23:57:12.454156 kernel: GICv3: 256 SPIs implemented Jan 27 23:57:12.454163 kernel: GICv3: 0 Extended SPIs implemented Jan 27 23:57:12.454170 kernel: Root IRQ handler: gic_handle_irq Jan 27 23:57:12.454177 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 27 23:57:12.454183 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jan 27 23:57:12.454190 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 27 23:57:12.454197 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 27 23:57:12.454204 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1) Jan 27 23:57:12.454212 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1) Jan 27 23:57:12.454218 kernel: GICv3: using LPI property table @0x0000000100130000 Jan 27 23:57:12.454225 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000 Jan 27 23:57:12.454232 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 27 23:57:12.454240 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 23:57:12.454247 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 27 23:57:12.454254 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 27 23:57:12.454261 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 27 23:57:12.454268 kernel: arm-pv: using stolen time PV Jan 27 23:57:12.454276 kernel: Console: colour dummy device 80x25 Jan 27 23:57:12.454283 kernel: ACPI: Core revision 20240827 Jan 27 23:57:12.454291 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 27 23:57:12.454300 kernel: pid_max: default: 32768 minimum: 301 Jan 27 23:57:12.454307 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 27 23:57:12.454314 kernel: landlock: Up and running. Jan 27 23:57:12.454321 kernel: SELinux: Initializing. Jan 27 23:57:12.454329 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 27 23:57:12.454336 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 27 23:57:12.454344 kernel: rcu: Hierarchical SRCU implementation. Jan 27 23:57:12.454351 kernel: rcu: Max phase no-delay instances is 400. Jan 27 23:57:12.454360 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 27 23:57:12.454367 kernel: Remapping and enabling EFI services. Jan 27 23:57:12.454374 kernel: smp: Bringing up secondary CPUs ... 
Jan 27 23:57:12.454381 kernel: Detected PIPT I-cache on CPU1 Jan 27 23:57:12.454389 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 27 23:57:12.454396 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000 Jan 27 23:57:12.454403 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 23:57:12.454412 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 27 23:57:12.454419 kernel: Detected PIPT I-cache on CPU2 Jan 27 23:57:12.454431 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Jan 27 23:57:12.454440 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000 Jan 27 23:57:12.454448 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 23:57:12.454455 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Jan 27 23:57:12.454462 kernel: Detected PIPT I-cache on CPU3 Jan 27 23:57:12.454470 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Jan 27 23:57:12.454480 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000 Jan 27 23:57:12.454488 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 23:57:12.454495 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Jan 27 23:57:12.454503 kernel: smp: Brought up 1 node, 4 CPUs Jan 27 23:57:12.454510 kernel: SMP: Total of 4 processors activated. Jan 27 23:57:12.454518 kernel: CPU: All CPU(s) started at EL1 Jan 27 23:57:12.454526 kernel: CPU features: detected: 32-bit EL0 Support Jan 27 23:57:12.454534 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 27 23:57:12.454542 kernel: CPU features: detected: Common not Private translations Jan 27 23:57:12.454549 kernel: CPU features: detected: CRC32 instructions Jan 27 23:57:12.454557 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 27 23:57:12.454565 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 27 23:57:12.454572 kernel: CPU features: detected: LSE atomic instructions Jan 27 23:57:12.454581 kernel: CPU features: detected: Privileged Access Never Jan 27 23:57:12.454589 kernel: CPU features: detected: RAS Extension Support Jan 27 23:57:12.454597 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 27 23:57:12.454604 kernel: alternatives: applying system-wide alternatives Jan 27 23:57:12.454612 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Jan 27 23:57:12.454620 kernel: Memory: 16324368K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430064K reserved, 16384K cma-reserved) Jan 27 23:57:12.454628 kernel: devtmpfs: initialized Jan 27 23:57:12.454637 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 27 23:57:12.454644 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Jan 27 23:57:12.454652 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 27 23:57:12.454659 kernel: 0 pages in range for non-PLT usage Jan 27 23:57:12.454667 kernel: 515152 pages in range for PLT usage Jan 27 23:57:12.454674 kernel: pinctrl core: initialized pinctrl subsystem Jan 27 23:57:12.454682 kernel: SMBIOS 3.0.0 present. 
Jan 27 23:57:12.454690 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015 Jan 27 23:57:12.454699 kernel: DMI: Memory slots populated: 1/1 Jan 27 23:57:12.454706 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 27 23:57:12.454714 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations Jan 27 23:57:12.454721 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 27 23:57:12.454735 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 27 23:57:12.454743 kernel: audit: initializing netlink subsys (disabled) Jan 27 23:57:12.454751 kernel: audit: type=2000 audit(0.039:1): state=initialized audit_enabled=0 res=1 Jan 27 23:57:12.454760 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 27 23:57:12.454767 kernel: cpuidle: using governor menu Jan 27 23:57:12.454775 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Jan 27 23:57:12.454783 kernel: ASID allocator initialised with 32768 entries Jan 27 23:57:12.454790 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 27 23:57:12.454798 kernel: Serial: AMBA PL011 UART driver Jan 27 23:57:12.454805 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 27 23:57:12.454814 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 27 23:57:12.454821 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 27 23:57:12.454829 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 27 23:57:12.454836 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 27 23:57:12.454844 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 27 23:57:12.454851 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 27 23:57:12.454859 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 27 23:57:12.454867 kernel: ACPI: Added _OSI(Module Device) Jan 27 23:57:12.454875 kernel: ACPI: Added _OSI(Processor Device) Jan 27 23:57:12.454894 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 27 23:57:12.454902 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 27 23:57:12.454910 kernel: ACPI: Interpreter enabled Jan 27 23:57:12.454917 kernel: ACPI: Using GIC for interrupt routing Jan 27 23:57:12.454925 kernel: ACPI: MCFG table detected, 1 entries Jan 27 23:57:12.454933 kernel: ACPI: CPU0 has been hot-added Jan 27 23:57:12.454942 kernel: ACPI: CPU1 has been hot-added Jan 27 23:57:12.454950 kernel: ACPI: CPU2 has been hot-added Jan 27 23:57:12.454957 kernel: ACPI: CPU3 has been hot-added Jan 27 23:57:12.454965 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 27 23:57:12.454973 kernel: printk: legacy console [ttyAMA0] enabled Jan 27 23:57:12.454980 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 27 23:57:12.455140 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 27 23:57:12.455230 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 27 23:57:12.455326 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 27 23:57:12.455414 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 27 23:57:12.455494 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 27 23:57:12.455504 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 27 23:57:12.455512 
kernel: PCI host bridge to bus 0000:00 Jan 27 23:57:12.455599 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 27 23:57:12.455673 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 27 23:57:12.455768 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 27 23:57:12.455843 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 27 23:57:12.455942 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 27 23:57:12.456035 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.456122 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff] Jan 27 23:57:12.456202 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 27 23:57:12.456281 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff] Jan 27 23:57:12.456360 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 27 23:57:12.456449 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.456532 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff] Jan 27 23:57:12.456612 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 27 23:57:12.456690 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff] Jan 27 23:57:12.456809 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.456894 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff] Jan 27 23:57:12.456977 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 27 23:57:12.457056 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff] Jan 27 23:57:12.457135 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 27 23:57:12.457224 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.457305 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff] Jan 27 23:57:12.457386 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 27 23:57:12.457470 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 27 23:57:12.457556 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.457635 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff] Jan 27 23:57:12.457712 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 27 23:57:12.457805 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff] Jan 27 23:57:12.457884 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 27 23:57:12.457971 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.458050 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff] Jan 27 23:57:12.458128 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 27 23:57:12.458204 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff] Jan 27 23:57:12.458293 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 27 23:57:12.458383 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.458479 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff] Jan 27 23:57:12.458564 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 27 23:57:12.458654 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.458745 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff] Jan 27 23:57:12.458828 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 27 
23:57:12.458935 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.459024 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff] Jan 27 23:57:12.459105 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 27 23:57:12.459191 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.459271 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff] Jan 27 23:57:12.459352 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 27 23:57:12.459437 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.459516 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff] Jan 27 23:57:12.459593 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 27 23:57:12.459684 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.461862 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff] Jan 27 23:57:12.461977 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 27 23:57:12.462089 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.462179 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff] Jan 27 23:57:12.462263 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 27 23:57:12.462352 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.462432 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff] Jan 27 23:57:12.462514 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 27 23:57:12.462599 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.462679 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff] Jan 27 23:57:12.462779 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 27 23:57:12.462868 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.462972 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff] Jan 27 23:57:12.463052 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 27 23:57:12.463139 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.463218 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff] Jan 27 23:57:12.463295 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 27 23:57:12.463382 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.463468 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff] Jan 27 23:57:12.463548 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 27 23:57:12.463629 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff] Jan 27 23:57:12.463747 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff] Jan 27 23:57:12.463840 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.463921 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff] Jan 27 23:57:12.464003 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 27 23:57:12.464081 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff] Jan 27 23:57:12.464159 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff] Jan 27 23:57:12.464246 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.464325 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff] Jan 27 23:57:12.464402 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 27 23:57:12.464481 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff] Jan 27 23:57:12.464558 kernel: pci 0000:00:03.3: bridge window [mem 
0x11a00000-0x11bfffff] Jan 27 23:57:12.464647 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.464753 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff] Jan 27 23:57:12.464841 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 27 23:57:12.464920 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff] Jan 27 23:57:12.465001 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff] Jan 27 23:57:12.465088 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.465167 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff] Jan 27 23:57:12.465246 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 27 23:57:12.465323 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff] Jan 27 23:57:12.465401 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff] Jan 27 23:57:12.465487 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.465566 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff] Jan 27 23:57:12.465644 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 27 23:57:12.465722 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff] Jan 27 23:57:12.465824 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff] Jan 27 23:57:12.465911 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.465992 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff] Jan 27 23:57:12.466108 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 27 23:57:12.466189 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff] Jan 27 23:57:12.466268 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff] Jan 27 23:57:12.466353 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.466432 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff] Jan 27 23:57:12.466513 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 27 23:57:12.466590 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff] Jan 27 23:57:12.466668 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff] Jan 27 23:57:12.466823 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.466928 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff] Jan 27 23:57:12.467011 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 27 23:57:12.467093 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff] Jan 27 23:57:12.467171 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 27 23:57:12.467257 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.467336 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff] Jan 27 23:57:12.467413 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 27 23:57:12.467490 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff] Jan 27 23:57:12.467570 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 27 23:57:12.467655 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.467745 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff] Jan 27 23:57:12.467829 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 27 23:57:12.467909 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff] Jan 27 23:57:12.467987 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 27 23:57:12.468076 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 
23:57:12.468156 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff] Jan 27 23:57:12.468233 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 27 23:57:12.468312 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff] Jan 27 23:57:12.468390 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff] Jan 27 23:57:12.468475 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.468554 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff] Jan 27 23:57:12.468632 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 27 23:57:12.468710 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff] Jan 27 23:57:12.468801 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff] Jan 27 23:57:12.468893 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.468972 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff] Jan 27 23:57:12.469050 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 27 23:57:12.469127 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff] Jan 27 23:57:12.469204 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff] Jan 27 23:57:12.469292 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.469370 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff] Jan 27 23:57:12.469447 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 27 23:57:12.469524 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff] Jan 27 23:57:12.469601 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff] Jan 27 23:57:12.469686 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 23:57:12.469778 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff] Jan 27 23:57:12.469857 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 27 23:57:12.469935 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff] Jan 27 23:57:12.470012 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff] Jan 27 23:57:12.470100 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 27 23:57:12.470181 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff] Jan 27 23:57:12.470264 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 27 23:57:12.470344 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 27 23:57:12.470435 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 27 23:57:12.470518 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit] Jan 27 23:57:12.470608 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 27 23:57:12.470691 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff] Jan 27 23:57:12.470787 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 27 23:57:12.470893 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 27 23:57:12.470982 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 27 23:57:12.471077 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 27 23:57:12.471167 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff] Jan 27 23:57:12.471252 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 27 23:57:12.471341 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint Jan 27 23:57:12.471425 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff] Jan 27 23:57:12.471506 kernel: pci 
0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 27 23:57:12.471588 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 27 23:57:12.471671 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 27 23:57:12.471769 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 27 23:57:12.471858 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 27 23:57:12.471938 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 27 23:57:12.472021 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 27 23:57:12.472105 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 27 23:57:12.472185 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 27 23:57:12.472265 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 27 23:57:12.472347 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 27 23:57:12.472432 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 27 23:57:12.472515 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 27 23:57:12.472600 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 27 23:57:12.472699 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 27 23:57:12.472809 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 27 23:57:12.472896 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 27 23:57:12.472979 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 27 23:57:12.473058 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 27 23:57:12.473139 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 27 23:57:12.473218 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Jan 27 23:57:12.473296 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Jan 27 23:57:12.473379 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 27 23:57:12.473460 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 27 23:57:12.473539 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 27 23:57:12.473623 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 27 23:57:12.473704 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 27 23:57:12.473828 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] 
add_size 200000 add_align 100000 Jan 27 23:57:12.473922 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 27 23:57:12.474005 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Jan 27 23:57:12.474084 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Jan 27 23:57:12.474167 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 27 23:57:12.474247 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 27 23:57:12.474325 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Jan 27 23:57:12.474411 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 27 23:57:12.474507 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Jan 27 23:57:12.474586 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Jan 27 23:57:12.474670 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 27 23:57:12.474760 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Jan 27 23:57:12.474841 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Jan 27 23:57:12.474949 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 27 23:57:12.475037 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 27 23:57:12.475118 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 27 23:57:12.475204 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 27 23:57:12.475286 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 27 23:57:12.475372 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 27 23:57:12.475477 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 27 23:57:12.475557 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 27 23:57:12.475638 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 27 23:57:12.475720 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 27 23:57:12.475845 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 27 23:57:12.475928 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 27 23:57:12.476023 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 27 23:57:12.476107 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 27 23:57:12.476187 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 
Jan 27 23:57:12.476272 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 27 23:57:12.476354 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 27 23:57:12.476438 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 27 23:57:12.476526 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 27 23:57:12.476609 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 27 23:57:12.476689 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 27 23:57:12.476781 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 27 23:57:12.476866 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 27 23:57:12.476945 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 27 23:57:12.477028 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 27 23:57:12.477126 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 27 23:57:12.477205 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 27 23:57:12.477286 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 27 23:57:12.477369 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 27 23:57:12.477448 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 27 23:57:12.477530 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 27 23:57:12.477609 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 27 23:57:12.477690 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 27 23:57:12.477807 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 27 23:57:12.477890 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 27 23:57:12.477968 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 27 23:57:12.478052 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 27 23:57:12.478131 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 27 23:57:12.478209 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 27 23:57:12.478293 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 27 23:57:12.478373 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 27 23:57:12.478450 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 27 23:57:12.478532 kernel: pci 
0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 27 23:57:12.478612 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 27 23:57:12.478692 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 27 23:57:12.478784 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 27 23:57:12.478864 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 27 23:57:12.478961 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 27 23:57:12.479046 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 27 23:57:12.479125 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 27 23:57:12.479207 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 27 23:57:12.479290 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 27 23:57:12.479369 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 27 23:57:12.479447 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 27 23:57:12.479530 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 27 23:57:12.479609 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 27 23:57:12.479689 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 27 23:57:12.479784 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 27 23:57:12.479865 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 27 23:57:12.479943 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 27 23:57:12.480024 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 27 23:57:12.480105 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 27 23:57:12.480186 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 27 23:57:12.480264 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 27 23:57:12.480345 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 27 23:57:12.480424 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 27 23:57:12.480506 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 27 23:57:12.480585 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 27 23:57:12.480666 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 27 23:57:12.480754 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 27 23:57:12.480834 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 27 23:57:12.480913 kernel: pci 0000:00:01.5: bridge 
window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 27 23:57:12.480993 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 27 23:57:12.481070 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 27 23:57:12.481152 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 27 23:57:12.481230 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 27 23:57:12.481310 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 27 23:57:12.481388 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 27 23:57:12.481468 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 27 23:57:12.481549 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Jan 27 23:57:12.481630 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 27 23:57:12.481709 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 27 23:57:12.481798 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 27 23:57:12.481877 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 27 23:57:12.481958 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 27 23:57:12.482037 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 27 23:57:12.482116 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 27 23:57:12.482196 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 27 23:57:12.482277 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 27 23:57:12.482355 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 27 23:57:12.482435 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 27 23:57:12.482512 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 27 23:57:12.482591 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 27 23:57:12.482672 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 27 23:57:12.482764 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 27 23:57:12.482844 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 27 23:57:12.482941 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 27 23:57:12.483023 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 27 23:57:12.483103 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 27 23:57:12.483184 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 27 23:57:12.483263 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 27 23:57:12.483341 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 27 23:57:12.483421 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 27 23:57:12.483499 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 27 23:57:12.483578 
kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 27 23:57:12.483655 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 27 23:57:12.483748 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 27 23:57:12.483829 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 27 23:57:12.483909 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 27 23:57:12.483987 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 27 23:57:12.484065 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 27 23:57:12.484143 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 27 23:57:12.484226 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 27 23:57:12.484304 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 27 23:57:12.484385 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 27 23:57:12.484463 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 27 23:57:12.484543 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Jan 27 23:57:12.484621 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 27 23:57:12.484702 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 27 23:57:12.484789 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 27 23:57:12.484871 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 27 23:57:12.484950 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 27 23:57:12.485030 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 27 23:57:12.485108 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 27 23:57:12.485187 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 27 23:57:12.485269 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 27 23:57:12.485349 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 27 23:57:12.485427 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 27 23:57:12.485507 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 27 23:57:12.485585 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 27 23:57:12.485664 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 27 23:57:12.485752 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 27 23:57:12.485835 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 27 23:57:12.485914 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 27 23:57:12.485993 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 27 23:57:12.486070 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 27 23:57:12.486151 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Jan 27 23:57:12.486228 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 27 23:57:12.486310 kernel: pci 0000:00:01.6: BAR 0 [mem 
0x14206000-0x14206fff]: assigned Jan 27 23:57:12.486389 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 27 23:57:12.486468 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 27 23:57:12.486546 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 27 23:57:12.486626 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 27 23:57:12.486704 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 27 23:57:12.486795 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 27 23:57:12.486875 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 27 23:57:12.486976 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 27 23:57:12.487058 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 27 23:57:12.487138 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 27 23:57:12.487217 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 27 23:57:12.487296 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 27 23:57:12.487378 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 27 23:57:12.487457 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 27 23:57:12.487536 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 27 23:57:12.487616 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 27 23:57:12.487699 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 27 23:57:12.487796 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 27 23:57:12.487877 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.487956 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.488036 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 27 23:57:12.488114 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.488195 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.488274 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 27 23:57:12.488353 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.488431 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.488510 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 27 23:57:12.488589 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.488668 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.488756 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 27 23:57:12.488836 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.488915 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.488998 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 27 23:57:12.489077 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.489155 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.489237 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 27 23:57:12.489315 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't 
assign; no space Jan 27 23:57:12.489392 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.489471 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 27 23:57:12.489549 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.489626 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.489708 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 27 23:57:12.489795 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.489873 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.489952 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 27 23:57:12.490030 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.490108 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.490188 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 27 23:57:12.490268 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.490345 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.490425 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 27 23:57:12.490504 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.490582 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.490664 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 27 23:57:12.490752 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.490831 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.490924 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 27 23:57:12.491005 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.491083 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.491163 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Jan 27 23:57:12.491244 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.491322 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.491403 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 27 23:57:12.491481 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.491560 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.491640 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 27 23:57:12.491718 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.491812 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.491892 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 27 23:57:12.491970 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.492049 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.492127 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 27 23:57:12.492205 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 27 
23:57:12.492286 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 27 23:57:12.492366 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 27 23:57:12.492445 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 27 23:57:12.492525 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 27 23:57:12.492603 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 27 23:57:12.492682 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 27 23:57:12.492774 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 27 23:57:12.492889 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Jan 27 23:57:12.492977 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 27 23:57:12.493061 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 27 23:57:12.493147 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 27 23:57:12.493361 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 27 23:57:12.493445 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 27 23:57:12.493527 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.493606 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.493734 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.493828 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.493910 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.493989 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.494954 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495061 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.495143 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495233 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.495314 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495392 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.495471 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495549 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.495629 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495710 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.495837 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.495920 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496002 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496080 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496161 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496244 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496324 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496403 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496484 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496562 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496649 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496742 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.496852 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.496935 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.497017 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.497096 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.497181 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.497262 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.497361 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 23:57:12.497440 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 27 23:57:12.497528 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 27 23:57:12.497611 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 27 23:57:12.497695 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 27 23:57:12.497811 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 27 23:57:12.497895 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 27 23:57:12.497979 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 27 23:57:12.498068 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 27 23:57:12.498167 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 27 23:57:12.498253 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 27 23:57:12.498334 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 27 23:57:12.498419 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 27 23:57:12.498504 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 27 23:57:12.498583 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 27 23:57:12.498661 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 27 23:57:12.498815 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 27 23:57:12.498934 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 27 23:57:12.499021 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 27 23:57:12.499102 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 27 23:57:12.499225 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 27 23:57:12.499315 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 27 23:57:12.499403 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 27 23:57:12.499484 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 27 23:57:12.499564 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 27 23:57:12.499643 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 27 23:57:12.499756 
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 27 23:57:12.499846 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 27 23:57:12.499930 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 27 23:57:12.500011 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 27 23:57:12.500090 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 27 23:57:12.500172 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 27 23:57:12.500251 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 27 23:57:12.500332 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 27 23:57:12.500413 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 27 23:57:12.500493 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 27 23:57:12.500573 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 27 23:57:12.500654 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 27 23:57:12.500751 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 27 23:57:12.500838 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 27 23:57:12.500924 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 27 23:57:12.501023 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 27 23:57:12.501105 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 27 23:57:12.501187 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 27 23:57:12.501266 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 27 23:57:12.501348 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 27 23:57:12.501427 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 27 23:57:12.501506 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 27 23:57:12.501584 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 27 23:57:12.501664 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 27 23:57:12.501754 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 27 23:57:12.501844 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 27 23:57:12.501925 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 27 23:57:12.502005 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 27 23:57:12.502083 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 27 23:57:12.502164 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 27 23:57:12.502245 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 27 23:57:12.502326 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 27 23:57:12.502405 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 27 23:57:12.502487 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 27 23:57:12.502566 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 27 23:57:12.502647 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 27 23:57:12.502734 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 27 23:57:12.502814 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 27 23:57:12.502907 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 27 23:57:12.502992 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 27 23:57:12.503071 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 27 23:57:12.503154 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 27 23:57:12.503232 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 27 23:57:12.503311 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 27 23:57:12.503389 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 27 23:57:12.503470 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 27 23:57:12.503548 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 27 23:57:12.503628 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 27 23:57:12.503707 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 27 23:57:12.503801 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 27 23:57:12.503884 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 27 23:57:12.503965 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 27 23:57:12.504045 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 27 23:57:12.504126 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 27 23:57:12.504208 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 27 23:57:12.504286 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 27 23:57:12.504366 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 27 23:57:12.504447 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 27 23:57:12.504528 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 27 23:57:12.504607 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 27 23:57:12.504687 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 27 23:57:12.504792 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 27 23:57:12.504877 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 27 23:57:12.504958 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 27 23:57:12.505038 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 27 23:57:12.505119 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 27 23:57:12.505199 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 27 23:57:12.505281 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 27 23:57:12.505366 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 27 23:57:12.505450 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 27 23:57:12.505529 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 27 23:57:12.505611 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 27 23:57:12.505737 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 27 23:57:12.505825 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 27 23:57:12.505909 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 27 23:57:12.505992 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 27 23:57:12.506071 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 27 23:57:12.506152 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 27 23:57:12.506231 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 27 23:57:12.506309 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 27 
23:57:12.506389 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 27 23:57:12.506474 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 27 23:57:12.506555 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 27 23:57:12.506633 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 27 23:57:12.506711 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Jan 27 23:57:12.506802 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 27 23:57:12.506919 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 27 23:57:12.507013 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 27 23:57:12.507095 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 27 23:57:12.507178 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 27 23:57:12.507260 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 27 23:57:12.507339 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 27 23:57:12.507418 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 27 23:57:12.507501 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 27 23:57:12.507583 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 27 23:57:12.507662 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 27 23:57:12.507759 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 27 23:57:12.507850 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 27 23:57:12.507930 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 27 23:57:12.508009 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 27 23:57:12.508089 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 27 23:57:12.508174 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 27 23:57:12.508245 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 27 23:57:12.508316 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 27 23:57:12.508409 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 27 23:57:12.508485 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 27 23:57:12.508568 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 27 23:57:12.508641 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 27 23:57:12.508721 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 27 23:57:12.508815 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 27 23:57:12.508921 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 27 23:57:12.508998 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 27 23:57:12.509082 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 27 23:57:12.509156 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 27 23:57:12.509239 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 27 23:57:12.509315 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 27 23:57:12.509397 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 27 23:57:12.509475 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 27 23:57:12.509557 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 27 
23:57:12.509634 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 27 23:57:12.509716 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 27 23:57:12.509809 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 27 23:57:12.509902 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 27 23:57:12.509978 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 27 23:57:12.510059 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 27 23:57:12.510133 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 27 23:57:12.510212 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 27 23:57:12.510289 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 27 23:57:12.510369 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 27 23:57:12.510443 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 27 23:57:12.510525 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 27 23:57:12.510598 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 27 23:57:12.510681 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 27 23:57:12.510765 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 27 23:57:12.510845 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 27 23:57:12.510938 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 27 23:57:12.511025 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 27 23:57:12.511100 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 27 23:57:12.511188 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 27 23:57:12.511282 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 27 23:57:12.511367 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 27 23:57:12.511443 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 27 23:57:12.511523 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 27 23:57:12.511604 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 27 23:57:12.511680 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 27 23:57:12.511774 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 27 23:57:12.511859 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 27 23:57:12.511934 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 27 23:57:12.512010 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 27 23:57:12.512092 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 27 23:57:12.512166 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 27 23:57:12.512240 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 27 23:57:12.512369 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 27 23:57:12.512450 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 27 23:57:12.512524 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 27 23:57:12.512604 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 27 23:57:12.512678 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 27 23:57:12.512764 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Jan 27 23:57:12.512850 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 27 23:57:12.512928 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 27 23:57:12.513004 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 27 23:57:12.513084 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 27 23:57:12.513158 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 27 23:57:12.513230 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 27 23:57:12.513318 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 27 23:57:12.513394 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 27 23:57:12.513469 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 27 23:57:12.513551 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 27 23:57:12.513623 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 27 23:57:12.513696 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 27 23:57:12.513795 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 27 23:57:12.513870 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 27 23:57:12.513942 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Jan 27 23:57:12.514020 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 27 23:57:12.514093 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 27 23:57:12.514165 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 27 23:57:12.514246 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 27 23:57:12.514320 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 27 23:57:12.514394 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 27 23:57:12.514474 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 27 23:57:12.514547 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 27 23:57:12.514619 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 27 23:57:12.514700 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 27 23:57:12.514787 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 27 23:57:12.514861 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 27 23:57:12.514872 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 27 23:57:12.514889 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 27 23:57:12.514898 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 27 23:57:12.514909 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 27 23:57:12.514917 kernel: iommu: Default domain type: Translated Jan 27 23:57:12.514926 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 27 23:57:12.514934 kernel: efivars: Registered efivars operations Jan 27 23:57:12.514942 kernel: vgaarb: loaded Jan 27 23:57:12.514950 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 27 23:57:12.514958 kernel: VFS: Disk quotas dquot_6.6.0 Jan 27 23:57:12.514968 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 27 23:57:12.514976 kernel: pnp: PnP ACPI init Jan 27 23:57:12.515082 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 27 23:57:12.515095 kernel: pnp: PnP ACPI: found 1 devices Jan 27 23:57:12.515103 kernel: NET: Registered 
PF_INET protocol family Jan 27 23:57:12.515111 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 27 23:57:12.515119 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 27 23:57:12.515129 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 27 23:57:12.515138 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 27 23:57:12.515146 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 27 23:57:12.515154 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 27 23:57:12.515162 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 27 23:57:12.515170 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 27 23:57:12.515178 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 27 23:57:12.515269 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 27 23:57:12.515281 kernel: PCI: CLS 0 bytes, default 64 Jan 27 23:57:12.515289 kernel: kvm [1]: HYP mode not available Jan 27 23:57:12.515297 kernel: Initialise system trusted keyrings Jan 27 23:57:12.515305 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 27 23:57:12.515314 kernel: Key type asymmetric registered Jan 27 23:57:12.515321 kernel: Asymmetric key parser 'x509' registered Jan 27 23:57:12.515331 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 27 23:57:12.515339 kernel: io scheduler mq-deadline registered Jan 27 23:57:12.515347 kernel: io scheduler kyber registered Jan 27 23:57:12.515356 kernel: io scheduler bfq registered Jan 27 23:57:12.515364 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 27 23:57:12.515446 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 27 23:57:12.515527 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 27 23:57:12.515608 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.515690 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 27 23:57:12.515787 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 27 23:57:12.515868 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.515949 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 27 23:57:12.516029 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 27 23:57:12.516110 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.516193 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 27 23:57:12.516272 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 27 23:57:12.516350 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.516430 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 27 23:57:12.516509 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 27 23:57:12.516592 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.516673 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 27 23:57:12.516768 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Jan 27 23:57:12.516849 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.516930 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 27 23:57:12.517009 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 27 23:57:12.517086 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.517170 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 27 23:57:12.517248 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 27 23:57:12.517326 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.517337 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 27 23:57:12.517416 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 27 23:57:12.517525 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 27 23:57:12.517610 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.517692 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 27 23:57:12.517781 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 27 23:57:12.517861 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.517943 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 27 23:57:12.518022 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 27 23:57:12.518103 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.518184 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 27 23:57:12.518270 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 27 23:57:12.518348 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.518429 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 27 23:57:12.518509 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 27 23:57:12.518588 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.518671 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 27 23:57:12.518766 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 27 23:57:12.518850 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.518956 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 27 23:57:12.519039 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 27 23:57:12.519117 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.519203 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 27 23:57:12.519284 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 27 23:57:12.519362 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Jan 27 23:57:12.519373 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 27 23:57:12.519455 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 27 23:57:12.519539 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 27 23:57:12.519621 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.519703 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 27 23:57:12.519798 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 27 23:57:12.519879 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.519962 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 27 23:57:12.520042 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 27 23:57:12.520124 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.520210 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 27 23:57:12.520289 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 27 23:57:12.520368 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.520449 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 27 23:57:12.520529 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 27 23:57:12.520608 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.520693 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 27 23:57:12.520783 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 27 23:57:12.520867 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.520951 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 27 23:57:12.521033 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 27 23:57:12.521133 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.521220 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 27 23:57:12.521302 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 27 23:57:12.521381 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.521392 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 27 23:57:12.521471 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 27 23:57:12.521549 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 27 23:57:12.521628 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.521712 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 27 23:57:12.521800 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 27 23:57:12.521879 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.521960 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 27 
23:57:12.522040 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 27 23:57:12.522118 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.522202 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 27 23:57:12.522290 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 27 23:57:12.522373 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.522474 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 27 23:57:12.522556 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 27 23:57:12.522635 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.522721 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 27 23:57:12.522816 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 27 23:57:12.522928 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.523016 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 27 23:57:12.523096 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 27 23:57:12.523188 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.523278 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 27 23:57:12.523362 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 27 23:57:12.523440 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.523523 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 27 23:57:12.523603 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 27 23:57:12.523685 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 23:57:12.523696 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 27 23:57:12.523706 kernel: ACPI: button: Power Button [PWRB] Jan 27 23:57:12.523803 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 27 23:57:12.523892 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 27 23:57:12.523903 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 27 23:57:12.523912 kernel: thunder_xcv, ver 1.0 Jan 27 23:57:12.523920 kernel: thunder_bgx, ver 1.0 Jan 27 23:57:12.523928 kernel: nicpf, ver 1.0 Jan 27 23:57:12.523938 kernel: nicvf, ver 1.0 Jan 27 23:57:12.524033 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 27 23:57:12.524111 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-27T23:57:11 UTC (1769558231) Jan 27 23:57:12.524122 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 27 23:57:12.524130 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 27 23:57:12.524139 kernel: watchdog: NMI not fully supported Jan 27 23:57:12.524149 kernel: watchdog: Hard watchdog permanently disabled Jan 27 23:57:12.524157 kernel: NET: Registered PF_INET6 protocol family Jan 27 23:57:12.524165 kernel: Segment Routing with IPv6 Jan 27 23:57:12.524173 kernel: In-situ OAM (IOAM) with 
IPv6 Jan 27 23:57:12.524181 kernel: NET: Registered PF_PACKET protocol family Jan 27 23:57:12.524189 kernel: Key type dns_resolver registered Jan 27 23:57:12.524197 kernel: registered taskstats version 1 Jan 27 23:57:12.524206 kernel: Loading compiled-in X.509 certificates Jan 27 23:57:12.524215 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 9b9d0a6e8555c4a74bcb93286e875e2244e1db21' Jan 27 23:57:12.524223 kernel: Demotion targets for Node 0: null Jan 27 23:57:12.524231 kernel: Key type .fscrypt registered Jan 27 23:57:12.524239 kernel: Key type fscrypt-provisioning registered Jan 27 23:57:12.524247 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 27 23:57:12.524255 kernel: ima: Allocated hash algorithm: sha1 Jan 27 23:57:12.524263 kernel: ima: No architecture policies found Jan 27 23:57:12.524273 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 27 23:57:12.524281 kernel: clk: Disabling unused clocks Jan 27 23:57:12.524289 kernel: PM: genpd: Disabling unused power domains Jan 27 23:57:12.524297 kernel: Freeing unused kernel memory: 12480K Jan 27 23:57:12.524306 kernel: Run /init as init process Jan 27 23:57:12.524314 kernel: with arguments: Jan 27 23:57:12.524322 kernel: /init Jan 27 23:57:12.524331 kernel: with environment: Jan 27 23:57:12.524339 kernel: HOME=/ Jan 27 23:57:12.524347 kernel: TERM=linux Jan 27 23:57:12.524355 kernel: ACPI: bus type USB registered Jan 27 23:57:12.524363 kernel: usbcore: registered new interface driver usbfs Jan 27 23:57:12.524371 kernel: usbcore: registered new interface driver hub Jan 27 23:57:12.524379 kernel: usbcore: registered new device driver usb Jan 27 23:57:12.524466 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 27 23:57:12.524550 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 27 23:57:12.524634 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 27 23:57:12.524715 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 27 23:57:12.524831 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 27 23:57:12.524914 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 27 23:57:12.525021 kernel: hub 1-0:1.0: USB hub found Jan 27 23:57:12.525121 kernel: hub 1-0:1.0: 4 ports detected Jan 27 23:57:12.525228 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 27 23:57:12.525346 kernel: hub 2-0:1.0: USB hub found Jan 27 23:57:12.525434 kernel: hub 2-0:1.0: 4 ports detected Jan 27 23:57:12.525529 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 27 23:57:12.525613 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 27 23:57:12.525624 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 27 23:57:12.525633 kernel: GPT:25804799 != 104857599 Jan 27 23:57:12.525641 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 27 23:57:12.525650 kernel: GPT:25804799 != 104857599 Jan 27 23:57:12.525658 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 27 23:57:12.525667 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 27 23:57:12.525676 kernel: SCSI subsystem initialized Jan 27 23:57:12.525685 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 27 23:57:12.525693 kernel: device-mapper: uevent: version 1.0.3 Jan 27 23:57:12.525702 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 27 23:57:12.525710 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 27 23:57:12.525721 kernel: raid6: neonx8 gen() 15766 MB/s Jan 27 23:57:12.525751 kernel: raid6: neonx4 gen() 15703 MB/s Jan 27 23:57:12.525760 kernel: raid6: neonx2 gen() 13271 MB/s Jan 27 23:57:12.525768 kernel: raid6: neonx1 gen() 10445 MB/s Jan 27 23:57:12.525776 kernel: raid6: int64x8 gen() 6824 MB/s Jan 27 23:57:12.525785 kernel: raid6: int64x4 gen() 7346 MB/s Jan 27 23:57:12.525793 kernel: raid6: int64x2 gen() 6117 MB/s Jan 27 23:57:12.525802 kernel: raid6: int64x1 gen() 5059 MB/s Jan 27 23:57:12.525812 kernel: raid6: using algorithm neonx8 gen() 15766 MB/s Jan 27 23:57:12.525820 kernel: raid6: .... xor() 12054 MB/s, rmw enabled Jan 27 23:57:12.525829 kernel: raid6: using neon recovery algorithm Jan 27 23:57:12.525838 kernel: xor: measuring software checksum speed Jan 27 23:57:12.525848 kernel: 8regs : 21584 MB/sec Jan 27 23:57:12.525856 kernel: 32regs : 21670 MB/sec Jan 27 23:57:12.525866 kernel: arm64_neon : 28099 MB/sec Jan 27 23:57:12.525875 kernel: xor: using function: arm64_neon (28099 MB/sec) Jan 27 23:57:12.525883 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 27 23:57:12.526001 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 27 23:57:12.526015 kernel: BTRFS: device fsid f7176ebb-63b5-458d-bfa0-a0dcd6bb053d devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (275) Jan 27 23:57:12.526024 kernel: BTRFS info (device dm-0): first mount of filesystem f7176ebb-63b5-458d-bfa0-a0dcd6bb053d Jan 27 23:57:12.526033 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 27 23:57:12.526043 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 27 23:57:12.526052 kernel: BTRFS info (device dm-0): enabling free space tree Jan 27 23:57:12.526060 kernel: loop: module loaded Jan 27 23:57:12.526069 kernel: loop0: detected capacity change from 0 to 91832 Jan 27 23:57:12.526077 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 27 23:57:12.526177 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 27 23:57:12.526192 systemd[1]: Successfully made /usr/ read-only. Jan 27 23:57:12.526204 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 27 23:57:12.526213 systemd[1]: Detected virtualization kvm. Jan 27 23:57:12.526222 systemd[1]: Detected architecture arm64. Jan 27 23:57:12.526230 systemd[1]: Running in initrd. Jan 27 23:57:12.526239 systemd[1]: No hostname configured, using default hostname. Jan 27 23:57:12.526249 systemd[1]: Hostname set to <localhost>. Jan 27 23:57:12.526258 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 27 23:57:12.526266 systemd[1]: Queued start job for default target initrd.target. Jan 27 23:57:12.526276 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 27 23:57:12.526284 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 27 23:57:12.526293 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 23:57:12.526304 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 27 23:57:12.526313 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 27 23:57:12.526323 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 27 23:57:12.526332 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 27 23:57:12.526341 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 23:57:12.526350 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 27 23:57:12.526360 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 27 23:57:12.526369 systemd[1]: Reached target paths.target - Path Units. Jan 27 23:57:12.526377 systemd[1]: Reached target slices.target - Slice Units. Jan 27 23:57:12.526386 systemd[1]: Reached target swap.target - Swaps. Jan 27 23:57:12.526395 systemd[1]: Reached target timers.target - Timer Units. Jan 27 23:57:12.526404 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 27 23:57:12.526412 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 27 23:57:12.526422 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 27 23:57:12.526431 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 27 23:57:12.526440 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 27 23:57:12.526449 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 27 23:57:12.526458 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 27 23:57:12.526467 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 23:57:12.526477 systemd[1]: Reached target sockets.target - Socket Units. Jan 27 23:57:12.526486 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 27 23:57:12.526495 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 27 23:57:12.526504 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 27 23:57:12.526513 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 27 23:57:12.526522 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 27 23:57:12.526530 systemd[1]: Starting systemd-fsck-usr.service... Jan 27 23:57:12.526548 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 27 23:57:12.526557 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 27 23:57:12.526566 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 23:57:12.526575 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 27 23:57:12.526586 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 23:57:12.526595 systemd[1]: Finished systemd-fsck-usr.service. Jan 27 23:57:12.526604 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Jan 27 23:57:12.526613 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 27 23:57:12.526644 systemd-journald[419]: Collecting audit messages is enabled. Jan 27 23:57:12.526667 kernel: Bridge firewalling registered Jan 27 23:57:12.526676 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 27 23:57:12.526686 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 27 23:57:12.526695 kernel: audit: type=1130 audit(1769558232.457:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526707 kernel: audit: type=1130 audit(1769558232.462:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526716 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 23:57:12.526751 kernel: audit: type=1130 audit(1769558232.467:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526761 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 27 23:57:12.526770 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 27 23:57:12.526779 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 27 23:57:12.526790 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 23:57:12.526800 kernel: audit: type=1130 audit(1769558232.508:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526809 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 27 23:57:12.526819 kernel: audit: type=1130 audit(1769558232.513:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526829 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 27 23:57:12.526838 kernel: audit: type=1130 audit(1769558232.517:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.526848 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 27 23:57:12.526857 kernel: audit: type=1334 audit(1769558232.522:8): prog-id=6 op=LOAD Jan 27 23:57:12.526866 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 27 23:57:12.526875 systemd-journald[419]: Journal started Jan 27 23:57:12.526905 systemd-journald[419]: Runtime Journal (/run/log/journal/90db4509c1ad4277a85eeedb6c529ce8) is 8M, max 319.5M, 311.5M free. Jan 27 23:57:12.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:12.462000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.467000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.517000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.522000 audit: BPF prog-id=6 op=LOAD Jan 27 23:57:12.453595 systemd-modules-load[420]: Inserted module 'br_netfilter' Jan 27 23:57:12.546393 systemd[1]: Started systemd-journald.service - Journal Service. Jan 27 23:57:12.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.549795 kernel: audit: type=1130 audit(1769558232.545:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.551216 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 27 23:57:12.555415 dracut-cmdline[444]: dracut-109 Jan 27 23:57:12.559752 dracut-cmdline[444]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=880c7a57ca1a4cf41361128ef304e12abcda0ba85f8697ad932e9820a1865169 Jan 27 23:57:12.561506 systemd-tmpfiles[459]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 27 23:57:12.570000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.569817 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 23:57:12.574830 kernel: audit: type=1130 audit(1769558232.570:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.585083 systemd-resolved[445]: Positive Trust Anchors: Jan 27 23:57:12.585105 systemd-resolved[445]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 27 23:57:12.585108 systemd-resolved[445]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 27 23:57:12.585140 systemd-resolved[445]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 27 23:57:12.611655 systemd-resolved[445]: Defaulting to hostname 'linux'. Jan 27 23:57:12.612000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.612602 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 27 23:57:12.613755 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 27 23:57:12.649757 kernel: Loading iSCSI transport class v2.0-870. Jan 27 23:57:12.662760 kernel: iscsi: registered transport (tcp) Jan 27 23:57:12.676846 kernel: iscsi: registered transport (qla4xxx) Jan 27 23:57:12.676909 kernel: QLogic iSCSI HBA Driver Jan 27 23:57:12.699247 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 27 23:57:12.728836 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 23:57:12.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.731029 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 27 23:57:12.776662 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 27 23:57:12.777000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.779205 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 27 23:57:12.780804 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 27 23:57:12.808999 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 27 23:57:12.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.810000 audit: BPF prog-id=7 op=LOAD Jan 27 23:57:12.810000 audit: BPF prog-id=8 op=LOAD Jan 27 23:57:12.812302 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 23:57:12.844352 systemd-udevd[688]: Using default interface naming scheme 'v257'. Jan 27 23:57:12.852353 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 23:57:12.853000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:12.855212 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 27 23:57:12.878271 dracut-pre-trigger[759]: rd.md=0: removing MD RAID activation Jan 27 23:57:12.883862 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 27 23:57:12.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.885000 audit: BPF prog-id=9 op=LOAD Jan 27 23:57:12.887106 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 27 23:57:12.908071 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 27 23:57:12.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.911380 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 27 23:57:12.926148 systemd-networkd[807]: lo: Link UP Jan 27 23:57:12.926159 systemd-networkd[807]: lo: Gained carrier Jan 27 23:57:12.926635 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 27 23:57:12.927000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:12.927876 systemd[1]: Reached target network.target - Network. Jan 27 23:57:12.995643 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 23:57:12.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:13.002586 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 27 23:57:13.075567 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 27 23:57:13.085096 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 27 23:57:13.105127 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 27 23:57:13.125318 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 27 23:57:13.125374 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 27 23:57:13.127764 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 27 23:57:13.128930 systemd-networkd[807]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 23:57:13.128939 systemd-networkd[807]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 27 23:57:13.129532 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jan 27 23:57:13.132378 systemd-networkd[807]: eth0: Link UP Jan 27 23:57:13.132545 systemd-networkd[807]: eth0: Gained carrier Jan 27 23:57:13.132558 systemd-networkd[807]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 23:57:13.136923 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 27 23:57:13.140042 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 27 23:57:13.141000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:13.140162 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 23:57:13.141852 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 23:57:13.156554 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 23:57:13.164339 disk-uuid[871]: Primary Header is updated. Jan 27 23:57:13.164339 disk-uuid[871]: Secondary Entries is updated. Jan 27 23:57:13.164339 disk-uuid[871]: Secondary Header is updated. Jan 27 23:57:13.177619 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 23:57:13.184856 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 27 23:57:13.185058 kernel: usbcore: registered new interface driver usbhid Jan 27 23:57:13.185070 kernel: usbhid: USB HID core driver Jan 27 23:57:13.182000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:13.190793 systemd-networkd[807]: eth0: DHCPv4 address 10.0.6.5/25, gateway 10.0.6.1 acquired from 10.0.6.1 Jan 27 23:57:13.254372 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 27 23:57:13.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:13.255966 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 27 23:57:13.257391 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 23:57:13.259364 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 27 23:57:13.262220 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 27 23:57:13.295105 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 27 23:57:13.295000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:14.208582 disk-uuid[873]: Warning: The kernel is still using the old partition table. Jan 27 23:57:14.208582 disk-uuid[873]: The new table will be used at the next reboot or after you Jan 27 23:57:14.208582 disk-uuid[873]: run partprobe(8) or kpartx(8) Jan 27 23:57:14.208582 disk-uuid[873]: The operation has completed successfully. Jan 27 23:57:14.215081 systemd[1]: disk-uuid.service: Deactivated successfully. 
Jan 27 23:57:14.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:14.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:14.215193 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 27 23:57:14.217314 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 27 23:57:14.256746 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (902) Jan 27 23:57:14.259604 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 27 23:57:14.259622 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 23:57:14.264009 kernel: BTRFS info (device vda6): turning on async discard Jan 27 23:57:14.264033 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 23:57:14.269776 kernel: BTRFS info (device vda6): last unmount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 27 23:57:14.270314 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 27 23:57:14.271000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:14.272603 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 27 23:57:14.428751 ignition[921]: Ignition 2.24.0 Jan 27 23:57:14.428768 ignition[921]: Stage: fetch-offline Jan 27 23:57:14.428805 ignition[921]: no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:14.430544 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 27 23:57:14.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:14.428816 ignition[921]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:14.428973 ignition[921]: parsed url from cmdline: "" Jan 27 23:57:14.433284 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 27 23:57:14.428976 ignition[921]: no config URL provided Jan 27 23:57:14.428981 ignition[921]: reading system config file "/usr/lib/ignition/user.ign" Jan 27 23:57:14.428988 ignition[921]: no config at "/usr/lib/ignition/user.ign" Jan 27 23:57:14.428992 ignition[921]: failed to fetch config: resource requires networking Jan 27 23:57:14.429138 ignition[921]: Ignition finished successfully Jan 27 23:57:14.467055 ignition[930]: Ignition 2.24.0 Jan 27 23:57:14.467075 ignition[930]: Stage: fetch Jan 27 23:57:14.467220 ignition[930]: no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:14.467229 ignition[930]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:14.467306 ignition[930]: parsed url from cmdline: "" Jan 27 23:57:14.467309 ignition[930]: no config URL provided Jan 27 23:57:14.467313 ignition[930]: reading system config file "/usr/lib/ignition/user.ign" Jan 27 23:57:14.467319 ignition[930]: no config at "/usr/lib/ignition/user.ign" Jan 27 23:57:14.467648 ignition[930]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 27 23:57:14.467664 ignition[930]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 23:57:14.467986 ignition[930]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 27 23:57:15.127967 systemd-networkd[807]: eth0: Gained IPv6LL Jan 27 23:57:15.469209 ignition[930]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 23:57:15.469235 ignition[930]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 23:57:16.469706 ignition[930]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 23:57:16.469818 ignition[930]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 23:57:16.933327 ignition[930]: GET result: OK Jan 27 23:57:16.933605 ignition[930]: parsing config with SHA512: 0fcb92c9718c74b677da2b2dbb3b93581481180bd3f35c514d2c9d5cdb6f2736346dcae1aabec0456fc1ae6b655b21398bf6bbe393ca1399b1822327a9b1c1c4 Jan 27 23:57:16.938693 unknown[930]: fetched base config from "system" Jan 27 23:57:16.938708 unknown[930]: fetched base config from "system" Jan 27 23:57:16.939076 ignition[930]: fetch: fetch complete Jan 27 23:57:16.938713 unknown[930]: fetched user config from "openstack" Jan 27 23:57:16.945907 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 27 23:57:16.945934 kernel: audit: type=1130 audit(1769558236.941:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:16.941000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:16.939081 ignition[930]: fetch: fetch passed Jan 27 23:57:16.940612 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 27 23:57:16.939123 ignition[930]: Ignition finished successfully Jan 27 23:57:16.943166 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 27 23:57:16.968416 ignition[938]: Ignition 2.24.0 Jan 27 23:57:16.968437 ignition[938]: Stage: kargs Jan 27 23:57:16.968586 ignition[938]: no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:16.968595 ignition[938]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:16.971232 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 27 23:57:16.971000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:16.969346 ignition[938]: kargs: kargs passed Jan 27 23:57:16.977009 kernel: audit: type=1130 audit(1769558236.971:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:16.973495 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Jan 27 23:57:16.969392 ignition[938]: Ignition finished successfully Jan 27 23:57:16.997565 ignition[945]: Ignition 2.24.0 Jan 27 23:57:16.997587 ignition[945]: Stage: disks Jan 27 23:57:16.997777 ignition[945]: no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:16.997788 ignition[945]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:16.998520 ignition[945]: disks: disks passed Jan 27 23:57:17.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.000758 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 27 23:57:17.007542 kernel: audit: type=1130 audit(1769558237.002:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:16.998565 ignition[945]: Ignition finished successfully Jan 27 23:57:17.003007 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 27 23:57:17.006826 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 27 23:57:17.008595 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 23:57:17.010014 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 23:57:17.011628 systemd[1]: Reached target basic.target - Basic System. Jan 27 23:57:17.014367 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 27 23:57:17.065384 systemd-fsck[954]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 27 23:57:17.069208 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 27 23:57:17.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.071518 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 27 23:57:17.075264 kernel: audit: type=1130 audit(1769558237.069:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.178755 kernel: EXT4-fs (vda9): mounted filesystem e122e254-04a8-47c4-9c16-e71d001bbc70 r/w with ordered data mode. Quota mode: none. Jan 27 23:57:17.179068 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 27 23:57:17.180309 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 27 23:57:17.187383 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 23:57:17.189483 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 27 23:57:17.190570 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 27 23:57:17.191351 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 27 23:57:17.195760 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 27 23:57:17.195796 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 27 23:57:17.208744 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Jan 27 23:57:17.210897 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 27 23:57:17.223762 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (962) Jan 27 23:57:17.227474 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 27 23:57:17.227493 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 23:57:17.233755 kernel: BTRFS info (device vda6): turning on async discard Jan 27 23:57:17.233807 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 23:57:17.235598 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 27 23:57:17.277790 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:17.385845 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 27 23:57:17.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.388118 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 27 23:57:17.391845 kernel: audit: type=1130 audit(1769558237.386:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.391794 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 27 23:57:17.404297 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 27 23:57:17.406311 kernel: BTRFS info (device vda6): last unmount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 27 23:57:17.424851 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 27 23:57:17.425000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.429752 kernel: audit: type=1130 audit(1769558237.425:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.436740 ignition[1063]: INFO : Ignition 2.24.0 Jan 27 23:57:17.437631 ignition[1063]: INFO : Stage: mount Jan 27 23:57:17.437631 ignition[1063]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:17.437631 ignition[1063]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:17.441268 ignition[1063]: INFO : mount: mount passed Jan 27 23:57:17.441268 ignition[1063]: INFO : Ignition finished successfully Jan 27 23:57:17.442000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:17.440236 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 27 23:57:17.447189 kernel: audit: type=1130 audit(1769558237.442:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:18.313782 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:20.319868 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:24.325771 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:24.334510 coreos-metadata[964]: Jan 27 23:57:24.334 WARN failed to locate config-drive, using the metadata service API instead Jan 27 23:57:24.353291 coreos-metadata[964]: Jan 27 23:57:24.353 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 27 23:57:24.996759 coreos-metadata[964]: Jan 27 23:57:24.996 INFO Fetch successful Jan 27 23:57:24.997933 coreos-metadata[964]: Jan 27 23:57:24.997 INFO wrote hostname ci-4593-0-0-n-485d202ac1 to /sysroot/etc/hostname Jan 27 23:57:24.999566 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 27 23:57:25.007257 kernel: audit: type=1130 audit(1769558245.000:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:25.007320 kernel: audit: type=1131 audit(1769558245.000:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:25.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:25.000000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:24.999683 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 27 23:57:25.002165 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 27 23:57:25.026193 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 23:57:25.050745 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1080) Jan 27 23:57:25.053745 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 27 23:57:25.053789 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 23:57:25.058094 kernel: BTRFS info (device vda6): turning on async discard Jan 27 23:57:25.058119 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 23:57:25.059634 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 27 23:57:25.086156 ignition[1098]: INFO : Ignition 2.24.0 Jan 27 23:57:25.086156 ignition[1098]: INFO : Stage: files Jan 27 23:57:25.087869 ignition[1098]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:25.087869 ignition[1098]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:25.087869 ignition[1098]: DEBUG : files: compiled without relabeling support, skipping Jan 27 23:57:25.091130 ignition[1098]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 27 23:57:25.091130 ignition[1098]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 27 23:57:25.093922 ignition[1098]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 27 23:57:25.093922 ignition[1098]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 27 23:57:25.093922 ignition[1098]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 27 23:57:25.093789 unknown[1098]: wrote ssh authorized keys file for user: core Jan 27 23:57:25.099311 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 27 23:57:25.099311 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 27 23:57:25.162358 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 27 23:57:25.272287 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 27 23:57:25.272287 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 27 23:57:25.276173 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Jan 27 23:57:25.670328 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 27 23:57:26.751311 ignition[1098]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 27 23:57:26.751311 ignition[1098]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 27 23:57:26.755166 ignition[1098]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 27 23:57:26.758558 ignition[1098]: INFO : files: files passed Jan 27 23:57:26.758558 ignition[1098]: INFO : Ignition finished successfully Jan 27 23:57:26.774205 kernel: audit: type=1130 audit(1769558246.761:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.761000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.760521 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 27 23:57:26.763346 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 27 23:57:26.765385 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 27 23:57:26.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.775980 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 27 23:57:26.784348 kernel: audit: type=1130 audit(1769558246.776:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:26.784375 kernel: audit: type=1131 audit(1769558246.776:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.776000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.776314 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 27 23:57:26.789258 initrd-setup-root-after-ignition[1131]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 23:57:26.789258 initrd-setup-root-after-ignition[1131]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 27 23:57:26.792873 initrd-setup-root-after-ignition[1135]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 23:57:26.793658 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 23:57:26.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.798757 kernel: audit: type=1130 audit(1769558246.794:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.795532 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 27 23:57:26.800964 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 27 23:57:26.847522 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 27 23:57:26.848805 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 27 23:57:26.855332 kernel: audit: type=1130 audit(1769558246.849:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.855361 kernel: audit: type=1131 audit(1769558246.849:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.849000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.850180 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 27 23:57:26.856249 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 27 23:57:26.858001 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 27 23:57:26.859082 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 27 23:57:26.902000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:26.901787 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 27 23:57:26.906357 kernel: audit: type=1130 audit(1769558246.902:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.904358 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 27 23:57:26.926957 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 27 23:57:26.927218 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 27 23:57:26.929123 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 23:57:26.930911 systemd[1]: Stopped target timers.target - Timer Units. Jan 27 23:57:26.932612 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 27 23:57:26.934000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.932776 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 27 23:57:26.938198 kernel: audit: type=1131 audit(1769558246.934:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.937305 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 27 23:57:26.939151 systemd[1]: Stopped target basic.target - Basic System. Jan 27 23:57:26.940544 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 27 23:57:26.942104 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 27 23:57:26.943734 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 27 23:57:26.945600 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 27 23:57:26.947396 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 27 23:57:26.949029 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 27 23:57:26.950842 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 27 23:57:26.952804 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 27 23:57:26.954480 systemd[1]: Stopped target swap.target - Swaps. Jan 27 23:57:26.955868 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 27 23:57:26.957000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.956027 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 27 23:57:26.958062 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 27 23:57:26.959806 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 23:57:26.961620 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 27 23:57:26.961749 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 23:57:26.965000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:26.963627 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 27 23:57:26.963792 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 27 23:57:26.968000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.966403 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 27 23:57:26.969000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.966550 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 23:57:26.968348 systemd[1]: ignition-files.service: Deactivated successfully. Jan 27 23:57:26.968467 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 27 23:57:26.970788 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 27 23:57:26.973438 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 27 23:57:26.976000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.975019 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 27 23:57:26.978000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.975169 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 23:57:26.980000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.976983 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 27 23:57:26.977102 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 23:57:26.978665 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 27 23:57:26.978805 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 27 23:57:26.984778 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 27 23:57:26.987000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.987000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.984869 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 27 23:57:26.992273 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Jan 27 23:57:26.993206 ignition[1155]: INFO : Ignition 2.24.0 Jan 27 23:57:26.993206 ignition[1155]: INFO : Stage: umount Jan 27 23:57:26.993206 ignition[1155]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 23:57:26.993206 ignition[1155]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 23:57:26.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.995036 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 27 23:57:26.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.999155 ignition[1155]: INFO : umount: umount passed Jan 27 23:57:26.999155 ignition[1155]: INFO : Ignition finished successfully Jan 27 23:57:26.995153 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 27 23:57:26.996761 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 27 23:57:27.001000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:26.996842 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 27 23:57:27.003000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.001335 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 27 23:57:27.004000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.001390 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 27 23:57:27.002405 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 27 23:57:27.007000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.002453 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 27 23:57:27.003960 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 27 23:57:27.004013 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 27 23:57:27.005454 systemd[1]: Stopped target network.target - Network. Jan 27 23:57:27.006800 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 27 23:57:27.006872 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 27 23:57:27.008526 systemd[1]: Stopped target paths.target - Path Units. Jan 27 23:57:27.009950 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 27 23:57:27.014778 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 23:57:27.016068 systemd[1]: Stopped target slices.target - Slice Units. Jan 27 23:57:27.017461 systemd[1]: Stopped target sockets.target - Socket Units. Jan 27 23:57:27.019124 systemd[1]: iscsid.socket: Deactivated successfully. Jan 27 23:57:27.019170 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. 
Jan 27 23:57:27.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.021052 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 27 23:57:27.026000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.021087 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 27 23:57:27.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.022552 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 27 23:57:27.022578 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 27 23:57:27.024186 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 27 23:57:27.024248 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 27 23:57:27.025684 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 27 23:57:27.025745 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 27 23:57:27.027325 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 27 23:57:27.027377 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 27 23:57:27.028902 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 27 23:57:27.030535 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 27 23:57:27.039000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.039175 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 27 23:57:27.039277 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 27 23:57:27.042300 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 27 23:57:27.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.042421 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 27 23:57:27.046000 audit: BPF prog-id=6 op=UNLOAD Jan 27 23:57:27.047462 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 27 23:57:27.048000 audit: BPF prog-id=9 op=UNLOAD Jan 27 23:57:27.048745 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 27 23:57:27.048809 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 27 23:57:27.051616 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 27 23:57:27.053366 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 27 23:57:27.054000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.053433 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 27 23:57:27.056000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.055262 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 27 23:57:27.057000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.055312 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 27 23:57:27.056867 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 27 23:57:27.056913 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 27 23:57:27.058758 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 23:57:27.080289 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 27 23:57:27.080438 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 23:57:27.081000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.082506 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 27 23:57:27.082547 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 27 23:57:27.084835 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 27 23:57:27.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.084868 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 23:57:27.090000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.086606 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 27 23:57:27.086661 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 27 23:57:27.092000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.089211 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 27 23:57:27.089267 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 27 23:57:27.091178 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 27 23:57:27.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.091234 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 27 23:57:27.099000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.094596 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Jan 27 23:57:27.101000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.095662 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 27 23:57:27.102000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.095760 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 23:57:27.105000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.097677 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 27 23:57:27.097751 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 23:57:27.099615 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 27 23:57:27.099661 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 27 23:57:27.101803 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 27 23:57:27.101849 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 23:57:27.103782 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 27 23:57:27.103837 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 23:57:27.123187 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 27 23:57:27.123330 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 27 23:57:27.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.125416 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 27 23:57:27.128000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:27.125532 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 27 23:57:27.129339 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 27 23:57:27.131417 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 27 23:57:27.148558 systemd[1]: Switching root. Jan 27 23:57:27.191431 systemd-journald[419]: Journal stopped Jan 27 23:57:28.113584 systemd-journald[419]: Received SIGTERM from PID 1 (systemd). 
Jan 27 23:57:28.113663 kernel: SELinux: policy capability network_peer_controls=1 Jan 27 23:57:28.113688 kernel: SELinux: policy capability open_perms=1 Jan 27 23:57:28.113704 kernel: SELinux: policy capability extended_socket_class=1 Jan 27 23:57:28.113715 kernel: SELinux: policy capability always_check_network=0 Jan 27 23:57:28.114800 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 27 23:57:28.114824 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 27 23:57:28.114849 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 27 23:57:28.114864 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 27 23:57:28.114881 kernel: SELinux: policy capability userspace_initial_context=0 Jan 27 23:57:28.114897 systemd[1]: Successfully loaded SELinux policy in 63.604ms. Jan 27 23:57:28.114920 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.100ms. Jan 27 23:57:28.114933 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 27 23:57:28.114945 systemd[1]: Detected virtualization kvm. Jan 27 23:57:28.114963 systemd[1]: Detected architecture arm64. Jan 27 23:57:28.114974 systemd[1]: Detected first boot. Jan 27 23:57:28.114985 systemd[1]: Hostname set to . Jan 27 23:57:28.114998 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 27 23:57:28.115009 zram_generator::config[1201]: No configuration found. Jan 27 23:57:28.115027 kernel: NET: Registered PF_VSOCK protocol family Jan 27 23:57:28.115037 systemd[1]: Populated /etc with preset unit settings. Jan 27 23:57:28.115050 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 27 23:57:28.115061 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 27 23:57:28.115073 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 27 23:57:28.115087 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 27 23:57:28.115098 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 27 23:57:28.115110 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 27 23:57:28.115121 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 27 23:57:28.115132 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 27 23:57:28.115143 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 27 23:57:28.115157 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 27 23:57:28.115168 systemd[1]: Created slice user.slice - User and Session Slice. Jan 27 23:57:28.115179 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 23:57:28.115190 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 23:57:28.115202 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 27 23:57:28.115213 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 27 23:57:28.115224 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 27 23:57:28.115237 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 27 23:57:28.115252 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 27 23:57:28.115263 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 23:57:28.115274 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 27 23:57:28.115286 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 27 23:57:28.115299 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 27 23:57:28.115310 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 27 23:57:28.115325 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 27 23:57:28.115337 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 23:57:28.115348 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 27 23:57:28.115359 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 27 23:57:28.115370 systemd[1]: Reached target slices.target - Slice Units. Jan 27 23:57:28.115386 systemd[1]: Reached target swap.target - Swaps. Jan 27 23:57:28.115397 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 27 23:57:28.115408 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 27 23:57:28.115423 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 27 23:57:28.115434 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 27 23:57:28.115445 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 27 23:57:28.115456 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 27 23:57:28.115467 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 27 23:57:28.115480 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 27 23:57:28.115491 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 27 23:57:28.115502 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 23:57:28.115514 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 27 23:57:28.115524 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 27 23:57:28.115536 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 27 23:57:28.115547 systemd[1]: Mounting media.mount - External Media Directory... Jan 27 23:57:28.115559 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 27 23:57:28.115571 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 27 23:57:28.115582 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 27 23:57:28.115593 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 27 23:57:28.115605 systemd[1]: Reached target machines.target - Containers. Jan 27 23:57:28.115616 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Jan 27 23:57:28.115628 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 23:57:28.115639 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 27 23:57:28.115651 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 27 23:57:28.115662 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 23:57:28.115674 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 23:57:28.115685 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 23:57:28.115696 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 27 23:57:28.115708 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 23:57:28.115719 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 27 23:57:28.115833 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 27 23:57:28.115848 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 27 23:57:28.115866 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 27 23:57:28.115878 systemd[1]: Stopped systemd-fsck-usr.service. Jan 27 23:57:28.115890 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 23:57:28.115901 kernel: fuse: init (API version 7.41) Jan 27 23:57:28.115912 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 27 23:57:28.115923 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 27 23:57:28.115934 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 27 23:57:28.115947 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 27 23:57:28.115959 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 27 23:57:28.115970 kernel: ACPI: bus type drm_connector registered Jan 27 23:57:28.115980 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 27 23:57:28.115991 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 27 23:57:28.116002 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 27 23:57:28.116015 systemd[1]: Mounted media.mount - External Media Directory. Jan 27 23:57:28.116062 systemd-journald[1264]: Collecting audit messages is enabled. Jan 27 23:57:28.116091 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 27 23:57:28.116103 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 27 23:57:28.116117 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 27 23:57:28.116129 systemd-journald[1264]: Journal started Jan 27 23:57:28.116152 systemd-journald[1264]: Runtime Journal (/run/log/journal/90db4509c1ad4277a85eeedb6c529ce8) is 8M, max 319.5M, 311.5M free. 
Jan 27 23:57:27.969000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 27 23:57:28.061000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.063000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.066000 audit: BPF prog-id=14 op=UNLOAD Jan 27 23:57:28.066000 audit: BPF prog-id=13 op=UNLOAD Jan 27 23:57:28.067000 audit: BPF prog-id=15 op=LOAD Jan 27 23:57:28.067000 audit: BPF prog-id=16 op=LOAD Jan 27 23:57:28.067000 audit: BPF prog-id=17 op=LOAD Jan 27 23:57:28.107000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 27 23:57:28.107000 audit[1264]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=fffffb4be8b0 a2=4000 a3=0 items=0 ppid=1 pid=1264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:57:28.107000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 27 23:57:27.874182 systemd[1]: Queued start job for default target multi-user.target. Jan 27 23:57:27.898799 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 27 23:57:27.899244 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 27 23:57:28.119929 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 23:57:28.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.122781 systemd[1]: Started systemd-journald.service - Journal Service. Jan 27 23:57:28.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.123261 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 27 23:57:28.124437 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 27 23:57:28.124000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.124000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.125864 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 23:57:28.126026 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 23:57:28.126000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 27 23:57:28.126000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.127373 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 23:57:28.127541 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 23:57:28.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.131130 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 23:57:28.131308 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 23:57:28.131000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.132819 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 27 23:57:28.132982 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 27 23:57:28.133000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.133000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.134319 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 23:57:28.134471 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 23:57:28.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.134000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.135947 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 27 23:57:28.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.138944 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 27 23:57:28.139000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.141268 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 27 23:57:28.141000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.142861 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 27 23:57:28.143000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.144500 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 27 23:57:28.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.157023 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 27 23:57:28.158982 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 27 23:57:28.160218 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 27 23:57:28.160253 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 23:57:28.162167 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 27 23:57:28.163979 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 23:57:28.164105 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 23:57:28.165909 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 27 23:57:28.168001 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 27 23:57:28.169091 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 27 23:57:28.172899 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 27 23:57:28.173933 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 23:57:28.175682 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 27 23:57:28.178957 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 27 23:57:28.184062 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 27 23:57:28.186868 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 23:57:28.189000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:28.193706 systemd-journald[1264]: Time spent on flushing to /var/log/journal/90db4509c1ad4277a85eeedb6c529ce8 is 29.965ms for 1819 entries. Jan 27 23:57:28.193706 systemd-journald[1264]: System Journal (/var/log/journal/90db4509c1ad4277a85eeedb6c529ce8) is 8M, max 588.1M, 580.1M free. Jan 27 23:57:28.246509 systemd-journald[1264]: Received client request to flush runtime journal. Jan 27 23:57:28.246571 kernel: loop1: detected capacity change from 0 to 1648 Jan 27 23:57:28.246602 kernel: loop2: detected capacity change from 0 to 200800 Jan 27 23:57:28.201000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.214000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.226000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.200643 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 27 23:57:28.201972 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 27 23:57:28.207959 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 27 23:57:28.213843 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 27 23:57:28.221996 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Jan 27 23:57:28.222007 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Jan 27 23:57:28.226049 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 27 23:57:28.230907 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 27 23:57:28.249780 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 27 23:57:28.251000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.264894 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 27 23:57:28.265000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.280115 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 27 23:57:28.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.281000 audit: BPF prog-id=18 op=LOAD Jan 27 23:57:28.281000 audit: BPF prog-id=19 op=LOAD Jan 27 23:57:28.281000 audit: BPF prog-id=20 op=LOAD Jan 27 23:57:28.284477 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... 
Jan 27 23:57:28.286768 kernel: loop3: detected capacity change from 0 to 100192 Jan 27 23:57:28.286000 audit: BPF prog-id=21 op=LOAD Jan 27 23:57:28.288043 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 27 23:57:28.291009 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 27 23:57:28.294486 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 27 23:57:28.293000 audit: BPF prog-id=22 op=LOAD Jan 27 23:57:28.293000 audit: BPF prog-id=23 op=LOAD Jan 27 23:57:28.293000 audit: BPF prog-id=24 op=LOAD Jan 27 23:57:28.310000 audit: BPF prog-id=25 op=LOAD Jan 27 23:57:28.310000 audit: BPF prog-id=26 op=LOAD Jan 27 23:57:28.310000 audit: BPF prog-id=27 op=LOAD Jan 27 23:57:28.312174 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 27 23:57:28.324486 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Jan 27 23:57:28.324503 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Jan 27 23:57:28.327760 kernel: loop4: detected capacity change from 0 to 45344 Jan 27 23:57:28.332874 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 23:57:28.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.340902 systemd-nsresourced[1343]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 27 23:57:28.342037 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 27 23:57:28.343000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.356850 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 27 23:57:28.357000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.373850 kernel: loop5: detected capacity change from 0 to 1648 Jan 27 23:57:28.379750 kernel: loop6: detected capacity change from 0 to 200800 Jan 27 23:57:28.398755 kernel: loop7: detected capacity change from 0 to 100192 Jan 27 23:57:28.402841 systemd-oomd[1339]: No swap; memory pressure usage will be degraded Jan 27 23:57:28.403872 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 27 23:57:28.405000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.410745 kernel: loop1: detected capacity change from 0 to 45344 Jan 27 23:57:28.413529 systemd-resolved[1341]: Positive Trust Anchors: Jan 27 23:57:28.413550 systemd-resolved[1341]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 27 23:57:28.413553 systemd-resolved[1341]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 27 23:57:28.413583 systemd-resolved[1341]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 27 23:57:28.425427 (sd-merge)[1359]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 27 23:57:28.428555 (sd-merge)[1359]: Merged extensions into '/usr'. Jan 27 23:57:28.433650 systemd-resolved[1341]: Using system hostname 'ci-4593-0-0-n-485d202ac1'. Jan 27 23:57:28.435471 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 27 23:57:28.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.436768 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 27 23:57:28.438007 systemd[1]: Reload requested from client PID 1319 ('systemd-sysext') (unit systemd-sysext.service)... Jan 27 23:57:28.438023 systemd[1]: Reloading... Jan 27 23:57:28.493791 zram_generator::config[1391]: No configuration found. Jan 27 23:57:28.648220 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 27 23:57:28.648463 systemd[1]: Reloading finished in 210 ms. Jan 27 23:57:28.675585 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 27 23:57:28.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.678140 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 27 23:57:28.678000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:28.694001 systemd[1]: Starting ensure-sysext.service... Jan 27 23:57:28.696078 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 27 23:57:28.696000 audit: BPF prog-id=8 op=UNLOAD Jan 27 23:57:28.696000 audit: BPF prog-id=7 op=UNLOAD Jan 27 23:57:28.697000 audit: BPF prog-id=28 op=LOAD Jan 27 23:57:28.697000 audit: BPF prog-id=29 op=LOAD Jan 27 23:57:28.698755 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 27 23:57:28.700000 audit: BPF prog-id=30 op=LOAD Jan 27 23:57:28.700000 audit: BPF prog-id=25 op=UNLOAD Jan 27 23:57:28.700000 audit: BPF prog-id=31 op=LOAD Jan 27 23:57:28.700000 audit: BPF prog-id=32 op=LOAD Jan 27 23:57:28.700000 audit: BPF prog-id=26 op=UNLOAD Jan 27 23:57:28.700000 audit: BPF prog-id=27 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=33 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=18 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=34 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=35 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=19 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=20 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=36 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=22 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=37 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=38 op=LOAD Jan 27 23:57:28.701000 audit: BPF prog-id=23 op=UNLOAD Jan 27 23:57:28.701000 audit: BPF prog-id=24 op=UNLOAD Jan 27 23:57:28.702000 audit: BPF prog-id=39 op=LOAD Jan 27 23:57:28.703000 audit: BPF prog-id=21 op=UNLOAD Jan 27 23:57:28.704000 audit: BPF prog-id=40 op=LOAD Jan 27 23:57:28.704000 audit: BPF prog-id=15 op=UNLOAD Jan 27 23:57:28.704000 audit: BPF prog-id=41 op=LOAD Jan 27 23:57:28.704000 audit: BPF prog-id=42 op=LOAD Jan 27 23:57:28.704000 audit: BPF prog-id=16 op=UNLOAD Jan 27 23:57:28.704000 audit: BPF prog-id=17 op=UNLOAD Jan 27 23:57:28.713547 systemd[1]: Reload requested from client PID 1428 ('systemctl') (unit ensure-sysext.service)... Jan 27 23:57:28.713561 systemd[1]: Reloading... Jan 27 23:57:28.716213 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 27 23:57:28.716254 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 27 23:57:28.716492 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 27 23:57:28.717418 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Jan 27 23:57:28.717473 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Jan 27 23:57:28.721946 systemd-udevd[1430]: Using default interface naming scheme 'v257'. Jan 27 23:57:28.724317 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Jan 27 23:57:28.724337 systemd-tmpfiles[1429]: Skipping /boot Jan 27 23:57:28.730786 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Jan 27 23:57:28.730803 systemd-tmpfiles[1429]: Skipping /boot Jan 27 23:57:28.774855 zram_generator::config[1476]: No configuration found. Jan 27 23:57:28.859757 kernel: mousedev: PS/2 mouse device common for all mice Jan 27 23:57:28.976610 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 27 23:57:28.979633 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 27 23:57:28.982076 systemd[1]: Reloading finished in 268 ms. Jan 27 23:57:28.995308 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 23:57:28.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:29.000870 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 27 23:57:29.000936 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 27 23:57:29.000957 kernel: [drm] features: -context_init Jan 27 23:57:29.000000 audit: BPF prog-id=43 op=LOAD Jan 27 23:57:29.000000 audit: BPF prog-id=30 op=UNLOAD Jan 27 23:57:29.000000 audit: BPF prog-id=44 op=LOAD Jan 27 23:57:29.000000 audit: BPF prog-id=45 op=LOAD Jan 27 23:57:29.000000 audit: BPF prog-id=31 op=UNLOAD Jan 27 23:57:29.000000 audit: BPF prog-id=32 op=UNLOAD Jan 27 23:57:29.000000 audit: BPF prog-id=46 op=LOAD Jan 27 23:57:29.000000 audit: BPF prog-id=47 op=LOAD Jan 27 23:57:29.001000 audit: BPF prog-id=28 op=UNLOAD Jan 27 23:57:29.001000 audit: BPF prog-id=29 op=UNLOAD Jan 27 23:57:29.001000 audit: BPF prog-id=48 op=LOAD Jan 27 23:57:29.001000 audit: BPF prog-id=39 op=UNLOAD Jan 27 23:57:29.003755 kernel: [drm] number of scanouts: 1 Jan 27 23:57:29.003833 kernel: [drm] number of cap sets: 0 Jan 27 23:57:29.003000 audit: BPF prog-id=49 op=LOAD Jan 27 23:57:29.003000 audit: BPF prog-id=40 op=UNLOAD Jan 27 23:57:29.003000 audit: BPF prog-id=50 op=LOAD Jan 27 23:57:29.003000 audit: BPF prog-id=51 op=LOAD Jan 27 23:57:29.003000 audit: BPF prog-id=41 op=UNLOAD Jan 27 23:57:29.003000 audit: BPF prog-id=42 op=UNLOAD Jan 27 23:57:29.005000 audit: BPF prog-id=52 op=LOAD Jan 27 23:57:29.005000 audit: BPF prog-id=33 op=UNLOAD Jan 27 23:57:29.005000 audit: BPF prog-id=53 op=LOAD Jan 27 23:57:29.005000 audit: BPF prog-id=54 op=LOAD Jan 27 23:57:29.006771 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 27 23:57:29.005000 audit: BPF prog-id=34 op=UNLOAD Jan 27 23:57:29.005000 audit: BPF prog-id=35 op=UNLOAD Jan 27 23:57:29.006000 audit: BPF prog-id=55 op=LOAD Jan 27 23:57:29.006000 audit: BPF prog-id=36 op=UNLOAD Jan 27 23:57:29.006000 audit: BPF prog-id=56 op=LOAD Jan 27 23:57:29.006000 audit: BPF prog-id=57 op=LOAD Jan 27 23:57:29.006000 audit: BPF prog-id=37 op=UNLOAD Jan 27 23:57:29.006000 audit: BPF prog-id=38 op=UNLOAD Jan 27 23:57:29.010777 kernel: Console: switching to colour frame buffer device 160x50 Jan 27 23:57:29.010889 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 23:57:29.019767 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 27 23:57:29.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.054667 systemd[1]: Finished ensure-sysext.service. Jan 27 23:57:29.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.062325 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 23:57:29.064815 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 27 23:57:29.065964 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 23:57:29.085490 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 27 23:57:29.087910 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Jan 27 23:57:29.091904 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 23:57:29.094924 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 23:57:29.097240 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 27 23:57:29.099185 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 23:57:29.101427 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 27 23:57:29.102842 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 23:57:29.102963 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 23:57:29.105962 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 27 23:57:29.108973 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 27 23:57:29.110226 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 23:57:29.111782 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 27 23:57:29.112000 audit: BPF prog-id=58 op=LOAD Jan 27 23:57:29.114338 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 27 23:57:29.116222 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 27 23:57:29.116604 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 27 23:57:29.116814 systemd[1]: Reached target time-set.target - System Time Set. Jan 27 23:57:29.118974 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 27 23:57:29.121455 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 23:57:29.124006 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 27 23:57:29.126841 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 27 23:57:29.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.127000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.128795 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 23:57:29.128985 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 23:57:29.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:29.132000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.132000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.130549 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 23:57:29.130768 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 23:57:29.132638 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 23:57:29.132847 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 23:57:29.133868 kernel: PTP clock support registered Jan 27 23:57:29.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.136000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.137000 audit[1577]: SYSTEM_BOOT pid=1577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.138781 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 27 23:57:29.139030 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 27 23:57:29.140000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.140000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.141171 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 23:57:29.141365 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 23:57:29.142000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.143511 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 27 23:57:29.145368 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 27 23:57:29.146000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:57:29.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.147807 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 27 23:57:29.148000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.159282 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 27 23:57:29.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.167700 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 27 23:57:29.170541 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 27 23:57:29.172202 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 27 23:57:29.172295 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 23:57:29.173510 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 27 23:57:29.174000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:57:29.176000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 23:57:29.176000 audit[1602]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff0dbd240 a2=420 a3=0 items=0 ppid=1551 pid=1602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:57:29.176000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 23:57:29.178543 augenrules[1602]: No rules Jan 27 23:57:29.180787 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 23:57:29.181074 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 23:57:29.183814 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 27 23:57:29.187003 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 27 23:57:29.215108 systemd-networkd[1575]: lo: Link UP Jan 27 23:57:29.215408 systemd-networkd[1575]: lo: Gained carrier Jan 27 23:57:29.216567 systemd-networkd[1575]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 23:57:29.216576 systemd-networkd[1575]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 27 23:57:29.216690 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 27 23:57:29.218174 systemd-networkd[1575]: eth0: Link UP Jan 27 23:57:29.218306 systemd-networkd[1575]: eth0: Gained carrier Jan 27 23:57:29.218320 systemd-networkd[1575]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 23:57:29.219005 systemd[1]: Reached target network.target - Network. Jan 27 23:57:29.221964 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 27 23:57:29.224706 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 27 23:57:29.250488 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 23:57:29.254140 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 27 23:57:29.255897 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 27 23:57:29.257933 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 27 23:57:29.271941 systemd-networkd[1575]: eth0: DHCPv4 address 10.0.6.5/25, gateway 10.0.6.1 acquired from 10.0.6.1 Jan 27 23:57:29.838864 ldconfig[1566]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 27 23:57:29.845839 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 27 23:57:29.848470 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 27 23:57:29.871494 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 27 23:57:29.872872 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 23:57:29.873892 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 27 23:57:29.875009 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 27 23:57:29.876273 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 27 23:57:29.877355 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 27 23:57:29.878708 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 27 23:57:29.879904 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 27 23:57:29.880898 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 27 23:57:29.882004 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 27 23:57:29.882037 systemd[1]: Reached target paths.target - Path Units. Jan 27 23:57:29.882851 systemd[1]: Reached target timers.target - Timer Units. Jan 27 23:57:29.885154 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 27 23:57:29.887495 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 27 23:57:29.890149 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 27 23:57:29.891437 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 27 23:57:29.892599 systemd[1]: Reached target ssh-access.target - SSH Access Available. 
Jan 27 23:57:29.901132 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 27 23:57:29.902364 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 27 23:57:29.904112 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 27 23:57:29.905233 systemd[1]: Reached target sockets.target - Socket Units. Jan 27 23:57:29.906131 systemd[1]: Reached target basic.target - Basic System. Jan 27 23:57:29.907023 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 27 23:57:29.907062 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 27 23:57:29.910041 systemd[1]: Starting chronyd.service - NTP client/server... Jan 27 23:57:29.911763 systemd[1]: Starting containerd.service - containerd container runtime... Jan 27 23:57:29.913801 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 27 23:57:29.916909 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 27 23:57:29.918927 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 27 23:57:29.921746 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:29.922655 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 27 23:57:29.924629 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 27 23:57:29.926802 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 27 23:57:29.938903 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 27 23:57:29.940256 jq[1631]: false Jan 27 23:57:29.941194 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 27 23:57:29.943350 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 27 23:57:29.945663 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 27 23:57:29.951958 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 27 23:57:29.952121 chronyd[1624]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 27 23:57:29.952874 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 27 23:57:29.953045 extend-filesystems[1632]: Found /dev/vda6 Jan 27 23:57:29.953168 chronyd[1624]: Loaded seccomp filter (level 2) Jan 27 23:57:29.953305 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 27 23:57:29.955062 systemd[1]: Starting update-engine.service - Update Engine... Jan 27 23:57:29.957915 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 27 23:57:29.959914 systemd[1]: Started chronyd.service - NTP client/server. Jan 27 23:57:29.960644 extend-filesystems[1632]: Found /dev/vda9 Jan 27 23:57:29.964196 extend-filesystems[1632]: Checking size of /dev/vda9 Jan 27 23:57:29.967810 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 27 23:57:29.969389 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Jan 27 23:57:29.970904 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 27 23:57:29.971202 systemd[1]: motdgen.service: Deactivated successfully. Jan 27 23:57:29.971431 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 27 23:57:29.977784 jq[1648]: true Jan 27 23:57:29.975191 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 27 23:57:29.975386 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 27 23:57:29.983524 extend-filesystems[1632]: Resized partition /dev/vda9 Jan 27 23:57:29.991606 extend-filesystems[1676]: resize2fs 1.47.3 (8-Jul-2025) Jan 27 23:57:30.005750 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 27 23:57:30.005832 jq[1662]: true Jan 27 23:57:30.017260 tar[1659]: linux-arm64/LICENSE Jan 27 23:57:30.017260 tar[1659]: linux-arm64/helm Jan 27 23:57:30.021687 update_engine[1644]: I20260127 23:57:30.021313 1644 main.cc:92] Flatcar Update Engine starting Jan 27 23:57:30.041528 dbus-daemon[1627]: [system] SELinux support is enabled Jan 27 23:57:30.041803 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 27 23:57:30.045468 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 27 23:57:30.045499 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 27 23:57:30.047845 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 27 23:57:30.047872 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 27 23:57:30.048173 update_engine[1644]: I20260127 23:57:30.048125 1644 update_check_scheduler.cc:74] Next update check in 6m0s Jan 27 23:57:30.051789 systemd[1]: Started update-engine.service - Update Engine. Jan 27 23:57:30.055895 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 27 23:57:30.086910 systemd-logind[1642]: New seat seat0. Jan 27 23:57:30.115094 locksmithd[1692]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 27 23:57:30.117086 systemd-logind[1642]: Watching system buttons on /dev/input/event0 (Power Button) Jan 27 23:57:30.117103 systemd-logind[1642]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 27 23:57:30.117883 systemd[1]: Started systemd-logind.service - User Login Management. Jan 27 23:57:30.150941 bash[1696]: Updated "/home/core/.ssh/authorized_keys" Jan 27 23:57:30.153287 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 27 23:57:30.159777 systemd[1]: Starting sshkeys.service... Jan 27 23:57:30.169248 containerd[1664]: time="2026-01-27T23:57:30Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 27 23:57:30.170662 containerd[1664]: time="2026-01-27T23:57:30.170627640Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 27 23:57:30.183627 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Jan 27 23:57:30.188043 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 27 23:57:30.193951 containerd[1664]: time="2026-01-27T23:57:30.193709880Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.72µs" Jan 27 23:57:30.193951 containerd[1664]: time="2026-01-27T23:57:30.193767160Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 27 23:57:30.193951 containerd[1664]: time="2026-01-27T23:57:30.193807560Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 27 23:57:30.193951 containerd[1664]: time="2026-01-27T23:57:30.193819800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.193963040Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.193978000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194031680Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194041800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194311520Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194326960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194336520Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 23:57:30.194464 containerd[1664]: time="2026-01-27T23:57:30.194343720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.194629240Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.194660800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.194797760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.195014960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.195049240Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.195065080Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.195097880Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 27 23:57:30.195897 containerd[1664]: time="2026-01-27T23:57:30.195348560Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 27 23:57:30.196798 containerd[1664]: time="2026-01-27T23:57:30.196622000Z" level=info msg="metadata content store policy set" policy=shared Jan 27 23:57:30.207745 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:30.222618 containerd[1664]: time="2026-01-27T23:57:30.222408720Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 27 23:57:30.222618 containerd[1664]: time="2026-01-27T23:57:30.222476040Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 23:57:30.223219 containerd[1664]: time="2026-01-27T23:57:30.223169680Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 23:57:30.223219 containerd[1664]: time="2026-01-27T23:57:30.223204800Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 27 23:57:30.223369 containerd[1664]: time="2026-01-27T23:57:30.223343880Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 27 23:57:30.223393 containerd[1664]: time="2026-01-27T23:57:30.223378080Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 27 23:57:30.223411 containerd[1664]: time="2026-01-27T23:57:30.223392360Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 27 23:57:30.223411 containerd[1664]: time="2026-01-27T23:57:30.223402960Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 27 23:57:30.223447 containerd[1664]: time="2026-01-27T23:57:30.223414800Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 27 23:57:30.223447 containerd[1664]: time="2026-01-27T23:57:30.223427720Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 27 23:57:30.223447 containerd[1664]: time="2026-01-27T23:57:30.223440120Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 27 23:57:30.223501 containerd[1664]: time="2026-01-27T23:57:30.223451760Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 27 23:57:30.223501 containerd[1664]: time="2026-01-27T23:57:30.223461800Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 27 23:57:30.223501 containerd[1664]: time="2026-01-27T23:57:30.223475840Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 
27 23:57:30.223634 containerd[1664]: time="2026-01-27T23:57:30.223613000Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 27 23:57:30.223654 containerd[1664]: time="2026-01-27T23:57:30.223643800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 27 23:57:30.223675 containerd[1664]: time="2026-01-27T23:57:30.223659560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 27 23:57:30.223675 containerd[1664]: time="2026-01-27T23:57:30.223670160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 27 23:57:30.223713 containerd[1664]: time="2026-01-27T23:57:30.223681200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 27 23:57:30.223713 containerd[1664]: time="2026-01-27T23:57:30.223690760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 27 23:57:30.223713 containerd[1664]: time="2026-01-27T23:57:30.223702480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 27 23:57:30.223779 containerd[1664]: time="2026-01-27T23:57:30.223716320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 27 23:57:30.223779 containerd[1664]: time="2026-01-27T23:57:30.223750600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 27 23:57:30.223779 containerd[1664]: time="2026-01-27T23:57:30.223762680Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 27 23:57:30.223779 containerd[1664]: time="2026-01-27T23:57:30.223773600Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 27 23:57:30.223845 containerd[1664]: time="2026-01-27T23:57:30.223803040Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 27 23:57:30.223863 containerd[1664]: time="2026-01-27T23:57:30.223842520Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 27 23:57:30.223863 containerd[1664]: time="2026-01-27T23:57:30.223856480Z" level=info msg="Start snapshots syncer" Jan 27 23:57:30.223903 containerd[1664]: time="2026-01-27T23:57:30.223890880Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 27 23:57:30.224191 containerd[1664]: time="2026-01-27T23:57:30.224148000Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 27 23:57:30.224280 containerd[1664]: time="2026-01-27T23:57:30.224206440Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 27 23:57:30.224280 containerd[1664]: time="2026-01-27T23:57:30.224250680Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 27 23:57:30.224364 containerd[1664]: time="2026-01-27T23:57:30.224343120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 27 23:57:30.224387 containerd[1664]: time="2026-01-27T23:57:30.224375560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 27 23:57:30.224411 containerd[1664]: time="2026-01-27T23:57:30.224387320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 27 23:57:30.224411 containerd[1664]: time="2026-01-27T23:57:30.224398400Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 27 23:57:30.224411 containerd[1664]: time="2026-01-27T23:57:30.224409800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 27 23:57:30.224459 containerd[1664]: time="2026-01-27T23:57:30.224420480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 27 23:57:30.224459 containerd[1664]: time="2026-01-27T23:57:30.224432000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 27 23:57:30.224459 containerd[1664]: time="2026-01-27T23:57:30.224442480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 27 
23:57:30.224459 containerd[1664]: time="2026-01-27T23:57:30.224456720Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 27 23:57:30.224523 containerd[1664]: time="2026-01-27T23:57:30.224491160Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 23:57:30.224523 containerd[1664]: time="2026-01-27T23:57:30.224504040Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 23:57:30.224523 containerd[1664]: time="2026-01-27T23:57:30.224512880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 23:57:30.224571 containerd[1664]: time="2026-01-27T23:57:30.224521520Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 23:57:30.224571 containerd[1664]: time="2026-01-27T23:57:30.224529320Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 27 23:57:30.224571 containerd[1664]: time="2026-01-27T23:57:30.224538240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 27 23:57:30.224571 containerd[1664]: time="2026-01-27T23:57:30.224553920Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 27 23:57:30.224661 containerd[1664]: time="2026-01-27T23:57:30.224649520Z" level=info msg="runtime interface created" Jan 27 23:57:30.224661 containerd[1664]: time="2026-01-27T23:57:30.224659640Z" level=info msg="created NRI interface" Jan 27 23:57:30.224701 containerd[1664]: time="2026-01-27T23:57:30.224671200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 27 23:57:30.224701 containerd[1664]: time="2026-01-27T23:57:30.224682320Z" level=info msg="Connect containerd service" Jan 27 23:57:30.224745 containerd[1664]: time="2026-01-27T23:57:30.224702040Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 27 23:57:30.230153 containerd[1664]: time="2026-01-27T23:57:30.229900040Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 27 23:57:30.246975 sshd_keygen[1656]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 27 23:57:30.269089 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 27 23:57:30.273449 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 27 23:57:30.289784 systemd[1]: issuegen.service: Deactivated successfully. Jan 27 23:57:30.290852 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 27 23:57:30.293842 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 27 23:57:30.302771 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 27 23:57:30.315793 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 27 23:57:30.320905 systemd[1]: Started getty@tty1.service - Getty on tty1. 
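The `failed to load cni during init` error above is expected at this stage of boot: nothing has written a network config into /etc/cni/net.d yet, and the CRI plugin's conf syncer keeps retrying until a CNI add-on (flannel, Calico, etc.) drops one there. A minimal sketch of the check behind that message, assuming only the confDir quoted in the config dump above and the usual CNI config file extensions; the helper is illustrative, not containerd's code:

```python
#!/usr/bin/env python3
"""Illustrative check for the "no network config found in /etc/cni/net.d"
error above: scan the CNI conf directory for config files and pick the
first one by name, as the default loader does."""
import pathlib
import sys

CNI_CONF_DIR = pathlib.Path("/etc/cni/net.d")     # confDir from the cri config dump above
CNI_EXTENSIONS = {".conf", ".conflist", ".json"}  # assumed: the usual CNI config extensions


def find_cni_configs(conf_dir: pathlib.Path = CNI_CONF_DIR) -> list[pathlib.Path]:
    """Return candidate CNI config files, sorted by file name."""
    if not conf_dir.is_dir():
        return []
    return sorted(p for p in conf_dir.iterdir() if p.suffix in CNI_EXTENSIONS)


if __name__ == "__main__":
    configs = find_cni_configs()
    if not configs:
        # The situation in the log: the error clears once a network add-on installs its config.
        sys.exit("no network config found in /etc/cni/net.d")
    print("would load:", configs[0])
```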
Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328375440Z" level=info msg="Start subscribing containerd event" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328451520Z" level=info msg="Start recovering state" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328535800Z" level=info msg="Start event monitor" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328547120Z" level=info msg="Start cni network conf syncer for default" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328556520Z" level=info msg="Start streaming server" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328567120Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328574120Z" level=info msg="runtime interface starting up..." Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328580560Z" level=info msg="starting plugins..." Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.328593320Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.329482080Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.329537920Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 27 23:57:30.333292 containerd[1664]: time="2026-01-27T23:57:30.329591280Z" level=info msg="containerd successfully booted in 0.160795s" Jan 27 23:57:30.325863 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 27 23:57:30.329236 systemd[1]: Reached target getty.target - Login Prompts. Jan 27 23:57:30.330607 systemd[1]: Started containerd.service - containerd container runtime. Jan 27 23:57:30.336771 extend-filesystems[1676]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 27 23:57:30.336771 extend-filesystems[1676]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 27 23:57:30.336771 extend-filesystems[1676]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 27 23:57:30.341391 extend-filesystems[1632]: Resized filesystem in /dev/vda9 Jan 27 23:57:30.338201 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 27 23:57:30.338440 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 27 23:57:30.441807 tar[1659]: linux-arm64/README.md Jan 27 23:57:30.470324 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 27 23:57:30.937769 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:31.063955 systemd-networkd[1575]: eth0: Gained IPv6LL Jan 27 23:57:31.066716 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 27 23:57:31.068486 systemd[1]: Reached target network-online.target - Network is Online. Jan 27 23:57:31.070941 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:57:31.073063 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 27 23:57:31.113979 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 27 23:57:31.220767 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:31.889038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
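For scale, the on-line resize reported by extend-filesystems above comes out to roughly 44 GiB; the block count and 4 KiB block size are taken from the log, the rest is plain arithmetic:

```python
# /dev/vda9 after the resize logged above: 11516923 blocks of 4 KiB.
blocks = 11_516_923
block_size = 4096                              # bytes, the "(4k)" in the log
total_bytes = blocks * block_size
print(total_bytes)                             # 47173316608
print(round(total_bytes / 2**30, 1), "GiB")    # ~43.9
```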
Jan 27 23:57:31.911339 (kubelet)[1769]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 23:57:32.376853 kubelet[1769]: E0127 23:57:32.376751 1769 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 23:57:32.380259 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 23:57:32.380403 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 23:57:32.380880 systemd[1]: kubelet.service: Consumed 717ms CPU time, 251.5M memory peak. Jan 27 23:57:32.950803 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:33.228804 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:35.299954 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 27 23:57:35.302290 systemd[1]: Started sshd@0-10.0.6.5:22-4.153.228.146:40834.service - OpenSSH per-connection server daemon (4.153.228.146:40834). Jan 27 23:57:35.833455 sshd[1780]: Accepted publickey for core from 4.153.228.146 port 40834 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:57:35.835960 sshd-session[1780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:35.846286 systemd-logind[1642]: New session 1 of user core. Jan 27 23:57:35.847898 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 27 23:57:35.849712 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 27 23:57:35.881829 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 27 23:57:35.884393 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 27 23:57:35.902053 (systemd)[1790]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:35.904357 systemd-logind[1642]: New session 2 of user core. Jan 27 23:57:36.018683 systemd[1790]: Queued start job for default target default.target. Jan 27 23:57:36.033079 systemd[1790]: Created slice app.slice - User Application Slice. Jan 27 23:57:36.033117 systemd[1790]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 27 23:57:36.033130 systemd[1790]: Reached target paths.target - Paths. Jan 27 23:57:36.033183 systemd[1790]: Reached target timers.target - Timers. Jan 27 23:57:36.034423 systemd[1790]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 27 23:57:36.035251 systemd[1790]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 27 23:57:36.044493 systemd[1790]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 27 23:57:36.044558 systemd[1790]: Reached target sockets.target - Sockets. Jan 27 23:57:36.044986 systemd[1790]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 27 23:57:36.045145 systemd[1790]: Reached target basic.target - Basic System. Jan 27 23:57:36.045204 systemd[1790]: Reached target default.target - Main User Target. Jan 27 23:57:36.045230 systemd[1790]: Startup finished in 135ms. Jan 27 23:57:36.045509 systemd[1]: Started user@500.service - User Manager for UID 500. 
Jan 27 23:57:36.047099 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 27 23:57:36.349258 systemd[1]: Started sshd@1-10.0.6.5:22-4.153.228.146:49566.service - OpenSSH per-connection server daemon (4.153.228.146:49566). Jan 27 23:57:36.866691 sshd[1804]: Accepted publickey for core from 4.153.228.146 port 49566 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:57:36.867980 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:36.871707 systemd-logind[1642]: New session 3 of user core. Jan 27 23:57:36.881204 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 27 23:57:36.960773 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:36.966881 coreos-metadata[1626]: Jan 27 23:57:36.966 WARN failed to locate config-drive, using the metadata service API instead Jan 27 23:57:36.982284 coreos-metadata[1626]: Jan 27 23:57:36.982 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 27 23:57:37.155210 sshd[1808]: Connection closed by 4.153.228.146 port 49566 Jan 27 23:57:37.154866 sshd-session[1804]: pam_unix(sshd:session): session closed for user core Jan 27 23:57:37.159022 systemd[1]: sshd@1-10.0.6.5:22-4.153.228.146:49566.service: Deactivated successfully. Jan 27 23:57:37.162372 systemd[1]: session-3.scope: Deactivated successfully. Jan 27 23:57:37.163167 systemd-logind[1642]: Session 3 logged out. Waiting for processes to exit. Jan 27 23:57:37.164027 systemd-logind[1642]: Removed session 3. Jan 27 23:57:37.238793 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 23:57:37.244863 coreos-metadata[1712]: Jan 27 23:57:37.244 WARN failed to locate config-drive, using the metadata service API instead Jan 27 23:57:37.258246 coreos-metadata[1712]: Jan 27 23:57:37.258 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 27 23:57:37.262201 systemd[1]: Started sshd@2-10.0.6.5:22-4.153.228.146:49582.service - OpenSSH per-connection server daemon (4.153.228.146:49582). Jan 27 23:57:37.769755 sshd[1818]: Accepted publickey for core from 4.153.228.146 port 49582 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:57:37.771308 sshd-session[1818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:37.776032 systemd-logind[1642]: New session 4 of user core. Jan 27 23:57:37.791246 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 27 23:57:38.057929 sshd[1822]: Connection closed by 4.153.228.146 port 49582 Jan 27 23:57:38.057846 sshd-session[1818]: pam_unix(sshd:session): session closed for user core Jan 27 23:57:38.062386 systemd[1]: sshd@2-10.0.6.5:22-4.153.228.146:49582.service: Deactivated successfully. Jan 27 23:57:38.063976 systemd[1]: session-4.scope: Deactivated successfully. Jan 27 23:57:38.066693 systemd-logind[1642]: Session 4 logged out. Waiting for processes to exit. Jan 27 23:57:38.067686 systemd-logind[1642]: Removed session 4. 
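The `failed to locate config-drive` warnings above show coreos-metadata giving up on the config-2 disk (hence the repeated `Can't lookup blockdev` kernel messages) and falling back to the link-local metadata service. A rough sketch of that fallback fetch, using only URLs that appear in this log; the retry helper and back-off are illustrative, not coreos-metadata's actual logic:

```python
#!/usr/bin/env python3
"""Fetch instance metadata over HTTP from the link-local endpoint, with a
few attempts, mirroring the "Fetching ... Attempt #N" lines above."""
import time
import urllib.request

METADATA_BASE = "http://169.254.169.254"


def fetch(path: str, attempts: int = 3, timeout_s: float = 5.0) -> str:
    """GET a metadata path, retrying with a simple exponential back-off."""
    url = METADATA_BASE + path
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as resp:
                return resp.read().decode()
        except OSError:
            if attempt == attempts:
                raise
            time.sleep(2 ** attempt)
    raise RuntimeError("unreachable")


if __name__ == "__main__":
    # Endpoint path taken from the log; only reachable from inside a cloud instance.
    print(fetch("/latest/meta-data/hostname"))
```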
Jan 27 23:57:39.632824 coreos-metadata[1626]: Jan 27 23:57:39.632 INFO Fetch successful Jan 27 23:57:39.633184 coreos-metadata[1626]: Jan 27 23:57:39.633 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 27 23:57:39.634450 coreos-metadata[1712]: Jan 27 23:57:39.634 INFO Fetch successful Jan 27 23:57:39.634450 coreos-metadata[1712]: Jan 27 23:57:39.634 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 27 23:57:42.116599 coreos-metadata[1626]: Jan 27 23:57:42.116 INFO Fetch successful Jan 27 23:57:42.116599 coreos-metadata[1626]: Jan 27 23:57:42.116 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 27 23:57:42.631259 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 27 23:57:42.632992 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:57:42.770035 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:57:42.774158 (kubelet)[1835]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 23:57:42.807976 kubelet[1835]: E0127 23:57:42.807792 1835 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 23:57:42.811161 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 23:57:42.811294 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 23:57:42.811694 systemd[1]: kubelet.service: Consumed 147ms CPU time, 109.3M memory peak. Jan 27 23:57:43.384632 coreos-metadata[1712]: Jan 27 23:57:43.384 INFO Fetch successful Jan 27 23:57:43.388089 unknown[1712]: wrote ssh authorized keys file for user: core Jan 27 23:57:43.425081 update-ssh-keys[1844]: Updated "/home/core/.ssh/authorized_keys" Jan 27 23:57:43.426298 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 27 23:57:43.428616 systemd[1]: Finished sshkeys.service. Jan 27 23:57:43.463158 coreos-metadata[1626]: Jan 27 23:57:43.463 INFO Fetch successful Jan 27 23:57:43.463158 coreos-metadata[1626]: Jan 27 23:57:43.463 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 27 23:57:44.723981 coreos-metadata[1626]: Jan 27 23:57:44.723 INFO Fetch successful Jan 27 23:57:44.723981 coreos-metadata[1626]: Jan 27 23:57:44.723 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 27 23:57:47.190867 coreos-metadata[1626]: Jan 27 23:57:47.190 INFO Fetch successful Jan 27 23:57:47.190867 coreos-metadata[1626]: Jan 27 23:57:47.190 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 27 23:57:48.167119 systemd[1]: Started sshd@3-10.0.6.5:22-4.153.228.146:40070.service - OpenSSH per-connection server daemon (4.153.228.146:40070). Jan 27 23:57:48.685706 sshd[1848]: Accepted publickey for core from 4.153.228.146 port 40070 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:57:48.687055 sshd-session[1848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:48.691183 systemd-logind[1642]: New session 5 of user core. 
Jan 27 23:57:48.706996 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 27 23:57:48.973381 sshd[1852]: Connection closed by 4.153.228.146 port 40070 Jan 27 23:57:48.973859 sshd-session[1848]: pam_unix(sshd:session): session closed for user core Jan 27 23:57:48.978087 systemd[1]: sshd@3-10.0.6.5:22-4.153.228.146:40070.service: Deactivated successfully. Jan 27 23:57:48.979646 systemd[1]: session-5.scope: Deactivated successfully. Jan 27 23:57:48.980348 systemd-logind[1642]: Session 5 logged out. Waiting for processes to exit. Jan 27 23:57:48.981295 systemd-logind[1642]: Removed session 5. Jan 27 23:57:49.083216 systemd[1]: Started sshd@4-10.0.6.5:22-4.153.228.146:40084.service - OpenSSH per-connection server daemon (4.153.228.146:40084). Jan 27 23:57:49.591737 sshd[1858]: Accepted publickey for core from 4.153.228.146 port 40084 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:57:49.593128 sshd-session[1858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:57:49.597994 systemd-logind[1642]: New session 6 of user core. Jan 27 23:57:49.604071 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 27 23:57:49.879936 sshd[1862]: Connection closed by 4.153.228.146 port 40084 Jan 27 23:57:49.879966 sshd-session[1858]: pam_unix(sshd:session): session closed for user core Jan 27 23:57:49.883929 systemd[1]: sshd@4-10.0.6.5:22-4.153.228.146:40084.service: Deactivated successfully. Jan 27 23:57:49.885535 systemd[1]: session-6.scope: Deactivated successfully. Jan 27 23:57:49.887471 systemd-logind[1642]: Session 6 logged out. Waiting for processes to exit. Jan 27 23:57:49.888344 systemd-logind[1642]: Removed session 6. Jan 27 23:57:50.171620 coreos-metadata[1626]: Jan 27 23:57:50.171 INFO Fetch successful Jan 27 23:57:50.204836 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 27 23:57:50.205664 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 27 23:57:50.205834 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 27 23:57:50.209898 systemd[1]: Startup finished in 2.432s (kernel) + 15.201s (initrd) + 22.920s (userspace) = 40.554s. Jan 27 23:57:53.062166 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 27 23:57:53.063673 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:57:53.196915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:57:53.201118 (kubelet)[1879]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 23:57:53.233800 kubelet[1879]: E0127 23:57:53.233745 1879 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 23:57:53.236578 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 23:57:53.236712 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 23:57:53.237358 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.1M memory peak. 
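The kubelet crash loop above (restart counter 1, 2, ...) is the normal state of a node that has not been joined to a cluster yet: /var/lib/kubelet/config.yaml is normally written by `kubeadm init` or `kubeadm join`, and until it exists systemd re-launches the unit, here about every ten seconds. A small polling helper an operator script might use while waiting for that file; it is ours, not part of kubelet or kubeadm:

```python
#!/usr/bin/env python3
"""Wait for /var/lib/kubelet/config.yaml to appear, i.e. for the condition
behind the repeated "failed to load kubelet config file" errors above to clear."""
import pathlib
import time

KUBELET_CONFIG = pathlib.Path("/var/lib/kubelet/config.yaml")  # path from the log


def wait_for_kubelet_config(timeout_s: float = 300.0, poll_s: float = 10.0) -> bool:
    """Poll until the kubelet config file exists or the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if KUBELET_CONFIG.is_file():
            return True
        time.sleep(poll_s)  # roughly the cadence of the restart counter in this log
    return False


if __name__ == "__main__":
    if wait_for_kubelet_config():
        print("kubelet config present; the unit should stay up on its next restart")
    else:
        print("still missing; run kubeadm init/join (or provision the file) first")
```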
Jan 27 23:57:53.737638 chronyd[1624]: Selected source PHC0 Jan 27 23:57:59.930996 systemd[1]: Started sshd@5-10.0.6.5:22-4.153.228.146:44136.service - OpenSSH per-connection server daemon (4.153.228.146:44136). Jan 27 23:58:00.439069 sshd[1890]: Accepted publickey for core from 4.153.228.146 port 44136 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:00.440409 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:00.444593 systemd-logind[1642]: New session 7 of user core. Jan 27 23:58:00.459117 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 27 23:58:00.727956 sshd[1894]: Connection closed by 4.153.228.146 port 44136 Jan 27 23:58:00.727573 sshd-session[1890]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:00.732143 systemd[1]: sshd@5-10.0.6.5:22-4.153.228.146:44136.service: Deactivated successfully. Jan 27 23:58:00.733898 systemd[1]: session-7.scope: Deactivated successfully. Jan 27 23:58:00.735314 systemd-logind[1642]: Session 7 logged out. Waiting for processes to exit. Jan 27 23:58:00.736200 systemd-logind[1642]: Removed session 7. Jan 27 23:58:00.843230 systemd[1]: Started sshd@6-10.0.6.5:22-4.153.228.146:44152.service - OpenSSH per-connection server daemon (4.153.228.146:44152). Jan 27 23:58:01.373914 sshd[1900]: Accepted publickey for core from 4.153.228.146 port 44152 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:01.375257 sshd-session[1900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:01.378930 systemd-logind[1642]: New session 8 of user core. Jan 27 23:58:01.390893 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 27 23:58:01.657651 sshd[1904]: Connection closed by 4.153.228.146 port 44152 Jan 27 23:58:01.658126 sshd-session[1900]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:01.663224 systemd[1]: sshd@6-10.0.6.5:22-4.153.228.146:44152.service: Deactivated successfully. Jan 27 23:58:01.664912 systemd[1]: session-8.scope: Deactivated successfully. Jan 27 23:58:01.665637 systemd-logind[1642]: Session 8 logged out. Waiting for processes to exit. Jan 27 23:58:01.667966 systemd-logind[1642]: Removed session 8. Jan 27 23:58:01.767053 systemd[1]: Started sshd@7-10.0.6.5:22-4.153.228.146:44168.service - OpenSSH per-connection server daemon (4.153.228.146:44168). Jan 27 23:58:02.277775 sshd[1910]: Accepted publickey for core from 4.153.228.146 port 44168 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:02.279101 sshd-session[1910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:02.283196 systemd-logind[1642]: New session 9 of user core. Jan 27 23:58:02.300913 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 27 23:58:02.566172 sshd[1914]: Connection closed by 4.153.228.146 port 44168 Jan 27 23:58:02.566524 sshd-session[1910]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:02.569697 systemd[1]: sshd@7-10.0.6.5:22-4.153.228.146:44168.service: Deactivated successfully. Jan 27 23:58:02.571325 systemd[1]: session-9.scope: Deactivated successfully. Jan 27 23:58:02.572583 systemd-logind[1642]: Session 9 logged out. Waiting for processes to exit. Jan 27 23:58:02.576009 systemd-logind[1642]: Removed session 9. Jan 27 23:58:02.684447 systemd[1]: Started sshd@8-10.0.6.5:22-4.153.228.146:44178.service - OpenSSH per-connection server daemon (4.153.228.146:44178). 
Jan 27 23:58:03.191697 sshd[1920]: Accepted publickey for core from 4.153.228.146 port 44178 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:03.193071 sshd-session[1920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:03.197653 systemd-logind[1642]: New session 10 of user core. Jan 27 23:58:03.207019 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 27 23:58:03.400119 sudo[1925]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 27 23:58:03.400383 sudo[1925]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 23:58:03.423957 sudo[1925]: pam_unix(sudo:session): session closed for user root Jan 27 23:58:03.487644 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 27 23:58:03.489401 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:58:03.520451 sshd[1924]: Connection closed by 4.153.228.146 port 44178 Jan 27 23:58:03.520895 sshd-session[1920]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:03.525045 systemd[1]: sshd@8-10.0.6.5:22-4.153.228.146:44178.service: Deactivated successfully. Jan 27 23:58:03.526816 systemd[1]: session-10.scope: Deactivated successfully. Jan 27 23:58:03.527551 systemd-logind[1642]: Session 10 logged out. Waiting for processes to exit. Jan 27 23:58:03.528996 systemd-logind[1642]: Removed session 10. Jan 27 23:58:03.635018 systemd[1]: Started sshd@9-10.0.6.5:22-4.153.228.146:44182.service - OpenSSH per-connection server daemon (4.153.228.146:44182). Jan 27 23:58:03.667197 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:03.670924 (kubelet)[1943]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 23:58:03.708434 kubelet[1943]: E0127 23:58:03.708379 1943 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 23:58:03.711063 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 23:58:03.711197 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 23:58:03.712811 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.3M memory peak. Jan 27 23:58:04.145614 sshd[1935]: Accepted publickey for core from 4.153.228.146 port 44182 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:04.146550 sshd-session[1935]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:04.150696 systemd-logind[1642]: New session 11 of user core. Jan 27 23:58:04.157888 systemd[1]: Started session-11.scope - Session 11 of User core. 
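The audit records that follow encode each audited command line as a hex PROCTITLE string with NUL bytes between arguments. A short decoder; the sample value is copied verbatim from the auditctl record below:

```python
#!/usr/bin/env python3
"""Decode the hex-encoded PROCTITLE fields in the audit records below."""


def decode_proctitle(hex_value: str) -> list[str]:
    """Turn an audit PROCTITLE hex string into its argv list."""
    raw = bytes.fromhex(hex_value)
    return [arg.decode("utf-8", "replace") for arg in raw.split(b"\x00") if arg]


if __name__ == "__main__":
    sample = ("2F7362696E2F617564697463746C002D52"
              "002F6574632F61756469742F61756469742E72756C6573")
    print(decode_proctitle(sample))
    # -> ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']
```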
Jan 27 23:58:04.340445 sudo[1954]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 27 23:58:04.340760 sudo[1954]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 23:58:04.344163 sudo[1954]: pam_unix(sudo:session): session closed for user root Jan 27 23:58:04.350056 sudo[1953]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 27 23:58:04.350313 sudo[1953]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 23:58:04.357264 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 23:58:04.392804 kernel: kauditd_printk_skb: 192 callbacks suppressed Jan 27 23:58:04.392993 kernel: audit: type=1305 audit(1769558284.388:236): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 23:58:04.388000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 23:58:04.393105 augenrules[1978]: No rules Jan 27 23:58:04.393131 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 23:58:04.393351 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 23:58:04.388000 audit[1978]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc4e02ad0 a2=420 a3=0 items=0 ppid=1959 pid=1978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:04.397407 kernel: audit: type=1300 audit(1769558284.388:236): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc4e02ad0 a2=420 a3=0 items=0 ppid=1959 pid=1978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:04.388000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 23:58:04.399738 kernel: audit: type=1327 audit(1769558284.388:236): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 23:58:04.399899 kernel: audit: type=1130 audit(1769558284.391:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.398970 sudo[1953]: pam_unix(sudo:session): session closed for user root Jan 27 23:58:04.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.405352 kernel: audit: type=1131 audit(1769558284.391:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:58:04.405403 kernel: audit: type=1106 audit(1769558284.397:239): pid=1953 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.397000 audit[1953]: USER_END pid=1953 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.397000 audit[1953]: CRED_DISP pid=1953 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.410930 kernel: audit: type=1104 audit(1769558284.397:240): pid=1953 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.496070 sshd[1952]: Connection closed by 4.153.228.146 port 44182 Jan 27 23:58:04.496350 sshd-session[1935]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:04.495000 audit[1935]: USER_END pid=1935 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:04.502487 systemd[1]: sshd@9-10.0.6.5:22-4.153.228.146:44182.service: Deactivated successfully. Jan 27 23:58:04.504058 systemd[1]: session-11.scope: Deactivated successfully. Jan 27 23:58:04.496000 audit[1935]: CRED_DISP pid=1935 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:04.506614 systemd-logind[1642]: Session 11 logged out. Waiting for processes to exit. Jan 27 23:58:04.507443 systemd-logind[1642]: Removed session 11. Jan 27 23:58:04.507816 kernel: audit: type=1106 audit(1769558284.495:241): pid=1935 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:04.507856 kernel: audit: type=1104 audit(1769558284.496:242): pid=1935 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:04.507871 kernel: audit: type=1131 audit(1769558284.501:243): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.6.5:22-4.153.228.146:44182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:04.501000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.6.5:22-4.153.228.146:44182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 23:58:04.606021 systemd[1]: Started sshd@10-10.0.6.5:22-4.153.228.146:49934.service - OpenSSH per-connection server daemon (4.153.228.146:49934). Jan 27 23:58:04.604000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.6.5:22-4.153.228.146:49934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:05.112000 audit[1987]: USER_ACCT pid=1987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:05.114210 sshd[1987]: Accepted publickey for core from 4.153.228.146 port 49934 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 27 23:58:05.113000 audit[1987]: CRED_ACQ pid=1987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:05.113000 audit[1987]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd94b8320 a2=3 a3=0 items=0 ppid=1 pid=1987 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:05.113000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 23:58:05.115919 sshd-session[1987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 23:58:05.119908 systemd-logind[1642]: New session 12 of user core. Jan 27 23:58:05.135977 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 27 23:58:05.137000 audit[1987]: USER_START pid=1987 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:05.138000 audit[1991]: CRED_ACQ pid=1991 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:05.307000 audit[1992]: USER_ACCT pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:05.309310 sudo[1992]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 27 23:58:05.307000 audit[1992]: CRED_REFR pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:05.307000 audit[1992]: USER_START pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 27 23:58:05.309582 sudo[1992]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 23:58:05.635293 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 27 23:58:05.654371 (dockerd)[2014]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 27 23:58:05.889597 dockerd[2014]: time="2026-01-27T23:58:05.889457493Z" level=info msg="Starting up" Jan 27 23:58:05.890935 dockerd[2014]: time="2026-01-27T23:58:05.890896777Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 27 23:58:05.903600 dockerd[2014]: time="2026-01-27T23:58:05.903535416Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 27 23:58:05.948771 dockerd[2014]: time="2026-01-27T23:58:05.948703755Z" level=info msg="Loading containers: start." Jan 27 23:58:05.958779 kernel: Initializing XFRM netlink socket Jan 27 23:58:06.004000 audit[2066]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.004000 audit[2066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffd2a7ed60 a2=0 a3=0 items=0 ppid=2014 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.004000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 23:58:06.006000 audit[2068]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.006000 audit[2068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffd806d030 a2=0 a3=0 items=0 ppid=2014 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.006000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 23:58:06.008000 audit[2070]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.008000 audit[2070]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcba97290 a2=0 a3=0 items=0 ppid=2014 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.008000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 23:58:06.010000 audit[2072]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.010000 audit[2072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe51535f0 a2=0 a3=0 items=0 ppid=2014 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.010000 
audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 23:58:06.012000 audit[2074]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.012000 audit[2074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd4417760 a2=0 a3=0 items=0 ppid=2014 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.012000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 23:58:06.014000 audit[2076]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.014000 audit[2076]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdc4969b0 a2=0 a3=0 items=0 ppid=2014 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.014000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 23:58:06.016000 audit[2078]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.016000 audit[2078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffce2475a0 a2=0 a3=0 items=0 ppid=2014 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.016000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 23:58:06.018000 audit[2080]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.018000 audit[2080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffefe27f10 a2=0 a3=0 items=0 ppid=2014 pid=2080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.018000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 23:58:06.050000 audit[2083]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.050000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=fffffd3b6310 a2=0 a3=0 items=0 ppid=2014 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.050000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 27 23:58:06.051000 audit[2085]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2085 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.051000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc3ccaaa0 a2=0 a3=0 items=0 ppid=2014 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.051000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 23:58:06.053000 audit[2087]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.053000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd2adb520 a2=0 a3=0 items=0 ppid=2014 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.053000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 23:58:06.055000 audit[2089]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.055000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffc861aae0 a2=0 a3=0 items=0 ppid=2014 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.055000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 23:58:06.056000 audit[2091]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.056000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffff0b6a7f0 a2=0 a3=0 items=0 ppid=2014 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.056000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 23:58:06.091000 audit[2121]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.091000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffc1c5d260 a2=0 a3=0 items=0 ppid=2014 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.091000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 23:58:06.093000 audit[2123]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.093000 audit[2123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffdf1ad000 a2=0 a3=0 items=0 ppid=2014 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.093000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 23:58:06.095000 audit[2125]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.095000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffce2a99a0 a2=0 a3=0 items=0 ppid=2014 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.095000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 23:58:06.096000 audit[2127]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.096000 audit[2127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe66ddf00 a2=0 a3=0 items=0 ppid=2014 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.096000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 23:58:06.098000 audit[2129]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.098000 audit[2129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd120d3d0 a2=0 a3=0 items=0 ppid=2014 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.098000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 23:58:06.100000 audit[2131]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.100000 audit[2131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffde538c90 a2=0 a3=0 items=0 ppid=2014 pid=2131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.100000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 23:58:06.102000 audit[2133]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2133 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.102000 audit[2133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe16012b0 a2=0 a3=0 items=0 ppid=2014 pid=2133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.102000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 23:58:06.104000 audit[2135]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.104000 audit[2135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffec31aea0 a2=0 a3=0 items=0 ppid=2014 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.104000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 23:58:06.106000 audit[2137]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2137 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.106000 audit[2137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffd7ac0d10 a2=0 a3=0 items=0 ppid=2014 pid=2137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.106000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 27 23:58:06.108000 audit[2139]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.108000 audit[2139]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe186e070 a2=0 a3=0 items=0 ppid=2014 pid=2139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.108000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 23:58:06.110000 audit[2141]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2141 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.110000 audit[2141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd746fbe0 a2=0 a3=0 items=0 ppid=2014 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.110000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 23:58:06.112000 audit[2143]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2143 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.112000 audit[2143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd6d64b60 a2=0 a3=0 items=0 ppid=2014 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.112000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 23:58:06.113000 audit[2145]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.113000 audit[2145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc1eea0e0 a2=0 a3=0 items=0 ppid=2014 pid=2145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.113000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 23:58:06.118000 audit[2150]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.118000 audit[2150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd4635490 a2=0 a3=0 items=0 ppid=2014 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.118000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 23:58:06.120000 audit[2152]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.120000 audit[2152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffcdf8d710 a2=0 a3=0 items=0 ppid=2014 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.120000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 23:58:06.122000 audit[2154]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.122000 audit[2154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffda39f5d0 a2=0 a3=0 items=0 ppid=2014 pid=2154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.122000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 23:58:06.124000 audit[2156]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.124000 audit[2156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe9553bf0 a2=0 a3=0 items=0 ppid=2014 pid=2156 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.124000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 23:58:06.126000 audit[2158]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2158 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.126000 audit[2158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffc96c6960 a2=0 a3=0 items=0 ppid=2014 pid=2158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.126000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 23:58:06.128000 audit[2160]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:06.128000 audit[2160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffefea3000 a2=0 a3=0 items=0 ppid=2014 pid=2160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.128000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 23:58:06.149000 audit[2165]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.149000 audit[2165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffd30f25d0 a2=0 a3=0 items=0 ppid=2014 pid=2165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.149000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 27 23:58:06.151000 audit[2167]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.151000 audit[2167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffed95bee0 a2=0 a3=0 items=0 ppid=2014 pid=2167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.151000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 27 23:58:06.160000 audit[2175]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2175 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.160000 audit[2175]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffd15c7060 a2=0 a3=0 items=0 ppid=2014 pid=2175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.160000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 27 23:58:06.169000 audit[2181]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2181 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.169000 audit[2181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffe0a787b0 a2=0 a3=0 items=0 ppid=2014 pid=2181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.169000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 27 23:58:06.172000 audit[2183]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2183 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.172000 audit[2183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffeb8a4710 a2=0 a3=0 items=0 ppid=2014 pid=2183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.172000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 27 23:58:06.174000 audit[2185]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2185 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.174000 audit[2185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffca5a8d30 a2=0 a3=0 items=0 ppid=2014 pid=2185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.174000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 27 23:58:06.176000 audit[2187]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2187 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.176000 audit[2187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffd1ddf300 a2=0 a3=0 items=0 ppid=2014 pid=2187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.176000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 23:58:06.177000 audit[2189]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2189 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:06.177000 audit[2189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff249fd20 a2=0 a3=0 items=0 ppid=2014 pid=2189 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:06.177000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 27 23:58:06.180246 systemd-networkd[1575]: docker0: Link UP Jan 27 23:58:06.184397 dockerd[2014]: time="2026-01-27T23:58:06.184338880Z" level=info msg="Loading containers: done." Jan 27 23:58:06.197443 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2697601751-merged.mount: Deactivated successfully. Jan 27 23:58:06.206482 dockerd[2014]: time="2026-01-27T23:58:06.206392908Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 27 23:58:06.206644 dockerd[2014]: time="2026-01-27T23:58:06.206494028Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 27 23:58:06.206691 dockerd[2014]: time="2026-01-27T23:58:06.206665029Z" level=info msg="Initializing buildkit" Jan 27 23:58:06.233566 dockerd[2014]: time="2026-01-27T23:58:06.233226110Z" level=info msg="Completed buildkit initialization" Jan 27 23:58:06.237946 dockerd[2014]: time="2026-01-27T23:58:06.237914725Z" level=info msg="Daemon has completed initialization" Jan 27 23:58:06.238565 dockerd[2014]: time="2026-01-27T23:58:06.238104725Z" level=info msg="API listen on /run/docker.sock" Jan 27 23:58:06.238388 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 27 23:58:06.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:07.406073 containerd[1664]: time="2026-01-27T23:58:07.406034638Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 27 23:58:08.157470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3747991464.mount: Deactivated successfully. 
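
The NETFILTER_CFG/SYSCALL/PROCTITLE triples above are kernel audit records for the iptables and ip6tables calls dockerd makes while wiring up its DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-ISOLATION-STAGE-1/2, DOCKER-CT and DOCKER-USER chains. The PROCTITLE field is the command line, hex-encoded with NUL-separated argv entries, so it can be decoded offline. A minimal decoding sketch in Python (the function name is illustrative; the hex constant is copied from the PROCTITLE that follows the audit[2085] SYSCALL record above):

    import binascii

    def decode_proctitle(hex_str: str) -> str:
        # auditd records argv as NUL-separated bytes and prints them hex-encoded
        raw = binascii.unhexlify(hex_str)
        return " ".join(p.decode("utf-8", "replace") for p in raw.split(b"\x00") if p)

    print(decode_proctitle(
        "2F7573722F62696E2F69707461626C6573002D2D77616974"
        "002D4900464F5257415244002D6A00444F434B45522D464F5257415244"
    ))
    # -> /usr/bin/iptables --wait -I FORWARD -j DOCKER-FORWARD
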
Jan 27 23:58:08.750577 containerd[1664]: time="2026-01-27T23:58:08.750510375Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:08.752440 containerd[1664]: time="2026-01-27T23:58:08.752382861Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=23753476" Jan 27 23:58:08.753643 containerd[1664]: time="2026-01-27T23:58:08.753604184Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:08.756957 containerd[1664]: time="2026-01-27T23:58:08.756927075Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:08.758057 containerd[1664]: time="2026-01-27T23:58:08.757822877Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.351750198s" Jan 27 23:58:08.758057 containerd[1664]: time="2026-01-27T23:58:08.757853157Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Jan 27 23:58:08.758379 containerd[1664]: time="2026-01-27T23:58:08.758352879Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 27 23:58:09.945154 containerd[1664]: time="2026-01-27T23:58:09.945079685Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:09.946995 containerd[1664]: time="2026-01-27T23:58:09.946968131Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19127323" Jan 27 23:58:09.948184 containerd[1664]: time="2026-01-27T23:58:09.948123615Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:09.951060 containerd[1664]: time="2026-01-27T23:58:09.951010703Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:09.952125 containerd[1664]: time="2026-01-27T23:58:09.952100547Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.193718668s" Jan 27 23:58:09.952174 containerd[1664]: time="2026-01-27T23:58:09.952130547Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Jan 27 23:58:09.952844 
containerd[1664]: time="2026-01-27T23:58:09.952505908Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 27 23:58:10.993687 containerd[1664]: time="2026-01-27T23:58:10.993643885Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:10.994693 containerd[1664]: time="2026-01-27T23:58:10.994649208Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14183580" Jan 27 23:58:10.995489 containerd[1664]: time="2026-01-27T23:58:10.995440450Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:10.998218 containerd[1664]: time="2026-01-27T23:58:10.998181339Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:10.999780 containerd[1664]: time="2026-01-27T23:58:10.999753344Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.047215516s" Jan 27 23:58:10.999818 containerd[1664]: time="2026-01-27T23:58:10.999784544Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Jan 27 23:58:11.000290 containerd[1664]: time="2026-01-27T23:58:11.000263385Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 27 23:58:11.982528 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2482497447.mount: Deactivated successfully. 
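
Each completed pull above ends in a containerd "Pulled image" event carrying the repo tag, image id, repo digest, reported size in bytes and wall-clock duration. A small parser sketch, assuming the journal text is available with the quotes escaped exactly as printed above (the regex only targets that escaped form; names are illustrative):

    import re

    # Matches: Pulled image \"registry.k8s.io/...:vX.Y.Z\" ... size \"15776215\" in 1.047215516s
    PULLED = re.compile(
        r'Pulled image \\"(?P<image>[^"\\]+)\\".*?'
        r'size \\"(?P<size>\d+)\\" in (?P<duration>[\d.]+(?:ms|s))'
    )

    def pulled_images(journal_text: str):
        # Yields (image, size_in_bytes, duration_string) for every pull event found
        for m in PULLED.finditer(journal_text):
            yield m.group("image"), int(m.group("size")), m.group("duration")

    # e.g. ('registry.k8s.io/kube-scheduler:v1.34.3', 15776215, '1.047215516s')
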
Jan 27 23:58:12.145966 containerd[1664]: time="2026-01-27T23:58:12.145903881Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:12.146984 containerd[1664]: time="2026-01-27T23:58:12.146937164Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=12960247" Jan 27 23:58:12.147826 containerd[1664]: time="2026-01-27T23:58:12.147789127Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:12.150090 containerd[1664]: time="2026-01-27T23:58:12.150068294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:12.150969 containerd[1664]: time="2026-01-27T23:58:12.150754576Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.150436311s" Jan 27 23:58:12.150969 containerd[1664]: time="2026-01-27T23:58:12.150789976Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Jan 27 23:58:12.151211 containerd[1664]: time="2026-01-27T23:58:12.151185337Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 27 23:58:12.937190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1975285637.mount: Deactivated successfully. 
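
The var-lib-containerd-tmpmounts-containerd\x2dmountNNNNN.mount units that keep being deactivated appear to be short-lived mounts containerd creates under /var/lib/containerd/tmpmounts while handling image layers. systemd derives such unit names from the mount path by turning '/' into '-' and escaping other bytes as \xNN, so the path can be recovered mechanically (roughly what systemd-escape --unescape --path does). A sketch, with the unit name taken from the entry above:

    def mount_unit_to_path(unit: str) -> str:
        # systemd encodes '/' as '-' and escapes other bytes as \xNN in unit names
        name = unit.removesuffix(".mount")
        out, i = [], 0
        while i < len(name):
            if name.startswith("\\x", i) and i + 4 <= len(name):
                out.append(chr(int(name[i + 2:i + 4], 16)))
                i += 4
            elif name[i] == "-":
                out.append("/")
                i += 1
            else:
                out.append(name[i])
                i += 1
        return "/" + "".join(out)

    print(mount_unit_to_path(r"var-lib-containerd-tmpmounts-containerd\x2dmount1975285637.mount"))
    # -> /var/lib/containerd/tmpmounts/containerd-mount1975285637
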
Jan 27 23:58:13.418688 containerd[1664]: time="2026-01-27T23:58:13.417921577Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.418688 containerd[1664]: time="2026-01-27T23:58:13.418552138Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19575910" Jan 27 23:58:13.419619 containerd[1664]: time="2026-01-27T23:58:13.419590662Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.422428 containerd[1664]: time="2026-01-27T23:58:13.422401270Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.423757 containerd[1664]: time="2026-01-27T23:58:13.423704234Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.272488457s" Jan 27 23:58:13.423816 containerd[1664]: time="2026-01-27T23:58:13.423760195Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Jan 27 23:58:13.424763 containerd[1664]: time="2026-01-27T23:58:13.424607197Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Jan 27 23:58:13.771188 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 27 23:58:13.772832 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:58:13.909858 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:13.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:13.913314 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 27 23:58:13.913372 kernel: audit: type=1130 audit(1769558293.908:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:13.914042 (kubelet)[2364]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 23:58:13.947905 kubelet[2364]: E0127 23:58:13.947838 2364 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 23:58:13.950470 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 23:58:13.950611 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 23:58:13.952823 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.2M memory peak. 
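
The kubelet exit above ("restart counter is at 4") is the expected pre-bootstrap behaviour on this node: /var/lib/kubelet/config.yaml is normally written later by kubeadm init/join, so until that happens each start attempt fails with status 1 and systemd schedules another restart. A minimal diagnostic sketch that reproduces the same precondition check (path taken from the run.go:72 error above; this is not the kubelet's own code, and the function name is illustrative):

    from pathlib import Path

    KUBELET_CONFIG = Path("/var/lib/kubelet/config.yaml")  # path from the error message above

    def kubelet_config_ready() -> bool:
        # kubeadm writes this file during init/join; until then the kubelet
        # exits immediately and the unit keeps cycling through restarts.
        if not KUBELET_CONFIG.is_file():
            print(f"{KUBELET_CONFIG} missing - kubelet will keep exiting until kubeadm writes it")
            return False
        return True

    kubelet_config_ready()
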
Jan 27 23:58:13.951000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 23:58:13.955754 kernel: audit: type=1131 audit(1769558293.951:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 23:58:13.973410 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2698261295.mount: Deactivated successfully. Jan 27 23:58:13.979756 containerd[1664]: time="2026-01-27T23:58:13.979708345Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.981847 containerd[1664]: time="2026-01-27T23:58:13.981782991Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Jan 27 23:58:13.982893 containerd[1664]: time="2026-01-27T23:58:13.982857595Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.985690 containerd[1664]: time="2026-01-27T23:58:13.985649603Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:13.986934 containerd[1664]: time="2026-01-27T23:58:13.986875007Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 562.22325ms" Jan 27 23:58:13.986934 containerd[1664]: time="2026-01-27T23:58:13.986926407Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Jan 27 23:58:13.987979 containerd[1664]: time="2026-01-27T23:58:13.987806570Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Jan 27 23:58:14.646769 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount160827155.mount: Deactivated successfully. Jan 27 23:58:15.656025 update_engine[1644]: I20260127 23:58:15.655925 1644 update_attempter.cc:509] Updating boot flags... 
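
Audit records carry their own timestamp as audit(EPOCH.FRACTION:SERIAL); decoding the epoch confirms it lines up with the journal timestamps, e.g. 1769558293.951 in the type=1131 record above corresponds to 23:58:13.951 UTC on Jan 27 2026. A quick check (function name is illustrative):

    from datetime import datetime, timezone

    def audit_time(stamp: str) -> datetime:
        # stamp is the "EPOCH.FRACTION:SERIAL" portion of an audit(...) header
        epoch = float(stamp.split(":")[0])
        return datetime.fromtimestamp(epoch, tz=timezone.utc)

    print(audit_time("1769558293.951:295"))
    # -> 2026-01-27 23:58:13.951000+00:00
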
Jan 27 23:58:16.985123 containerd[1664]: time="2026-01-27T23:58:16.985073271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:16.985949 containerd[1664]: time="2026-01-27T23:58:16.985901714Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=96314798" Jan 27 23:58:16.987447 containerd[1664]: time="2026-01-27T23:58:16.987405318Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:16.989745 containerd[1664]: time="2026-01-27T23:58:16.989700485Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:16.990864 containerd[1664]: time="2026-01-27T23:58:16.990835809Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.002997039s" Jan 27 23:58:16.990915 containerd[1664]: time="2026-01-27T23:58:16.990869129Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Jan 27 23:58:23.207786 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:23.206000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:23.207952 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.2M memory peak. Jan 27 23:58:23.209849 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:58:23.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:23.212962 kernel: audit: type=1130 audit(1769558303.206:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:23.213010 kernel: audit: type=1131 audit(1769558303.206:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:23.243211 systemd[1]: Reload requested from client PID 2477 ('systemctl') (unit session-12.scope)... Jan 27 23:58:23.243230 systemd[1]: Reloading... Jan 27 23:58:23.321794 zram_generator::config[2523]: No configuration found. Jan 27 23:58:23.497408 systemd[1]: Reloading finished in 253 ms. 
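
Comparing the "stop pulling ... bytes read=N" events with the pull durations reported just afterwards gives a rough feel for registry throughput; for the etcd pull above, about 96 MB was fetched in roughly 3.0 s. The arithmetic below treats "bytes read" as the data fetched for that image, which likely undercounts anything already cached or shared between images:

    bytes_read = 96_314_798    # "stop pulling image registry.k8s.io/etcd:3.6.4-0 ... bytes read=96314798"
    duration_s = 3.002997039   # "Pulled image \"registry.k8s.io/etcd:3.6.4-0\" ... in 3.002997039s"

    mib_per_s = bytes_read / duration_s / (1024 * 1024)
    print(f"~{mib_per_s:.1f} MiB/s")   # ~30.6 MiB/s
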
Jan 27 23:58:23.523000 audit: BPF prog-id=63 op=LOAD Jan 27 23:58:23.523000 audit: BPF prog-id=64 op=LOAD Jan 27 23:58:23.527435 kernel: audit: type=1334 audit(1769558303.523:298): prog-id=63 op=LOAD Jan 27 23:58:23.527480 kernel: audit: type=1334 audit(1769558303.523:299): prog-id=64 op=LOAD Jan 27 23:58:23.527501 kernel: audit: type=1334 audit(1769558303.523:300): prog-id=46 op=UNLOAD Jan 27 23:58:23.523000 audit: BPF prog-id=46 op=UNLOAD Jan 27 23:58:23.523000 audit: BPF prog-id=47 op=UNLOAD Jan 27 23:58:23.530001 kernel: audit: type=1334 audit(1769558303.523:301): prog-id=47 op=UNLOAD Jan 27 23:58:23.530035 kernel: audit: type=1334 audit(1769558303.524:302): prog-id=65 op=LOAD Jan 27 23:58:23.524000 audit: BPF prog-id=65 op=LOAD Jan 27 23:58:23.524000 audit: BPF prog-id=49 op=UNLOAD Jan 27 23:58:23.531456 kernel: audit: type=1334 audit(1769558303.524:303): prog-id=49 op=UNLOAD Jan 27 23:58:23.524000 audit: BPF prog-id=66 op=LOAD Jan 27 23:58:23.532271 kernel: audit: type=1334 audit(1769558303.524:304): prog-id=66 op=LOAD Jan 27 23:58:23.532306 kernel: audit: type=1334 audit(1769558303.526:305): prog-id=67 op=LOAD Jan 27 23:58:23.526000 audit: BPF prog-id=67 op=LOAD Jan 27 23:58:23.526000 audit: BPF prog-id=50 op=UNLOAD Jan 27 23:58:23.526000 audit: BPF prog-id=51 op=UNLOAD Jan 27 23:58:23.527000 audit: BPF prog-id=68 op=LOAD Jan 27 23:58:23.535000 audit: BPF prog-id=43 op=UNLOAD Jan 27 23:58:23.535000 audit: BPF prog-id=69 op=LOAD Jan 27 23:58:23.535000 audit: BPF prog-id=70 op=LOAD Jan 27 23:58:23.535000 audit: BPF prog-id=44 op=UNLOAD Jan 27 23:58:23.535000 audit: BPF prog-id=45 op=UNLOAD Jan 27 23:58:23.535000 audit: BPF prog-id=71 op=LOAD Jan 27 23:58:23.535000 audit: BPF prog-id=48 op=UNLOAD Jan 27 23:58:23.536000 audit: BPF prog-id=72 op=LOAD Jan 27 23:58:23.536000 audit: BPF prog-id=55 op=UNLOAD Jan 27 23:58:23.536000 audit: BPF prog-id=73 op=LOAD Jan 27 23:58:23.536000 audit: BPF prog-id=74 op=LOAD Jan 27 23:58:23.536000 audit: BPF prog-id=56 op=UNLOAD Jan 27 23:58:23.536000 audit: BPF prog-id=57 op=UNLOAD Jan 27 23:58:23.537000 audit: BPF prog-id=75 op=LOAD Jan 27 23:58:23.537000 audit: BPF prog-id=58 op=UNLOAD Jan 27 23:58:23.538000 audit: BPF prog-id=76 op=LOAD Jan 27 23:58:23.538000 audit: BPF prog-id=52 op=UNLOAD Jan 27 23:58:23.538000 audit: BPF prog-id=77 op=LOAD Jan 27 23:58:23.538000 audit: BPF prog-id=78 op=LOAD Jan 27 23:58:23.538000 audit: BPF prog-id=53 op=UNLOAD Jan 27 23:58:23.538000 audit: BPF prog-id=54 op=UNLOAD Jan 27 23:58:23.538000 audit: BPF prog-id=79 op=LOAD Jan 27 23:58:23.538000 audit: BPF prog-id=59 op=UNLOAD Jan 27 23:58:23.540000 audit: BPF prog-id=80 op=LOAD Jan 27 23:58:23.540000 audit: BPF prog-id=60 op=UNLOAD Jan 27 23:58:23.540000 audit: BPF prog-id=81 op=LOAD Jan 27 23:58:23.540000 audit: BPF prog-id=82 op=LOAD Jan 27 23:58:23.540000 audit: BPF prog-id=61 op=UNLOAD Jan 27 23:58:23.540000 audit: BPF prog-id=62 op=UNLOAD Jan 27 23:58:23.554510 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 27 23:58:23.554598 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 27 23:58:23.554889 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:23.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 23:58:23.554944 systemd[1]: kubelet.service: Consumed 95ms CPU time, 95.1M memory peak. 
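
The burst of BPF prog-id LOAD/UNLOAD audit records above is consistent with systemd detaching and re-attaching its per-unit BPF programs (device, socket and IP filtering) during the daemon reload: the old program IDs are unloaded and replacements loaded. A small summariser sketch over the journal text (names are illustrative; the same event can appear twice, once as a raw audit line and once echoed as "kernel: audit: type=1334 ..."):

    import re

    REC = re.compile(r"prog-id=(\d+) op=(LOAD|UNLOAD)")

    def bpf_ops(journal_text: str) -> dict[str, set[int]]:
        # Collect which BPF program IDs were loaded and unloaded during the reload;
        # sets de-duplicate the echoed kernel copies of the same record.
        ops: dict[str, set[int]] = {"LOAD": set(), "UNLOAD": set()}
        for prog_id, op in REC.findall(journal_text):
            ops[op].add(int(prog_id))
        return ops

    # e.g. bpf_ops(text) -> {'LOAD': {63, 64, 65, ...}, 'UNLOAD': {43, 44, 46, ...}}
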
Jan 27 23:58:23.556344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:58:23.666293 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:23.664000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:23.680999 (kubelet)[2571]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 23:58:23.716778 kubelet[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 23:58:23.716778 kubelet[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 23:58:23.716778 kubelet[2571]: I0127 23:58:23.715905 2571 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 23:58:24.119151 kubelet[2571]: I0127 23:58:24.119114 2571 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 27 23:58:24.120755 kubelet[2571]: I0127 23:58:24.119264 2571 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 23:58:24.120755 kubelet[2571]: I0127 23:58:24.119299 2571 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 27 23:58:24.120755 kubelet[2571]: I0127 23:58:24.119306 2571 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 27 23:58:24.120755 kubelet[2571]: I0127 23:58:24.119524 2571 server.go:956] "Client rotation is on, will bootstrap in background" Jan 27 23:58:24.126749 kubelet[2571]: E0127 23:58:24.126698 2571 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.6.5:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 27 23:58:24.127488 kubelet[2571]: I0127 23:58:24.127460 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 23:58:24.131144 kubelet[2571]: I0127 23:58:24.131117 2571 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 23:58:24.133846 kubelet[2571]: I0127 23:58:24.133815 2571 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 27 23:58:24.134048 kubelet[2571]: I0127 23:58:24.134019 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 23:58:24.134187 kubelet[2571]: I0127 23:58:24.134045 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-485d202ac1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 23:58:24.134267 kubelet[2571]: I0127 23:58:24.134190 2571 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 23:58:24.134267 kubelet[2571]: I0127 23:58:24.134199 2571 container_manager_linux.go:306] "Creating device plugin manager" Jan 27 23:58:24.134309 kubelet[2571]: I0127 23:58:24.134293 2571 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 27 23:58:24.136873 kubelet[2571]: I0127 23:58:24.136835 2571 state_mem.go:36] "Initialized new in-memory state store" Jan 27 23:58:24.138840 kubelet[2571]: I0127 23:58:24.138782 2571 kubelet.go:475] "Attempting to sync node with API server" Jan 27 23:58:24.138840 kubelet[2571]: I0127 23:58:24.138810 2571 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 23:58:24.138840 kubelet[2571]: I0127 23:58:24.138845 2571 kubelet.go:387] "Adding apiserver pod source" Jan 27 23:58:24.139076 kubelet[2571]: I0127 23:58:24.138859 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 23:58:24.140093 kubelet[2571]: E0127 23:58:24.140048 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.6.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-n-485d202ac1&limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 27 23:58:24.140352 kubelet[2571]: E0127 23:58:24.140302 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.6.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": 
dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 27 23:58:24.143077 kubelet[2571]: I0127 23:58:24.143056 2571 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 23:58:24.144029 kubelet[2571]: I0127 23:58:24.144007 2571 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 27 23:58:24.144084 kubelet[2571]: I0127 23:58:24.144041 2571 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 27 23:58:24.144084 kubelet[2571]: W0127 23:58:24.144081 2571 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 27 23:58:24.147304 kubelet[2571]: I0127 23:58:24.147286 2571 server.go:1262] "Started kubelet" Jan 27 23:58:24.147536 kubelet[2571]: I0127 23:58:24.147461 2571 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 23:58:24.147536 kubelet[2571]: I0127 23:58:24.147534 2571 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 27 23:58:24.147768 kubelet[2571]: I0127 23:58:24.147744 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 23:58:24.147835 kubelet[2571]: I0127 23:58:24.147817 2571 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 23:58:24.148166 kubelet[2571]: I0127 23:58:24.148131 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 23:58:24.148651 kubelet[2571]: I0127 23:58:24.148604 2571 server.go:310] "Adding debug handlers to kubelet server" Jan 27 23:58:24.151089 kubelet[2571]: I0127 23:58:24.151058 2571 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 23:58:24.152442 kubelet[2571]: E0127 23:58:24.152416 2571 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-485d202ac1\" not found" Jan 27 23:58:24.152442 kubelet[2571]: I0127 23:58:24.152442 2571 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 27 23:58:24.152639 kubelet[2571]: I0127 23:58:24.152584 2571 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 23:58:24.152677 kubelet[2571]: I0127 23:58:24.152645 2571 reconciler.go:29] "Reconciler: start to sync state" Jan 27 23:58:24.153338 kubelet[2571]: E0127 23:58:24.153306 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.6.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 27 23:58:24.154055 kubelet[2571]: I0127 23:58:24.154029 2571 factory.go:223] Registration of the systemd container factory successfully Jan 27 23:58:24.154137 kubelet[2571]: I0127 23:58:24.154112 2571 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 23:58:24.154964 kubelet[2571]: E0127 23:58:24.153376 2571 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.6.5:6443/api/v1/namespaces/default/events\": dial tcp 10.0.6.5:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593-0-0-n-485d202ac1.188ebbe6f00a4b60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-n-485d202ac1,UID:ci-4593-0-0-n-485d202ac1,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-n-485d202ac1,},FirstTimestamp:2026-01-27 23:58:24.147262304 +0000 UTC m=+0.463673108,LastTimestamp:2026-01-27 23:58:24.147262304 +0000 UTC m=+0.463673108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-n-485d202ac1,}" Jan 27 23:58:24.155080 kubelet[2571]: E0127 23:58:24.154996 2571 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 23:58:24.153000 audit[2588]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2588 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.153000 audit[2588]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe4050ce0 a2=0 a3=0 items=0 ppid=2571 pid=2588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.153000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 23:58:24.155645 kubelet[2571]: E0127 23:58:24.155482 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.6.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-485d202ac1?timeout=10s\": dial tcp 10.0.6.5:6443: connect: connection refused" interval="200ms" Jan 27 23:58:24.155645 kubelet[2571]: I0127 23:58:24.155618 2571 factory.go:223] Registration of the containerd container factory successfully Jan 27 23:58:24.156000 audit[2589]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2589 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.156000 audit[2589]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc8fd6340 a2=0 a3=0 items=0 ppid=2571 pid=2589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.156000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 23:58:24.158000 audit[2591]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2591 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.158000 audit[2591]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff8735fd0 a2=0 a3=0 items=0 ppid=2571 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.158000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 23:58:24.160000 
audit[2593]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2593 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.160000 audit[2593]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc9441630 a2=0 a3=0 items=0 ppid=2571 pid=2593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.160000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 23:58:24.168000 audit[2596]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2596 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.168000 audit[2596]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=fffff8821930 a2=0 a3=0 items=0 ppid=2571 pid=2596 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.168000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Jan 27 23:58:24.170240 kubelet[2571]: I0127 23:58:24.170183 2571 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 27 23:58:24.169000 audit[2597]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2597 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:24.169000 audit[2597]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffdac1b820 a2=0 a3=0 items=0 ppid=2571 pid=2597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.169000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 23:58:24.171468 kubelet[2571]: I0127 23:58:24.171418 2571 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 27 23:58:24.171468 kubelet[2571]: I0127 23:58:24.171447 2571 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 27 23:58:24.171468 kubelet[2571]: I0127 23:58:24.171468 2571 kubelet.go:2427] "Starting kubelet main sync loop" Jan 27 23:58:24.171555 kubelet[2571]: E0127 23:58:24.171503 2571 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 23:58:24.170000 audit[2598]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2598 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.170000 audit[2598]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc272dd80 a2=0 a3=0 items=0 ppid=2571 pid=2598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.170000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 23:58:24.170000 audit[2600]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2600 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:24.170000 audit[2600]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff498f960 a2=0 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.170000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 23:58:24.171000 audit[2601]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2601 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.171000 audit[2601]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe9931420 a2=0 a3=0 items=0 ppid=2571 pid=2601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.171000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 23:58:24.171000 audit[2602]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2602 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:24.171000 audit[2602]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc3ef8b80 a2=0 a3=0 items=0 ppid=2571 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.171000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 23:58:24.172000 audit[2603]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2603 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:24.172000 audit[2603]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff3335f00 a2=0 a3=0 items=0 ppid=2571 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.172000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 23:58:24.172000 audit[2604]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2604 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:24.172000 audit[2604]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe2c79a60 a2=0 a3=0 items=0 ppid=2571 pid=2604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:24.172000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 23:58:24.175352 kubelet[2571]: E0127 23:58:24.175310 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.6.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 27 23:58:24.175417 kubelet[2571]: I0127 23:58:24.175402 2571 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 23:58:24.175417 kubelet[2571]: I0127 23:58:24.175415 2571 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 23:58:24.175458 kubelet[2571]: I0127 23:58:24.175433 2571 state_mem.go:36] "Initialized new in-memory state store" Jan 27 23:58:24.178694 kubelet[2571]: I0127 23:58:24.178667 2571 policy_none.go:49] "None policy: Start" Jan 27 23:58:24.178756 kubelet[2571]: I0127 23:58:24.178702 2571 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 27 23:58:24.178756 kubelet[2571]: I0127 23:58:24.178714 2571 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 27 23:58:24.180304 kubelet[2571]: I0127 23:58:24.180254 2571 policy_none.go:47] "Start" Jan 27 23:58:24.187130 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 27 23:58:24.199991 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 27 23:58:24.223708 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 27 23:58:24.225257 kubelet[2571]: E0127 23:58:24.225218 2571 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 27 23:58:24.225451 kubelet[2571]: I0127 23:58:24.225435 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 23:58:24.225479 kubelet[2571]: I0127 23:58:24.225452 2571 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 23:58:24.225908 kubelet[2571]: I0127 23:58:24.225885 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 23:58:24.226519 kubelet[2571]: E0127 23:58:24.226493 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 27 23:58:24.226579 kubelet[2571]: E0127 23:58:24.226534 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593-0-0-n-485d202ac1\" not found" Jan 27 23:58:24.283469 systemd[1]: Created slice kubepods-burstable-pod3eea4af4c1ec326f839c5c2c95a0fa92.slice - libcontainer container kubepods-burstable-pod3eea4af4c1ec326f839c5c2c95a0fa92.slice. Jan 27 23:58:24.292875 kubelet[2571]: E0127 23:58:24.292843 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.294367 systemd[1]: Created slice kubepods-burstable-pod0317defbd4561dc7d9dbf580bc87901e.slice - libcontainer container kubepods-burstable-pod0317defbd4561dc7d9dbf580bc87901e.slice. Jan 27 23:58:24.319355 kubelet[2571]: E0127 23:58:24.319306 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.321567 systemd[1]: Created slice kubepods-burstable-podfd938113fbefd5d7db92031144e7a885.slice - libcontainer container kubepods-burstable-podfd938113fbefd5d7db92031144e7a885.slice. Jan 27 23:58:24.323390 kubelet[2571]: E0127 23:58:24.323365 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.327925 kubelet[2571]: I0127 23:58:24.327906 2571 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.328304 kubelet[2571]: E0127 23:58:24.328281 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.6.5:6443/api/v1/nodes\": dial tcp 10.0.6.5:6443: connect: connection refused" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.355988 kubelet[2571]: E0127 23:58:24.355941 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.6.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-485d202ac1?timeout=10s\": dial tcp 10.0.6.5:6443: connect: connection refused" interval="400ms" Jan 27 23:58:24.455448 kubelet[2571]: I0127 23:58:24.454507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455448 kubelet[2571]: I0127 23:58:24.454548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455448 kubelet[2571]: I0127 23:58:24.454582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: 
\"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455448 kubelet[2571]: I0127 23:58:24.454600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455448 kubelet[2571]: I0127 23:58:24.454617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd938113fbefd5d7db92031144e7a885-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" (UID: \"fd938113fbefd5d7db92031144e7a885\") " pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455680 kubelet[2571]: I0127 23:58:24.454636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455680 kubelet[2571]: I0127 23:58:24.454649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455680 kubelet[2571]: I0127 23:58:24.454666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.455680 kubelet[2571]: I0127 23:58:24.454681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.530753 kubelet[2571]: I0127 23:58:24.530712 2571 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.531121 kubelet[2571]: E0127 23:58:24.531085 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.6.5:6443/api/v1/nodes\": dial tcp 10.0.6.5:6443: connect: connection refused" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.597060 containerd[1664]: time="2026-01-27T23:58:24.597007567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-485d202ac1,Uid:3eea4af4c1ec326f839c5c2c95a0fa92,Namespace:kube-system,Attempt:0,}" Jan 27 23:58:24.624531 containerd[1664]: time="2026-01-27T23:58:24.624120850Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-485d202ac1,Uid:0317defbd4561dc7d9dbf580bc87901e,Namespace:kube-system,Attempt:0,}" Jan 27 23:58:24.625972 containerd[1664]: time="2026-01-27T23:58:24.625918696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-485d202ac1,Uid:fd938113fbefd5d7db92031144e7a885,Namespace:kube-system,Attempt:0,}" Jan 27 23:58:24.757466 kubelet[2571]: E0127 23:58:24.757345 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.6.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-485d202ac1?timeout=10s\": dial tcp 10.0.6.5:6443: connect: connection refused" interval="800ms" Jan 27 23:58:24.932892 kubelet[2571]: I0127 23:58:24.932854 2571 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:24.933197 kubelet[2571]: E0127 23:58:24.933173 2571 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.6.5:6443/api/v1/nodes\": dial tcp 10.0.6.5:6443: connect: connection refused" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:25.149895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3204725867.mount: Deactivated successfully. Jan 27 23:58:25.160450 containerd[1664]: time="2026-01-27T23:58:25.159977859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 23:58:25.163072 containerd[1664]: time="2026-01-27T23:58:25.163007748Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 27 23:58:25.164949 containerd[1664]: time="2026-01-27T23:58:25.164909474Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 23:58:25.166793 containerd[1664]: time="2026-01-27T23:58:25.165863677Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 23:58:25.167111 containerd[1664]: time="2026-01-27T23:58:25.167079961Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 23:58:25.167837 containerd[1664]: time="2026-01-27T23:58:25.167794403Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 27 23:58:25.169176 containerd[1664]: time="2026-01-27T23:58:25.169128647Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 27 23:58:25.170677 containerd[1664]: time="2026-01-27T23:58:25.170637252Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 23:58:25.171688 containerd[1664]: time="2026-01-27T23:58:25.171634415Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 571.539878ms" Jan 27 23:58:25.172971 containerd[1664]: time="2026-01-27T23:58:25.172837738Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 545.483118ms" Jan 27 23:58:25.174205 containerd[1664]: time="2026-01-27T23:58:25.174178223Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 543.329831ms" Jan 27 23:58:25.212461 containerd[1664]: time="2026-01-27T23:58:25.211837658Z" level=info msg="connecting to shim f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f" address="unix:///run/containerd/s/6280a52ce344780ff047bc0b07da428ba86951bd109fc5da4e2306008644126e" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:58:25.215863 containerd[1664]: time="2026-01-27T23:58:25.215791711Z" level=info msg="connecting to shim 3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423" address="unix:///run/containerd/s/fb4ba7067c19f386f41184a94bcf3660ddc3ee871c333b224ddc027ac8ba8b76" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:58:25.220821 containerd[1664]: time="2026-01-27T23:58:25.220778846Z" level=info msg="connecting to shim f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe" address="unix:///run/containerd/s/9c667f694dadb34d4804ddbb332cbba113438327055098393e17246dafe329be" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:58:25.243937 systemd[1]: Started cri-containerd-f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f.scope - libcontainer container f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f. Jan 27 23:58:25.248083 systemd[1]: Started cri-containerd-3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423.scope - libcontainer container 3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423. Jan 27 23:58:25.249147 systemd[1]: Started cri-containerd-f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe.scope - libcontainer container f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe. 
Jan 27 23:58:25.255000 audit: BPF prog-id=83 op=LOAD Jan 27 23:58:25.256000 audit: BPF prog-id=84 op=LOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=84 op=UNLOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=85 op=LOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=86 op=LOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=86 op=UNLOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=85 op=UNLOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.256000 audit: BPF prog-id=87 op=LOAD Jan 27 23:58:25.256000 audit[2658]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=2619 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634636465356633343738633230393434343139356563646536336265 Jan 27 23:58:25.261000 audit: BPF prog-id=88 op=LOAD Jan 27 23:58:25.262000 audit: BPF prog-id=89 op=LOAD Jan 27 23:58:25.262000 audit: BPF prog-id=90 op=LOAD Jan 27 23:58:25.262000 audit[2691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.262000 audit: BPF prog-id=90 op=UNLOAD Jan 27 23:58:25.262000 audit[2691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.262000 audit: BPF prog-id=91 op=LOAD Jan 27 23:58:25.262000 audit[2691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.262000 audit: BPF prog-id=92 op=LOAD Jan 27 23:58:25.262000 audit[2691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 
a1=4000178168 a2=98 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.263000 audit: BPF prog-id=92 op=UNLOAD Jan 27 23:58:25.263000 audit[2691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.263000 audit: BPF prog-id=91 op=UNLOAD Jan 27 23:58:25.263000 audit[2691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.263000 audit: BPF prog-id=93 op=LOAD Jan 27 23:58:25.263000 audit[2691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2655 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632303065613533623833633538383132383631643438313934663364 Jan 27 23:58:25.264000 audit: BPF prog-id=94 op=LOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=94 op=UNLOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=95 op=LOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=96 op=LOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=96 op=UNLOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=95 op=UNLOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.264000 audit: BPF prog-id=97 op=LOAD Jan 27 23:58:25.264000 audit[2675]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2634 pid=2675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.264000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3337343965373263643539373866366330346261346561623362353933 Jan 27 23:58:25.277550 kubelet[2571]: E0127 23:58:25.277497 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.6.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 27 23:58:25.288084 containerd[1664]: time="2026-01-27T23:58:25.288042933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-485d202ac1,Uid:0317defbd4561dc7d9dbf580bc87901e,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f\"" Jan 27 23:58:25.294610 containerd[1664]: time="2026-01-27T23:58:25.294572073Z" level=info msg="CreateContainer within sandbox \"f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 27 23:58:25.295883 containerd[1664]: time="2026-01-27T23:58:25.295848757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-485d202ac1,Uid:3eea4af4c1ec326f839c5c2c95a0fa92,Namespace:kube-system,Attempt:0,} returns sandbox id \"3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423\"" Jan 27 23:58:25.297495 containerd[1664]: time="2026-01-27T23:58:25.297064241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-485d202ac1,Uid:fd938113fbefd5d7db92031144e7a885,Namespace:kube-system,Attempt:0,} returns sandbox id \"f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe\"" Jan 27 23:58:25.303103 containerd[1664]: time="2026-01-27T23:58:25.303071139Z" level=info msg="CreateContainer within sandbox \"3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 27 23:58:25.304799 containerd[1664]: time="2026-01-27T23:58:25.304752464Z" level=info msg="CreateContainer within sandbox \"f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 27 23:58:25.308879 containerd[1664]: time="2026-01-27T23:58:25.308801277Z" level=info msg="Container 34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:58:25.314087 containerd[1664]: time="2026-01-27T23:58:25.314058453Z" level=info msg="Container 94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:58:25.320071 containerd[1664]: time="2026-01-27T23:58:25.320039311Z" level=info msg="CreateContainer within sandbox \"f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9\"" Jan 27 23:58:25.320868 containerd[1664]: time="2026-01-27T23:58:25.320844514Z" level=info msg="StartContainer for \"34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9\"" Jan 27 23:58:25.322403 containerd[1664]: time="2026-01-27T23:58:25.322379758Z" level=info msg="connecting to shim 
34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9" address="unix:///run/containerd/s/6280a52ce344780ff047bc0b07da428ba86951bd109fc5da4e2306008644126e" protocol=ttrpc version=3 Jan 27 23:58:25.323803 containerd[1664]: time="2026-01-27T23:58:25.323772283Z" level=info msg="Container fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:58:25.329320 containerd[1664]: time="2026-01-27T23:58:25.329280380Z" level=info msg="CreateContainer within sandbox \"3749e72cd5978f6c04ba4eab3b593ba98748f23697de9e660b133ac9179c7423\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57\"" Jan 27 23:58:25.329834 containerd[1664]: time="2026-01-27T23:58:25.329807061Z" level=info msg="StartContainer for \"94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57\"" Jan 27 23:58:25.331447 containerd[1664]: time="2026-01-27T23:58:25.331150545Z" level=info msg="connecting to shim 94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57" address="unix:///run/containerd/s/fb4ba7067c19f386f41184a94bcf3660ddc3ee871c333b224ddc027ac8ba8b76" protocol=ttrpc version=3 Jan 27 23:58:25.333010 containerd[1664]: time="2026-01-27T23:58:25.332975391Z" level=info msg="CreateContainer within sandbox \"f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9\"" Jan 27 23:58:25.333686 containerd[1664]: time="2026-01-27T23:58:25.333662873Z" level=info msg="StartContainer for \"fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9\"" Jan 27 23:58:25.334804 containerd[1664]: time="2026-01-27T23:58:25.334691436Z" level=info msg="connecting to shim fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9" address="unix:///run/containerd/s/9c667f694dadb34d4804ddbb332cbba113438327055098393e17246dafe329be" protocol=ttrpc version=3 Jan 27 23:58:25.337541 kubelet[2571]: E0127 23:58:25.337499 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.6.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 27 23:58:25.344935 systemd[1]: Started cri-containerd-34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9.scope - libcontainer container 34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9. Jan 27 23:58:25.363963 systemd[1]: Started cri-containerd-94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57.scope - libcontainer container 94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57. Jan 27 23:58:25.365312 systemd[1]: Started cri-containerd-fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9.scope - libcontainer container fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9. 
Jan 27 23:58:25.366000 audit: BPF prog-id=98 op=LOAD Jan 27 23:58:25.367000 audit: BPF prog-id=99 op=LOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=99 op=UNLOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=100 op=LOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=101 op=LOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=101 op=UNLOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=100 op=UNLOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.367000 audit: BPF prog-id=102 op=LOAD Jan 27 23:58:25.367000 audit[2750]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2619 pid=2750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334666164386430396638303663616635623537383039383965373835 Jan 27 23:58:25.374000 audit: BPF prog-id=103 op=LOAD Jan 27 23:58:25.375000 audit: BPF prog-id=104 op=LOAD Jan 27 23:58:25.375000 audit[2766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.375000 audit: BPF prog-id=104 op=UNLOAD Jan 27 23:58:25.375000 audit[2766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.376000 audit: BPF prog-id=105 op=LOAD Jan 27 23:58:25.376000 audit[2766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.376000 audit: BPF prog-id=106 op=LOAD Jan 27 23:58:25.376000 audit[2766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2655 
pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.376000 audit: BPF prog-id=106 op=UNLOAD Jan 27 23:58:25.376000 audit[2766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.376000 audit: BPF prog-id=105 op=UNLOAD Jan 27 23:58:25.376000 audit[2766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.376000 audit: BPF prog-id=107 op=LOAD Jan 27 23:58:25.376000 audit[2766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2655 pid=2766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6665396536343333366631353433626438333361346236343732353261 Jan 27 23:58:25.377000 audit: BPF prog-id=108 op=LOAD Jan 27 23:58:25.378000 audit: BPF prog-id=109 op=LOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=109 op=UNLOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=110 op=LOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=111 op=LOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=111 op=UNLOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=110 op=UNLOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.378000 audit: BPF prog-id=112 op=LOAD Jan 27 23:58:25.378000 audit[2762]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2634 pid=2762 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:25.378000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934646532633338373638353239633237643366643430363133623537 Jan 27 23:58:25.412231 containerd[1664]: time="2026-01-27T23:58:25.412033834Z" level=info msg="StartContainer for \"fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9\" returns successfully" Jan 27 23:58:25.419207 containerd[1664]: time="2026-01-27T23:58:25.419138656Z" level=info msg="StartContainer for \"34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9\" returns successfully" Jan 27 23:58:25.423336 containerd[1664]: time="2026-01-27T23:58:25.423306789Z" level=info msg="StartContainer for \"94de2c38768529c27d3fd40613b575b30ef1d57525f29584d3a38751af40de57\" returns successfully" Jan 27 23:58:25.461321 kubelet[2571]: E0127 23:58:25.461269 2571 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.6.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.6.5:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 27 23:58:25.736435 kubelet[2571]: I0127 23:58:25.735593 2571 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:26.186909 kubelet[2571]: E0127 23:58:26.186867 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:26.190397 kubelet[2571]: E0127 23:58:26.190369 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:26.192153 kubelet[2571]: E0127 23:58:26.192120 2571 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.021964 kubelet[2571]: E0127 23:58:27.021923 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4593-0-0-n-485d202ac1\" not found" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.141236 kubelet[2571]: I0127 23:58:27.141203 2571 apiserver.go:52] "Watching apiserver" Jan 27 23:58:27.142753 kubelet[2571]: I0127 23:58:27.142708 2571 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.153236 kubelet[2571]: I0127 23:58:27.153152 2571 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 23:58:27.155094 kubelet[2571]: I0127 23:58:27.155010 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.162062 kubelet[2571]: E0127 23:58:27.161991 2571 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.162062 kubelet[2571]: I0127 23:58:27.162020 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.164299 kubelet[2571]: E0127 23:58:27.164271 2571 
kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.164299 kubelet[2571]: I0127 23:58:27.164298 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.166221 kubelet[2571]: E0127 23:58:27.165898 2571 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.192822 kubelet[2571]: I0127 23:58:27.192718 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.193147 kubelet[2571]: I0127 23:58:27.193112 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.195555 kubelet[2571]: E0127 23:58:27.195346 2571 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.196435 kubelet[2571]: E0127 23:58:27.196385 2571 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.249133 kubelet[2571]: I0127 23:58:27.249024 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:27.251187 kubelet[2571]: E0127 23:58:27.250954 2571 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:28.714987 kubelet[2571]: I0127 23:58:28.714753 2571 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.120687 systemd[1]: Reload requested from client PID 2862 ('systemctl') (unit session-12.scope)... Jan 27 23:58:29.120707 systemd[1]: Reloading... Jan 27 23:58:29.183762 zram_generator::config[2912]: No configuration found. Jan 27 23:58:29.367076 systemd[1]: Reloading finished in 246 ms. Jan 27 23:58:29.402370 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 23:58:29.413719 systemd[1]: kubelet.service: Deactivated successfully. Jan 27 23:58:29.414034 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 23:58:29.412000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:29.414110 systemd[1]: kubelet.service: Consumed 818ms CPU time, 121.1M memory peak. Jan 27 23:58:29.416143 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 27 23:58:29.416977 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 27 23:58:29.417031 kernel: audit: type=1131 audit(1769558309.412:400): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:29.415000 audit: BPF prog-id=113 op=LOAD Jan 27 23:58:29.418301 kernel: audit: type=1334 audit(1769558309.415:401): prog-id=113 op=LOAD Jan 27 23:58:29.418344 kernel: audit: type=1334 audit(1769558309.415:402): prog-id=80 op=UNLOAD Jan 27 23:58:29.415000 audit: BPF prog-id=80 op=UNLOAD Jan 27 23:58:29.416000 audit: BPF prog-id=114 op=LOAD Jan 27 23:58:29.419820 kernel: audit: type=1334 audit(1769558309.416:403): prog-id=114 op=LOAD Jan 27 23:58:29.419848 kernel: audit: type=1334 audit(1769558309.417:404): prog-id=115 op=LOAD Jan 27 23:58:29.417000 audit: BPF prog-id=115 op=LOAD Jan 27 23:58:29.417000 audit: BPF prog-id=81 op=UNLOAD Jan 27 23:58:29.417000 audit: BPF prog-id=82 op=UNLOAD Jan 27 23:58:29.418000 audit: BPF prog-id=116 op=LOAD Jan 27 23:58:29.421416 kernel: audit: type=1334 audit(1769558309.417:405): prog-id=81 op=UNLOAD Jan 27 23:58:29.421447 kernel: audit: type=1334 audit(1769558309.417:406): prog-id=82 op=UNLOAD Jan 27 23:58:29.421464 kernel: audit: type=1334 audit(1769558309.418:407): prog-id=116 op=LOAD Jan 27 23:58:29.418000 audit: BPF prog-id=65 op=UNLOAD Jan 27 23:58:29.420000 audit: BPF prog-id=117 op=LOAD Jan 27 23:58:29.424324 kernel: audit: type=1334 audit(1769558309.418:408): prog-id=65 op=UNLOAD Jan 27 23:58:29.424351 kernel: audit: type=1334 audit(1769558309.420:409): prog-id=117 op=LOAD Jan 27 23:58:29.421000 audit: BPF prog-id=118 op=LOAD Jan 27 23:58:29.427000 audit: BPF prog-id=66 op=UNLOAD Jan 27 23:58:29.427000 audit: BPF prog-id=67 op=UNLOAD Jan 27 23:58:29.428000 audit: BPF prog-id=119 op=LOAD Jan 27 23:58:29.428000 audit: BPF prog-id=72 op=UNLOAD Jan 27 23:58:29.428000 audit: BPF prog-id=120 op=LOAD Jan 27 23:58:29.428000 audit: BPF prog-id=121 op=LOAD Jan 27 23:58:29.428000 audit: BPF prog-id=73 op=UNLOAD Jan 27 23:58:29.428000 audit: BPF prog-id=74 op=UNLOAD Jan 27 23:58:29.429000 audit: BPF prog-id=122 op=LOAD Jan 27 23:58:29.429000 audit: BPF prog-id=79 op=UNLOAD Jan 27 23:58:29.429000 audit: BPF prog-id=123 op=LOAD Jan 27 23:58:29.430000 audit: BPF prog-id=71 op=UNLOAD Jan 27 23:58:29.430000 audit: BPF prog-id=124 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=68 op=UNLOAD Jan 27 23:58:29.431000 audit: BPF prog-id=125 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=126 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=69 op=UNLOAD Jan 27 23:58:29.431000 audit: BPF prog-id=70 op=UNLOAD Jan 27 23:58:29.431000 audit: BPF prog-id=127 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=76 op=UNLOAD Jan 27 23:58:29.431000 audit: BPF prog-id=128 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=129 op=LOAD Jan 27 23:58:29.431000 audit: BPF prog-id=77 op=UNLOAD Jan 27 23:58:29.431000 audit: BPF prog-id=78 op=UNLOAD Jan 27 23:58:29.432000 audit: BPF prog-id=130 op=LOAD Jan 27 23:58:29.432000 audit: BPF prog-id=131 op=LOAD Jan 27 23:58:29.432000 audit: BPF prog-id=63 op=UNLOAD Jan 27 23:58:29.432000 audit: BPF prog-id=64 op=UNLOAD Jan 27 23:58:29.433000 audit: BPF prog-id=132 op=LOAD Jan 27 23:58:29.433000 audit: BPF prog-id=75 op=UNLOAD Jan 27 23:58:29.557939 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 27 23:58:29.556000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:29.563237 (kubelet)[2953]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 23:58:29.606257 kubelet[2953]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 23:58:29.606257 kubelet[2953]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 23:58:29.606257 kubelet[2953]: I0127 23:58:29.606162 2953 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 23:58:29.615235 kubelet[2953]: I0127 23:58:29.615145 2953 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 27 23:58:29.615569 kubelet[2953]: I0127 23:58:29.615546 2953 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 23:58:29.615612 kubelet[2953]: I0127 23:58:29.615597 2953 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 27 23:58:29.615612 kubelet[2953]: I0127 23:58:29.615606 2953 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 27 23:58:29.616095 kubelet[2953]: I0127 23:58:29.616021 2953 server.go:956] "Client rotation is on, will bootstrap in background" Jan 27 23:58:29.617351 kubelet[2953]: I0127 23:58:29.617332 2953 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 27 23:58:29.619583 kubelet[2953]: I0127 23:58:29.619425 2953 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 23:58:29.626364 kubelet[2953]: I0127 23:58:29.626315 2953 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 23:58:29.630202 kubelet[2953]: I0127 23:58:29.629462 2953 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 27 23:58:29.630202 kubelet[2953]: I0127 23:58:29.629679 2953 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 23:58:29.630202 kubelet[2953]: I0127 23:58:29.629710 2953 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-485d202ac1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 23:58:29.630202 kubelet[2953]: I0127 23:58:29.629958 2953 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 23:58:29.630434 kubelet[2953]: I0127 23:58:29.629968 2953 container_manager_linux.go:306] "Creating device plugin manager" Jan 27 23:58:29.630434 kubelet[2953]: I0127 23:58:29.630000 2953 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 27 23:58:29.631668 kubelet[2953]: I0127 23:58:29.631642 2953 state_mem.go:36] "Initialized new in-memory state store" Jan 27 23:58:29.631860 kubelet[2953]: I0127 23:58:29.631850 2953 kubelet.go:475] "Attempting to sync node with API server" Jan 27 23:58:29.631896 kubelet[2953]: I0127 23:58:29.631866 2953 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 23:58:29.631896 kubelet[2953]: I0127 23:58:29.631893 2953 kubelet.go:387] "Adding apiserver pod source" Jan 27 23:58:29.631943 kubelet[2953]: I0127 23:58:29.631904 2953 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 23:58:29.633081 kubelet[2953]: I0127 23:58:29.633031 2953 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 23:58:29.634776 kubelet[2953]: I0127 23:58:29.633567 2953 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 27 23:58:29.634776 kubelet[2953]: I0127 23:58:29.633601 2953 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 27 
23:58:29.640643 kubelet[2953]: I0127 23:58:29.640230 2953 server.go:1262] "Started kubelet" Jan 27 23:58:29.640951 kubelet[2953]: I0127 23:58:29.640915 2953 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 23:58:29.641222 kubelet[2953]: I0127 23:58:29.641162 2953 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 23:58:29.641577 kubelet[2953]: I0127 23:58:29.641512 2953 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 23:58:29.641661 kubelet[2953]: I0127 23:58:29.641591 2953 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 27 23:58:29.642086 kubelet[2953]: I0127 23:58:29.642055 2953 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 23:58:29.642558 kubelet[2953]: I0127 23:58:29.642535 2953 server.go:310] "Adding debug handlers to kubelet server" Jan 27 23:58:29.643385 kubelet[2953]: I0127 23:58:29.643348 2953 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 23:58:29.646023 kubelet[2953]: I0127 23:58:29.645985 2953 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Jan 27 23:58:29.648826 kubelet[2953]: I0127 23:58:29.648791 2953 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 27 23:58:29.650742 kubelet[2953]: E0127 23:58:29.649057 2953 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-485d202ac1\" not found" Jan 27 23:58:29.653432 kubelet[2953]: I0127 23:58:29.653334 2953 reconciler.go:29] "Reconciler: start to sync state" Jan 27 23:58:29.653432 kubelet[2953]: I0127 23:58:29.653402 2953 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 23:58:29.660152 kubelet[2953]: I0127 23:58:29.660077 2953 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 27 23:58:29.660152 kubelet[2953]: I0127 23:58:29.660112 2953 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 27 23:58:29.660152 kubelet[2953]: I0127 23:58:29.660138 2953 kubelet.go:2427] "Starting kubelet main sync loop" Jan 27 23:58:29.661036 kubelet[2953]: E0127 23:58:29.660251 2953 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 23:58:29.663970 kubelet[2953]: I0127 23:58:29.663928 2953 factory.go:223] Registration of the containerd container factory successfully Jan 27 23:58:29.663970 kubelet[2953]: I0127 23:58:29.663951 2953 factory.go:223] Registration of the systemd container factory successfully Jan 27 23:58:29.664147 kubelet[2953]: I0127 23:58:29.664051 2953 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 23:58:29.665096 kubelet[2953]: E0127 23:58:29.665036 2953 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 23:58:29.695026 kubelet[2953]: I0127 23:58:29.694999 2953 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 23:58:29.695188 kubelet[2953]: I0127 23:58:29.695174 2953 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 23:58:29.695268 kubelet[2953]: I0127 23:58:29.695259 2953 state_mem.go:36] "Initialized new in-memory state store" Jan 27 23:58:29.695529 kubelet[2953]: I0127 23:58:29.695510 2953 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 27 23:58:29.695610 kubelet[2953]: I0127 23:58:29.695588 2953 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 27 23:58:29.695657 kubelet[2953]: I0127 23:58:29.695649 2953 policy_none.go:49] "None policy: Start" Jan 27 23:58:29.695714 kubelet[2953]: I0127 23:58:29.695703 2953 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 27 23:58:29.695806 kubelet[2953]: I0127 23:58:29.695795 2953 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 27 23:58:29.696020 kubelet[2953]: I0127 23:58:29.696007 2953 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 27 23:58:29.696114 kubelet[2953]: I0127 23:58:29.696104 2953 policy_none.go:47] "Start" Jan 27 23:58:29.703517 kubelet[2953]: E0127 23:58:29.703443 2953 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 27 23:58:29.704273 kubelet[2953]: I0127 23:58:29.703941 2953 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 23:58:29.704273 kubelet[2953]: I0127 23:58:29.703971 2953 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 23:58:29.706147 kubelet[2953]: I0127 23:58:29.706128 2953 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 23:58:29.707022 kubelet[2953]: E0127 23:58:29.706478 2953 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 27 23:58:29.761155 kubelet[2953]: I0127 23:58:29.761091 2953 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.761431 kubelet[2953]: I0127 23:58:29.761390 2953 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.761531 kubelet[2953]: I0127 23:58:29.761501 2953 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.767693 kubelet[2953]: E0127 23:58:29.767658 2953 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" already exists" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.807024 kubelet[2953]: I0127 23:58:29.806995 2953 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.820642 kubelet[2953]: I0127 23:58:29.820591 2953 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.821037 kubelet[2953]: I0127 23:58:29.820692 2953 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.854919 kubelet[2953]: I0127 23:58:29.854869 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd938113fbefd5d7db92031144e7a885-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" (UID: \"fd938113fbefd5d7db92031144e7a885\") " pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855201 kubelet[2953]: I0127 23:58:29.855011 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855201 kubelet[2953]: I0127 23:58:29.855036 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855400 kubelet[2953]: I0127 23:58:29.855052 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855400 kubelet[2953]: I0127 23:58:29.855261 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855400 kubelet[2953]: I0127 23:58:29.855281 2953 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855579 kubelet[2953]: I0127 23:58:29.855297 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3eea4af4c1ec326f839c5c2c95a0fa92-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-485d202ac1\" (UID: \"3eea4af4c1ec326f839c5c2c95a0fa92\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855579 kubelet[2953]: I0127 23:58:29.855509 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:29.855579 kubelet[2953]: I0127 23:58:29.855527 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0317defbd4561dc7d9dbf580bc87901e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-485d202ac1\" (UID: \"0317defbd4561dc7d9dbf580bc87901e\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:30.632743 kubelet[2953]: I0127 23:58:30.632672 2953 apiserver.go:52] "Watching apiserver" Jan 27 23:58:30.653868 kubelet[2953]: I0127 23:58:30.653837 2953 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 23:58:30.681228 kubelet[2953]: I0127 23:58:30.681186 2953 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:30.681980 kubelet[2953]: I0127 23:58:30.681496 2953 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:30.683557 kubelet[2953]: I0127 23:58:30.683433 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" podStartSLOduration=1.6834206900000002 podStartE2EDuration="1.68342069s" podCreationTimestamp="2026-01-27 23:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:58:30.682394047 +0000 UTC m=+1.114272589" watchObservedRunningTime="2026-01-27 23:58:30.68342069 +0000 UTC m=+1.115299192" Jan 27 23:58:30.683653 kubelet[2953]: I0127 23:58:30.683604 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-485d202ac1" podStartSLOduration=1.683598011 podStartE2EDuration="1.683598011s" podCreationTimestamp="2026-01-27 23:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:58:30.672007935 +0000 UTC m=+1.103886477" watchObservedRunningTime="2026-01-27 23:58:30.683598011 +0000 UTC m=+1.115476593" Jan 27 23:58:30.688846 kubelet[2953]: E0127 23:58:30.688512 2953 kubelet.go:3221] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4593-0-0-n-485d202ac1\" already exists" pod="kube-system/kube-apiserver-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:30.689004 kubelet[2953]: E0127 23:58:30.688927 2953 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-485d202ac1\" already exists" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" Jan 27 23:58:30.694905 kubelet[2953]: I0127 23:58:30.694823 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4593-0-0-n-485d202ac1" podStartSLOduration=2.694798845 podStartE2EDuration="2.694798845s" podCreationTimestamp="2026-01-27 23:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:58:30.694769525 +0000 UTC m=+1.126648067" watchObservedRunningTime="2026-01-27 23:58:30.694798845 +0000 UTC m=+1.126677387" Jan 27 23:58:34.872915 kubelet[2953]: I0127 23:58:34.872857 2953 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 27 23:58:34.873743 containerd[1664]: time="2026-01-27T23:58:34.873683581Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 27 23:58:34.874029 kubelet[2953]: I0127 23:58:34.873881 2953 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 27 23:58:35.609842 systemd[1]: Created slice kubepods-besteffort-pod25d66732_167a_4983_9c61_a4d57e781d50.slice - libcontainer container kubepods-besteffort-pod25d66732_167a_4983_9c61_a4d57e781d50.slice. Jan 27 23:58:35.695417 kubelet[2953]: I0127 23:58:35.695325 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/25d66732-167a-4983-9c61-a4d57e781d50-kube-proxy\") pod \"kube-proxy-blkbf\" (UID: \"25d66732-167a-4983-9c61-a4d57e781d50\") " pod="kube-system/kube-proxy-blkbf" Jan 27 23:58:35.695417 kubelet[2953]: I0127 23:58:35.695367 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25d66732-167a-4983-9c61-a4d57e781d50-lib-modules\") pod \"kube-proxy-blkbf\" (UID: \"25d66732-167a-4983-9c61-a4d57e781d50\") " pod="kube-system/kube-proxy-blkbf" Jan 27 23:58:35.695417 kubelet[2953]: I0127 23:58:35.695384 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24n6v\" (UniqueName: \"kubernetes.io/projected/25d66732-167a-4983-9c61-a4d57e781d50-kube-api-access-24n6v\") pod \"kube-proxy-blkbf\" (UID: \"25d66732-167a-4983-9c61-a4d57e781d50\") " pod="kube-system/kube-proxy-blkbf" Jan 27 23:58:35.695417 kubelet[2953]: I0127 23:58:35.695403 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/25d66732-167a-4983-9c61-a4d57e781d50-xtables-lock\") pod \"kube-proxy-blkbf\" (UID: \"25d66732-167a-4983-9c61-a4d57e781d50\") " pod="kube-system/kube-proxy-blkbf" Jan 27 23:58:35.929462 containerd[1664]: time="2026-01-27T23:58:35.929360508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-blkbf,Uid:25d66732-167a-4983-9c61-a4d57e781d50,Namespace:kube-system,Attempt:0,}" Jan 27 23:58:35.950830 containerd[1664]: time="2026-01-27T23:58:35.950786814Z" level=info msg="connecting to shim 
6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54" address="unix:///run/containerd/s/50e1defed2b5641cf83b2cbe0af6c642260ff720ac094635dd53ae8e1c162799" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:58:35.972210 systemd[1]: Started cri-containerd-6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54.scope - libcontainer container 6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54. Jan 27 23:58:35.982613 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 27 23:58:35.982751 kernel: audit: type=1334 audit(1769558315.980:442): prog-id=133 op=LOAD Jan 27 23:58:35.980000 audit: BPF prog-id=133 op=LOAD Jan 27 23:58:35.980000 audit: BPF prog-id=134 op=LOAD Jan 27 23:58:35.983805 kernel: audit: type=1334 audit(1769558315.980:443): prog-id=134 op=LOAD Jan 27 23:58:35.983857 kernel: audit: type=1300 audit(1769558315.980:443): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.980000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.980000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.990846 kernel: audit: type=1327 audit(1769558315.980:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.990934 kernel: audit: type=1334 audit(1769558315.981:444): prog-id=134 op=UNLOAD Jan 27 23:58:35.981000 audit: BPF prog-id=134 op=UNLOAD Jan 27 23:58:35.991762 kernel: audit: type=1300 audit(1769558315.981:444): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.981000 audit[3024]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.998449 kernel: audit: type=1327 audit(1769558315.981:444): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.998567 kernel: audit: type=1334 audit(1769558315.981:445): prog-id=135 op=LOAD Jan 27 23:58:35.981000 audit: BPF prog-id=135 op=LOAD Jan 27 23:58:35.999325 kernel: audit: type=1300 audit(1769558315.981:445): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.981000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:36.006376 kernel: audit: type=1327 audit(1769558315.981:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.982000 audit: BPF prog-id=136 op=LOAD Jan 27 23:58:35.982000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.982000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.983000 audit: BPF prog-id=136 op=UNLOAD Jan 27 23:58:35.983000 audit[3024]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.983000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.983000 audit: BPF prog-id=135 op=UNLOAD Jan 27 23:58:35.983000 audit[3024]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.983000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:35.983000 audit: BPF prog-id=137 op=LOAD Jan 27 23:58:35.983000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3013 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:35.983000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663613434663432343166643865323264613662336662643763393233 Jan 27 23:58:36.019039 containerd[1664]: time="2026-01-27T23:58:36.018936824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-blkbf,Uid:25d66732-167a-4983-9c61-a4d57e781d50,Namespace:kube-system,Attempt:0,} returns sandbox id \"6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54\"" Jan 27 23:58:36.026323 containerd[1664]: time="2026-01-27T23:58:36.026282447Z" level=info msg="CreateContainer within sandbox \"6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 27 23:58:36.042060 containerd[1664]: time="2026-01-27T23:58:36.039129726Z" level=info msg="Container 82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:58:36.058803 containerd[1664]: time="2026-01-27T23:58:36.058720506Z" level=info msg="CreateContainer within sandbox \"6ca44f4241fd8e22da6b3fbd7c923e0160881cd40a7db9466cf22f3a82b92c54\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5\"" Jan 27 23:58:36.060023 containerd[1664]: time="2026-01-27T23:58:36.059616149Z" level=info msg="StartContainer for \"82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5\"" Jan 27 23:58:36.062984 containerd[1664]: time="2026-01-27T23:58:36.062944799Z" level=info msg="connecting to shim 82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5" address="unix:///run/containerd/s/50e1defed2b5641cf83b2cbe0af6c642260ff720ac094635dd53ae8e1c162799" protocol=ttrpc version=3 Jan 27 23:58:36.092151 systemd[1]: Started cri-containerd-82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5.scope - libcontainer container 82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5. Jan 27 23:58:36.093151 systemd[1]: Created slice kubepods-besteffort-pod19eb2c97_7f89_4c70_952d_90de4520251e.slice - libcontainer container kubepods-besteffort-pod19eb2c97_7f89_4c70_952d_90de4520251e.slice. 
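The PROCTITLE fields in the audit records above are the invoking command line, hex-encoded with NUL bytes separating the arguments; the runc records for this sandbox, for instance, decode to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/6ca44f4241fd8e22da6b3fbd7c923" (the record is length-limited, so the container ID is cut short). A small decoder, assuming only that standard hex-plus-NUL encoding:

#!/usr/bin/env python3
# decode_proctitle.py -- decode an audit PROCTITLE value (hex-encoded argv with
# NUL-separated arguments) back into a readable command line.
import sys

def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    # argv elements are separated by NUL bytes; join with spaces for display.
    return " ".join(arg.decode("utf-8", "replace") for arg in raw.split(b"\x00") if arg)

if __name__ == "__main__":
    for value in sys.argv[1:]:
        print(decode_proctitle(value))

The iptables and ip6tables PROCTITLE records later in this log decode the same way.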
Jan 27 23:58:36.097606 kubelet[2953]: I0127 23:58:36.097569 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/19eb2c97-7f89-4c70-952d-90de4520251e-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-zsnf5\" (UID: \"19eb2c97-7f89-4c70-952d-90de4520251e\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-zsnf5" Jan 27 23:58:36.097606 kubelet[2953]: I0127 23:58:36.097611 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7lt4\" (UniqueName: \"kubernetes.io/projected/19eb2c97-7f89-4c70-952d-90de4520251e-kube-api-access-p7lt4\") pod \"tigera-operator-65cdcdfd6d-zsnf5\" (UID: \"19eb2c97-7f89-4c70-952d-90de4520251e\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-zsnf5" Jan 27 23:58:36.152000 audit: BPF prog-id=138 op=LOAD Jan 27 23:58:36.152000 audit[3049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3013 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832646163373237366534353632393833666332343039653830633662 Jan 27 23:58:36.152000 audit: BPF prog-id=139 op=LOAD Jan 27 23:58:36.152000 audit[3049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3013 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832646163373237366534353632393833666332343039653830633662 Jan 27 23:58:36.152000 audit: BPF prog-id=139 op=UNLOAD Jan 27 23:58:36.152000 audit[3049]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832646163373237366534353632393833666332343039653830633662 Jan 27 23:58:36.152000 audit: BPF prog-id=138 op=UNLOAD Jan 27 23:58:36.152000 audit[3049]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3013 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.152000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832646163373237366534353632393833666332343039653830633662 Jan 27 23:58:36.153000 audit: BPF prog-id=140 op=LOAD Jan 27 23:58:36.153000 audit[3049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3013 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832646163373237366534353632393833666332343039653830633662 Jan 27 23:58:36.173245 containerd[1664]: time="2026-01-27T23:58:36.173191259Z" level=info msg="StartContainer for \"82dac7276e4562983fc2409e80c6b30df317b5e91899f03164c12906c64e2dc5\" returns successfully" Jan 27 23:58:36.400287 containerd[1664]: time="2026-01-27T23:58:36.400231397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-zsnf5,Uid:19eb2c97-7f89-4c70-952d-90de4520251e,Namespace:tigera-operator,Attempt:0,}" Jan 27 23:58:36.417000 audit[3117]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.417000 audit[3117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe7b108f0 a2=0 a3=1 items=0 ppid=3063 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.417000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 23:58:36.417000 audit[3118]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.417000 audit[3118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff934e490 a2=0 a3=1 items=0 ppid=3063 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.417000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 23:58:36.419000 audit[3121]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.419000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff3c2cbc0 a2=0 a3=1 items=0 ppid=3063 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.419000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 23:58:36.420000 audit[3124]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.420000 audit[3124]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffcc0f610 a2=0 a3=1 items=0 ppid=3063 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.420000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 23:58:36.421000 audit[3122]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.421000 audit[3122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc33b220 a2=0 a3=1 items=0 ppid=3063 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.421000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 23:58:36.422000 audit[3129]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.422000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffefb34ea0 a2=0 a3=1 items=0 ppid=3063 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.422000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 23:58:36.428680 containerd[1664]: time="2026-01-27T23:58:36.428586324Z" level=info msg="connecting to shim c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7" address="unix:///run/containerd/s/d2ad2c05e2e2d28b0102df5b46a3e072f419d4f1b4535737cc41b2f217ce93f1" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:58:36.449957 systemd[1]: Started cri-containerd-c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7.scope - libcontainer container c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7. 
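The NETFILTER_CFG records interleaved above and below are kube-proxy installing its bootstrap chains in both iptables and ip6tables: the PROCTITLE values decode to commands such as iptables -w 5 -N KUBE-PROXY-CANARY -t mangle, followed by the creation of KUBE-EXTERNAL-SERVICES, KUBE-NODEPORTS, KUBE-SERVICES, KUBE-FORWARD and KUBE-PROXY-FIREWALL in the filter table and KUBE-SERVICES and KUBE-POSTROUTING in nat. The sketch below is a hypothetical on-node sanity check (not part of kube-proxy) that lists those chains from the live ruleset with iptables-save; the expected set is taken from the audit records in this log and the check needs root to run.

#!/usr/bin/env python3
# check_kube_chains.py -- hypothetical sanity check: report which of the KUBE-*
# chains registered in this log's audit records exist in the live ruleset.
import subprocess

EXPECTED = {
    "filter": ["KUBE-EXTERNAL-SERVICES", "KUBE-NODEPORTS", "KUBE-SERVICES",
               "KUBE-FORWARD", "KUBE-PROXY-FIREWALL", "KUBE-PROXY-CANARY"],
    "nat":    ["KUBE-SERVICES", "KUBE-POSTROUTING", "KUBE-PROXY-CANARY"],
    "mangle": ["KUBE-PROXY-CANARY"],
}

def chains_in_table(table: str) -> set[str]:
    # `iptables-save -t <table>` prints each chain as a ":<name> <policy> ..." line.
    output = subprocess.run(["iptables-save", "-t", table],
                            capture_output=True, text=True, check=True).stdout
    return {line.split()[0][1:] for line in output.splitlines() if line.startswith(":")}

if __name__ == "__main__":
    for table, expected in EXPECTED.items():
        present = chains_in_table(table)
        for chain in expected:
            status = "present" if chain in present else "MISSING"
            print(f"{table:7s} {chain:24s} {status}")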
Jan 27 23:58:36.459000 audit: BPF prog-id=141 op=LOAD Jan 27 23:58:36.459000 audit: BPF prog-id=142 op=LOAD Jan 27 23:58:36.459000 audit[3146]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.459000 audit: BPF prog-id=142 op=UNLOAD Jan 27 23:58:36.459000 audit[3146]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.459000 audit: BPF prog-id=143 op=LOAD Jan 27 23:58:36.459000 audit[3146]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.460000 audit: BPF prog-id=144 op=LOAD Jan 27 23:58:36.460000 audit[3146]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.460000 audit: BPF prog-id=144 op=UNLOAD Jan 27 23:58:36.460000 audit[3146]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.460000 audit: BPF prog-id=143 op=UNLOAD Jan 27 23:58:36.460000 audit[3146]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.460000 audit: BPF prog-id=145 op=LOAD Jan 27 23:58:36.460000 audit[3146]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3134 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332343638343465383138313565633039333630306636326365306636 Jan 27 23:58:36.481168 containerd[1664]: time="2026-01-27T23:58:36.481111926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-zsnf5,Uid:19eb2c97-7f89-4c70-952d-90de4520251e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7\"" Jan 27 23:58:36.484438 containerd[1664]: time="2026-01-27T23:58:36.484393496Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 27 23:58:36.525000 audit[3172]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.525000 audit[3172]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffed8f1c70 a2=0 a3=1 items=0 ppid=3063 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.525000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 23:58:36.528000 audit[3174]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3174 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.528000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffffe450370 a2=0 a3=1 items=0 ppid=3063 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.528000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Jan 27 23:58:36.531000 audit[3177]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3177 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.531000 audit[3177]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc60cad30 a2=0 a3=1 items=0 ppid=3063 pid=3177 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.531000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 27 23:58:36.532000 audit[3178]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.532000 audit[3178]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffea4c9860 a2=0 a3=1 items=0 ppid=3063 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.532000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 23:58:36.534000 audit[3180]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3180 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.534000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff3e1a4c0 a2=0 a3=1 items=0 ppid=3063 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.534000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 23:58:36.535000 audit[3181]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.535000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffee51a0f0 a2=0 a3=1 items=0 ppid=3063 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.535000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 23:58:36.538000 audit[3183]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.538000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd8054570 a2=0 a3=1 items=0 ppid=3063 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.538000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.541000 audit[3186]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.541000 
audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffcadc7ca0 a2=0 a3=1 items=0 ppid=3063 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.541000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.542000 audit[3187]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3187 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.542000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd29f7830 a2=0 a3=1 items=0 ppid=3063 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.542000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 23:58:36.544000 audit[3189]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3189 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.544000 audit[3189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffb049d10 a2=0 a3=1 items=0 ppid=3063 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.544000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 23:58:36.546000 audit[3190]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3190 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.546000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe7539260 a2=0 a3=1 items=0 ppid=3063 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.546000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 23:58:36.548000 audit[3192]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3192 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.548000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcb320b70 a2=0 a3=1 items=0 ppid=3063 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.548000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Jan 27 23:58:36.552000 audit[3195]: NETFILTER_CFG table=filter:72 
family=2 entries=1 op=nft_register_rule pid=3195 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.552000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff4eed360 a2=0 a3=1 items=0 ppid=3063 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.552000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 27 23:58:36.555000 audit[3198]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3198 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.555000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd6fbf430 a2=0 a3=1 items=0 ppid=3063 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.555000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 27 23:58:36.556000 audit[3199]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.556000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd529ecc0 a2=0 a3=1 items=0 ppid=3063 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.556000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 23:58:36.559000 audit[3201]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.559000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffcf25cf40 a2=0 a3=1 items=0 ppid=3063 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.559000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.563000 audit[3204]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3204 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.563000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc5c8bed0 a2=0 a3=1 items=0 ppid=3063 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.563000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.564000 audit[3205]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3205 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.564000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc6603570 a2=0 a3=1 items=0 ppid=3063 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.564000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 23:58:36.567000 audit[3207]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 23:58:36.567000 audit[3207]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffd89af3f0 a2=0 a3=1 items=0 ppid=3063 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.567000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 23:58:36.588000 audit[3213]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:36.588000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe416bc40 a2=0 a3=1 items=0 ppid=3063 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.588000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:36.603000 audit[3213]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:36.603000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffe416bc40 a2=0 a3=1 items=0 ppid=3063 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.603000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:36.605000 audit[3218]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.605000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc6ae2100 a2=0 a3=1 items=0 ppid=3063 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.605000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 23:58:36.607000 audit[3220]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3220 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.607000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffc475f6c0 a2=0 a3=1 items=0 ppid=3063 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.607000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 27 23:58:36.611000 audit[3223]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.611000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff8a93dd0 a2=0 a3=1 items=0 ppid=3063 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.611000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Jan 27 23:58:36.612000 audit[3224]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.612000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffca1a8050 a2=0 a3=1 items=0 ppid=3063 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.612000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 23:58:36.614000 audit[3226]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.614000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdc7a9490 a2=0 a3=1 items=0 ppid=3063 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.614000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 23:58:36.616000 audit[3227]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3227 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.616000 audit[3227]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff017f1a0 a2=0 a3=1 items=0 ppid=3063 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.616000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 23:58:36.618000 audit[3229]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.618000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd7ed68c0 a2=0 a3=1 items=0 ppid=3063 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.618000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.621000 audit[3232]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3232 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.621000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffe3173840 a2=0 a3=1 items=0 ppid=3063 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.621000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.622000 audit[3233]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.622000 audit[3233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe273c2d0 a2=0 a3=1 items=0 ppid=3063 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.622000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 23:58:36.624000 audit[3235]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3235 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.624000 audit[3235]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd71a5e60 a2=0 a3=1 items=0 ppid=3063 pid=3235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.624000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 23:58:36.626000 audit[3236]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3236 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.626000 audit[3236]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=104 a0=3 a1=fffffa9dc000 a2=0 a3=1 items=0 ppid=3063 pid=3236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.626000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 23:58:36.628000 audit[3238]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3238 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.628000 audit[3238]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffde84ec90 a2=0 a3=1 items=0 ppid=3063 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.628000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 27 23:58:36.631000 audit[3241]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3241 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.631000 audit[3241]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcd065830 a2=0 a3=1 items=0 ppid=3063 pid=3241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.631000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 27 23:58:36.637000 audit[3244]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3244 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.637000 audit[3244]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff4551c30 a2=0 a3=1 items=0 ppid=3063 pid=3244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.637000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Jan 27 23:58:36.638000 audit[3245]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3245 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.638000 audit[3245]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe2d83870 a2=0 a3=1 items=0 ppid=3063 pid=3245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.638000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 23:58:36.640000 audit[3247]: 
NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3247 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.640000 audit[3247]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffeba98c40 a2=0 a3=1 items=0 ppid=3063 pid=3247 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.640000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.644000 audit[3250]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3250 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.644000 audit[3250]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffda8825e0 a2=0 a3=1 items=0 ppid=3063 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.644000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 23:58:36.645000 audit[3251]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3251 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.645000 audit[3251]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffb568870 a2=0 a3=1 items=0 ppid=3063 pid=3251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.645000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 23:58:36.647000 audit[3253]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3253 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.647000 audit[3253]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffdadacd50 a2=0 a3=1 items=0 ppid=3063 pid=3253 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 23:58:36.648000 audit[3254]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3254 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.648000 audit[3254]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffde3cf7b0 a2=0 a3=1 items=0 ppid=3063 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.648000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 
23:58:36.651000 audit[3256]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3256 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.651000 audit[3256]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd02c5990 a2=0 a3=1 items=0 ppid=3063 pid=3256 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.651000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 23:58:36.654000 audit[3259]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3259 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 23:58:36.654000 audit[3259]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffff8501c00 a2=0 a3=1 items=0 ppid=3063 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.654000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 23:58:36.657000 audit[3261]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3261 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 23:58:36.657000 audit[3261]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffdb16dae0 a2=0 a3=1 items=0 ppid=3063 pid=3261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.657000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:36.658000 audit[3261]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3261 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 23:58:36.658000 audit[3261]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffdb16dae0 a2=0 a3=1 items=0 ppid=3063 pid=3261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:36.658000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:39.314289 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3165024553.mount: Deactivated successfully. 
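The NETFILTER_CFG / SYSCALL / PROCTITLE bursts above are evidently kube-proxy (the iptables and ip6tables execs all share ppid 3063) creating its KUBE-* chains through /usr/sbin/xtables-nft-multi. Each PROCTITLE value is the invoked command line, hex-encoded with NUL-separated arguments; ausearch -i typically renders these decoded, and the short sketch below does the same by hand. The script is purely illustrative (not tooling present on this host) and assumes Python 3:

#!/usr/bin/env python3
"""Decode audit PROCTITLE values like the ones above: the field is the
process argv joined with NUL bytes, then hex-encoded.  The first record in
this burst, for example, decodes to:
    iptables -w 5 -I OUTPUT -t filter -m conntrack --ctstate NEW
        -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
Feed hex strings on stdin, one per line (helper is illustrative only)."""
import sys

def decode_proctitle(hex_argv: str) -> str:
    # Hex -> bytes, then split on the NUL separators between argv entries.
    argv = bytes.fromhex(hex_argv).decode("utf-8", errors="replace")
    return " ".join(argv.split("\x00"))

for line in sys.stdin:
    line = line.strip()
    if line:
        print(decode_proctitle(line))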
Jan 27 23:58:40.967828 kubelet[2953]: I0127 23:58:40.967755 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-blkbf" podStartSLOduration=5.967713288 podStartE2EDuration="5.967713288s" podCreationTimestamp="2026-01-27 23:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:58:36.708145984 +0000 UTC m=+7.140024526" watchObservedRunningTime="2026-01-27 23:58:40.967713288 +0000 UTC m=+11.399591830" Jan 27 23:58:45.609286 containerd[1664]: time="2026-01-27T23:58:45.609205202Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:45.610384 containerd[1664]: time="2026-01-27T23:58:45.610320805Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 27 23:58:45.611436 containerd[1664]: time="2026-01-27T23:58:45.611391769Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:45.613989 containerd[1664]: time="2026-01-27T23:58:45.613920176Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:58:45.614762 containerd[1664]: time="2026-01-27T23:58:45.614710979Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 9.130266603s" Jan 27 23:58:45.614762 containerd[1664]: time="2026-01-27T23:58:45.614755019Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 27 23:58:45.618500 containerd[1664]: time="2026-01-27T23:58:45.618451110Z" level=info msg="CreateContainer within sandbox \"c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 27 23:58:45.628748 containerd[1664]: time="2026-01-27T23:58:45.627967620Z" level=info msg="Container 0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:58:45.634799 containerd[1664]: time="2026-01-27T23:58:45.634764921Z" level=info msg="CreateContainer within sandbox \"c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\"" Jan 27 23:58:45.635631 containerd[1664]: time="2026-01-27T23:58:45.635600963Z" level=info msg="StartContainer for \"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\"" Jan 27 23:58:45.636785 containerd[1664]: time="2026-01-27T23:58:45.636755487Z" level=info msg="connecting to shim 0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d" address="unix:///run/containerd/s/d2ad2c05e2e2d28b0102df5b46a3e072f419d4f1b4535737cc41b2f217ce93f1" protocol=ttrpc version=3 Jan 27 23:58:45.656136 systemd[1]: Started 
cri-containerd-0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d.scope - libcontainer container 0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d. Jan 27 23:58:45.665000 audit: BPF prog-id=146 op=LOAD Jan 27 23:58:45.667398 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 27 23:58:45.667453 kernel: audit: type=1334 audit(1769558325.665:514): prog-id=146 op=LOAD Jan 27 23:58:45.667000 audit: BPF prog-id=147 op=LOAD Jan 27 23:58:45.669238 kernel: audit: type=1334 audit(1769558325.667:515): prog-id=147 op=LOAD Jan 27 23:58:45.669278 kernel: audit: type=1300 audit(1769558325.667:515): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.667000 audit[3270]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.667000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.675611 kernel: audit: type=1327 audit(1769558325.667:515): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.668000 audit: BPF prog-id=147 op=UNLOAD Jan 27 23:58:45.676523 kernel: audit: type=1334 audit(1769558325.668:516): prog-id=147 op=UNLOAD Jan 27 23:58:45.668000 audit[3270]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.679835 kernel: audit: type=1300 audit(1769558325.668:516): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.683203 kernel: audit: type=1327 audit(1769558325.668:516): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.683319 kernel: audit: type=1334 audit(1769558325.668:517): prog-id=148 op=LOAD Jan 27 23:58:45.668000 audit: BPF prog-id=148 op=LOAD Jan 27 23:58:45.668000 audit[3270]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.687230 kernel: audit: type=1300 audit(1769558325.668:517): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.691421 kernel: audit: type=1327 audit(1769558325.668:517): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.671000 audit: BPF prog-id=149 op=LOAD Jan 27 23:58:45.671000 audit[3270]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.674000 audit: BPF prog-id=149 op=UNLOAD Jan 27 23:58:45.674000 audit[3270]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.674000 audit: BPF prog-id=148 op=UNLOAD Jan 27 23:58:45.674000 audit[3270]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.674000 audit: BPF prog-id=150 op=LOAD Jan 27 23:58:45.674000 audit[3270]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3134 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:45.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063333866303830343938386331343639616232666561303833383061 Jan 27 23:58:45.705980 containerd[1664]: time="2026-01-27T23:58:45.705928299Z" level=info msg="StartContainer for \"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\" returns successfully" Jan 27 23:58:45.730908 kubelet[2953]: I0127 23:58:45.730745 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-zsnf5" podStartSLOduration=0.597611925 podStartE2EDuration="9.730712416s" podCreationTimestamp="2026-01-27 23:58:36 +0000 UTC" firstStartedPulling="2026-01-27 23:58:36.48242861 +0000 UTC m=+6.914307152" lastFinishedPulling="2026-01-27 23:58:45.615529101 +0000 UTC m=+16.047407643" observedRunningTime="2026-01-27 23:58:45.725291519 +0000 UTC m=+16.157170021" watchObservedRunningTime="2026-01-27 23:58:45.730712416 +0000 UTC m=+16.162590958" Jan 27 23:58:50.873999 sudo[1992]: pam_unix(sudo:session): session closed for user root Jan 27 23:58:50.873000 audit[1992]: USER_END pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:50.878395 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 27 23:58:50.878491 kernel: audit: type=1106 audit(1769558330.873:522): pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:50.873000 audit[1992]: CRED_DISP pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:50.881819 kernel: audit: type=1104 audit(1769558330.873:523): pid=1992 uid=500 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 23:58:50.972035 sshd[1991]: Connection closed by 4.153.228.146 port 49934 Jan 27 23:58:50.972339 sshd-session[1987]: pam_unix(sshd:session): session closed for user core Jan 27 23:58:50.973000 audit[1987]: USER_END pid=1987 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:50.977129 systemd-logind[1642]: Session 12 logged out. Waiting for processes to exit. Jan 27 23:58:50.977406 systemd[1]: sshd@10-10.0.6.5:22-4.153.228.146:49934.service: Deactivated successfully. 
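The iptables-restore records that follow are kube-proxy's periodic full syncs (their proctitle decodes to iptables-restore -w 5 --noflush --counters); the IPv4 filter-table entry count grows from 15 to 21 across them as new rules are committed. A rough summarizer for such records, assuming Python 3 and the field layout shown in this log (the script and its regex are our own, not part of the audit tooling):

#!/usr/bin/env python3
"""Summarize NETFILTER_CFG audit records: one record per nft transaction
that kube-proxy commits via iptables / iptables-restore.  Pipe journal or
audit.log text on stdin; records wrapped across physical lines are skipped."""
import re
import sys
from collections import Counter

REC = re.compile(
    r'NETFILTER_CFG table=(?P<table>[\w-]+):\d+ family=(?P<family>\d+) '
    r'entries=(?P<entries>\d+) op=(?P<op>\w+).*?comm="(?P<comm>[^"]+)"'
)

calls = Counter()
for line in sys.stdin:
    m = REC.search(line)
    if not m:
        continue
    fam = {"2": "ipv4", "10": "ipv6"}.get(m["family"], m["family"])
    calls[(m["table"], fam, m["comm"])] += 1
    print(f'{m["table"]:<8} {fam:<4} entries={m["entries"]:>3} '
          f'{m["op"]:<20} {m["comm"]}')

print("--- transactions per (table, family, comm) ---")
for key, n in calls.most_common():
    print(f"{n:>3}  {key}")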
Jan 27 23:58:50.973000 audit[1987]: CRED_DISP pid=1987 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:50.981425 systemd[1]: session-12.scope: Deactivated successfully. Jan 27 23:58:50.981551 kernel: audit: type=1106 audit(1769558330.973:524): pid=1987 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:50.981634 kernel: audit: type=1104 audit(1769558330.973:525): pid=1987 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 23:58:50.981927 systemd[1]: session-12.scope: Consumed 7.799s CPU time, 222.4M memory peak. Jan 27 23:58:50.978000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.6.5:22-4.153.228.146:49934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:50.984599 systemd-logind[1642]: Removed session 12. Jan 27 23:58:50.986383 kernel: audit: type=1131 audit(1769558330.978:526): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.6.5:22-4.153.228.146:49934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 23:58:51.413000 audit[3360]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3360 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:51.413000 audit[3360]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffdd0a1df0 a2=0 a3=1 items=0 ppid=3063 pid=3360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:51.419738 kernel: audit: type=1325 audit(1769558331.413:527): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3360 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:51.419799 kernel: audit: type=1300 audit(1769558331.413:527): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffdd0a1df0 a2=0 a3=1 items=0 ppid=3063 pid=3360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:51.413000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:51.426470 kernel: audit: type=1327 audit(1769558331.413:527): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:51.429000 audit[3360]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3360 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:51.433748 kernel: audit: type=1325 audit(1769558331.429:528): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3360 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 27 23:58:51.429000 audit[3360]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdd0a1df0 a2=0 a3=1 items=0 ppid=3063 pid=3360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:51.429000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:51.449821 kernel: audit: type=1300 audit(1769558331.429:528): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdd0a1df0 a2=0 a3=1 items=0 ppid=3063 pid=3360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:52.481000 audit[3363]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3363 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:52.481000 audit[3363]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd8778000 a2=0 a3=1 items=0 ppid=3063 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:52.481000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:52.485000 audit[3363]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3363 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:52.485000 audit[3363]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd8778000 a2=0 a3=1 items=0 ppid=3063 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:52.485000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:55.115000 audit[3365]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3365 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:55.115000 audit[3365]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffc235ffa0 a2=0 a3=1 items=0 ppid=3063 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:55.115000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:55.127000 audit[3365]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3365 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:55.127000 audit[3365]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc235ffa0 a2=0 a3=1 items=0 ppid=3063 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:55.127000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:56.147000 
audit[3367]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3367 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:56.149084 kernel: kauditd_printk_skb: 13 callbacks suppressed Jan 27 23:58:56.149175 kernel: audit: type=1325 audit(1769558336.147:533): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3367 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:56.154855 kernel: audit: type=1300 audit(1769558336.147:533): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd2ddfb00 a2=0 a3=1 items=0 ppid=3063 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:56.147000 audit[3367]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd2ddfb00 a2=0 a3=1 items=0 ppid=3063 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:56.147000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:56.156786 kernel: audit: type=1327 audit(1769558336.147:533): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:56.163954 kernel: audit: type=1325 audit(1769558336.157:534): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3367 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:56.164105 kernel: audit: type=1300 audit(1769558336.157:534): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd2ddfb00 a2=0 a3=1 items=0 ppid=3063 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:56.157000 audit[3367]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3367 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:56.157000 audit[3367]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd2ddfb00 a2=0 a3=1 items=0 ppid=3063 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:56.157000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:56.165948 kernel: audit: type=1327 audit(1769558336.157:534): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:57.647000 audit[3369]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3369 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:57.647000 audit[3369]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffcaf84d90 a2=0 a3=1 items=0 ppid=3063 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:57.654378 kernel: audit: type=1325 audit(1769558337.647:535): table=filter:113 family=2 entries=21 op=nft_register_rule pid=3369 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:57.654660 kernel: audit: type=1300 audit(1769558337.647:535): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffcaf84d90 a2=0 a3=1 items=0 ppid=3063 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:57.647000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:57.657745 kernel: audit: type=1327 audit(1769558337.647:535): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:57.657823 kernel: audit: type=1325 audit(1769558337.655:536): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3369 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:57.655000 audit[3369]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3369 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:57.655000 audit[3369]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcaf84d90 a2=0 a3=1 items=0 ppid=3063 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:57.655000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:57.692647 systemd[1]: Created slice kubepods-besteffort-podc3fa94b4_e1eb_41d1_989b_49720c47ebc2.slice - libcontainer container kubepods-besteffort-podc3fa94b4_e1eb_41d1_989b_49720c47ebc2.slice. Jan 27 23:58:57.736126 kubelet[2953]: I0127 23:58:57.736087 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c3fa94b4-e1eb-41d1-989b-49720c47ebc2-typha-certs\") pod \"calico-typha-7bd7b47886-qt2wb\" (UID: \"c3fa94b4-e1eb-41d1-989b-49720c47ebc2\") " pod="calico-system/calico-typha-7bd7b47886-qt2wb" Jan 27 23:58:57.736126 kubelet[2953]: I0127 23:58:57.736131 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3fa94b4-e1eb-41d1-989b-49720c47ebc2-tigera-ca-bundle\") pod \"calico-typha-7bd7b47886-qt2wb\" (UID: \"c3fa94b4-e1eb-41d1-989b-49720c47ebc2\") " pod="calico-system/calico-typha-7bd7b47886-qt2wb" Jan 27 23:58:57.736602 kubelet[2953]: I0127 23:58:57.736152 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7g64\" (UniqueName: \"kubernetes.io/projected/c3fa94b4-e1eb-41d1-989b-49720c47ebc2-kube-api-access-m7g64\") pod \"calico-typha-7bd7b47886-qt2wb\" (UID: \"c3fa94b4-e1eb-41d1-989b-49720c47ebc2\") " pod="calico-system/calico-typha-7bd7b47886-qt2wb" Jan 27 23:58:57.872501 systemd[1]: Created slice kubepods-besteffort-pod0fab4411_fdf0_47be_92ce_e6d644a9b3b9.slice - libcontainer container kubepods-besteffort-pod0fab4411_fdf0_47be_92ce_e6d644a9b3b9.slice. 
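The two "Created slice kubepods-besteffort-pod..." lines above mark the calico-typha and calico-node pods being admitted: with the systemd cgroup driver, kubelet derives the slice name from the pod UID (dashes become underscores) under the QoS-class parent slice. A small sketch of that naming convention follows; the helper names and the cgroup v2 path layout are assumptions for illustration, not taken from this log:

#!/usr/bin/env python3
"""Sketch of the pod slice naming visible in the systemd lines above:
kubepods-<qos>-pod<uid>.slice, with dashes in the UID replaced by
underscores.  Guaranteed pods sit directly under kubepods.slice, so the
path helper below only reflects the besteffort/burstable layout."""

def pod_slice(pod_uid: str, qos: str = "besteffort") -> str:
    return f"kubepods-{qos}-pod{pod_uid.replace('-', '_')}.slice"

def pod_cgroup_path(pod_uid: str, qos: str = "besteffort") -> str:
    # Assumed cgroup v2 layout under the kubepods hierarchy.
    return (f"/sys/fs/cgroup/kubepods.slice/kubepods-{qos}.slice/"
            f"{pod_slice(pod_uid, qos)}")

if __name__ == "__main__":
    # Pod UID of calico-node-z88kf, taken from the volume records below.
    print(pod_slice("0fab4411-fdf0-47be-92ce-e6d644a9b3b9"))
    # -> kubepods-besteffort-pod0fab4411_fdf0_47be_92ce_e6d644a9b3b9.slice
    print(pod_cgroup_path("0fab4411-fdf0-47be-92ce-e6d644a9b3b9"))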
Jan 27 23:58:57.938256 kubelet[2953]: I0127 23:58:57.938134 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-policysync\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938256 kubelet[2953]: I0127 23:58:57.938190 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-cni-net-dir\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938256 kubelet[2953]: I0127 23:58:57.938208 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-var-lib-calico\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938256 kubelet[2953]: I0127 23:58:57.938223 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-xtables-lock\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938256 kubelet[2953]: I0127 23:58:57.938239 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-cni-log-dir\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938439 kubelet[2953]: I0127 23:58:57.938252 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-var-run-calico\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938439 kubelet[2953]: I0127 23:58:57.938268 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-flexvol-driver-host\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938439 kubelet[2953]: I0127 23:58:57.938309 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-node-certs\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938439 kubelet[2953]: I0127 23:58:57.938327 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whkpm\" (UniqueName: \"kubernetes.io/projected/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-kube-api-access-whkpm\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf" Jan 27 23:58:57.938439 kubelet[2953]: I0127 23:58:57.938343 2953 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-cni-bin-dir\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf"
Jan 27 23:58:57.938542 kubelet[2953]: I0127 23:58:57.938357 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-lib-modules\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf"
Jan 27 23:58:57.938542 kubelet[2953]: I0127 23:58:57.938371 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fab4411-fdf0-47be-92ce-e6d644a9b3b9-tigera-ca-bundle\") pod \"calico-node-z88kf\" (UID: \"0fab4411-fdf0-47be-92ce-e6d644a9b3b9\") " pod="calico-system/calico-node-z88kf"
Jan 27 23:58:57.999065 containerd[1664]: time="2026-01-27T23:58:57.998989799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bd7b47886-qt2wb,Uid:c3fa94b4-e1eb-41d1-989b-49720c47ebc2,Namespace:calico-system,Attempt:0,}"
Jan 27 23:58:58.017787 containerd[1664]: time="2026-01-27T23:58:58.017741217Z" level=info msg="connecting to shim eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5" address="unix:///run/containerd/s/4f2539d92e3394a9da07bc299d5e70629c0e557d036a595ac7bd1e8c96787b57" namespace=k8s.io protocol=ttrpc version=3
Jan 27 23:58:58.040966 kubelet[2953]: E0127 23:58:58.040917 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 27 23:58:58.040966 kubelet[2953]: W0127 23:58:58.040954 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 27 23:58:58.041137 kubelet[2953]: E0127 23:58:58.040973 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same three FlexVolume probe errors for nodeagent~uds repeat verbatim from Jan 27 23:58:58.041432 to 23:58:58.044582; duplicate entries omitted]
Jan 27 23:58:58.045101 systemd[1]: Started cri-containerd-eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5.scope - libcontainer container eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5.
[the same three FlexVolume probe errors repeat verbatim from Jan 27 23:58:58.045189 to 23:58:58.050343; duplicate entries omitted]
Jan 27 23:58:58.056528 kubelet[2953]: E0127 23:58:58.056497 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 27 23:58:58.056528 kubelet[2953]: W0127 23:58:58.056521 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 27 23:58:58.057314 kubelet[2953]: E0127 23:58:58.056541 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jan 27 23:58:58.075000 audit: BPF prog-id=151 op=LOAD Jan 27 23:58:58.077243 kubelet[2953]: E0127 23:58:58.077120 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:58:58.078000 audit: BPF prog-id=152 op=LOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=152 op=UNLOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=153 op=LOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=154 op=LOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=154 op=UNLOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=153 op=UNLOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.078000 audit: BPF prog-id=155 op=LOAD Jan 27 23:58:58.078000 audit[3391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3380 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561633537303063323131386236386330323232626630653135656236 Jan 27 23:58:58.111655 containerd[1664]: time="2026-01-27T23:58:58.111395265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bd7b47886-qt2wb,Uid:c3fa94b4-e1eb-41d1-989b-49720c47ebc2,Namespace:calico-system,Attempt:0,} returns sandbox id \"eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5\"" Jan 27 23:58:58.113143 containerd[1664]: time="2026-01-27T23:58:58.113112711Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 27 23:58:58.128915 kubelet[2953]: E0127 23:58:58.128858 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:58:58.128915 kubelet[2953]: W0127 23:58:58.128883 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:58:58.128915 kubelet[2953]: E0127 23:58:58.128904 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:58:58.129465 kubelet[2953]: E0127 23:58:58.129436 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:58:58.129517 kubelet[2953]: W0127 23:58:58.129459 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:58:58.129517 kubelet[2953]: E0127 23:58:58.129506 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
[the same three FlexVolume probe errors for nodeagent~uds repeat verbatim from Jan 27 23:58:58.129689 to 23:58:58.150957; duplicate entries omitted]
Jan 27 23:58:58.151145 kubelet[2953]: I0127 23:58:58.151011 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6f938d-8e51-4e63-b408-0de368dbd7d7-kubelet-dir\") pod \"csi-node-driver-bmscm\" (UID: \"1d6f938d-8e51-4e63-b408-0de368dbd7d7\") " pod="calico-system/csi-node-driver-bmscm"
Jan 27 23:58:58.151592 kubelet[2953]: I0127 23:58:58.151277 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1d6f938d-8e51-4e63-b408-0de368dbd7d7-varrun\") pod \"csi-node-driver-bmscm\" (UID: \"1d6f938d-8e51-4e63-b408-0de368dbd7d7\") " pod="calico-system/csi-node-driver-bmscm"
Jan 27 23:58:58.152161 kubelet[2953]: I0127 23:58:58.151651 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1d6f938d-8e51-4e63-b408-0de368dbd7d7-socket-dir\") pod \"csi-node-driver-bmscm\" (UID: \"1d6f938d-8e51-4e63-b408-0de368dbd7d7\") " pod="calico-system/csi-node-driver-bmscm"
Jan 27 23:58:58.152161 kubelet[2953]: I0127 23:58:58.151906 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5xj\" (UniqueName: \"kubernetes.io/projected/1d6f938d-8e51-4e63-b408-0de368dbd7d7-kube-api-access-vn5xj\") pod \"csi-node-driver-bmscm\" (UID: \"1d6f938d-8e51-4e63-b408-0de368dbd7d7\") " pod="calico-system/csi-node-driver-bmscm"
Jan 27 23:58:58.153001 kubelet[2953]: I0127 23:58:58.152474 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1d6f938d-8e51-4e63-b408-0de368dbd7d7-registration-dir\") pod \"csi-node-driver-bmscm\" (UID: \"1d6f938d-8e51-4e63-b408-0de368dbd7d7\") " pod="calico-system/csi-node-driver-bmscm"
[interleaved with the volume entries above, the same three FlexVolume probe errors repeat verbatim from Jan 27 23:58:58.151234 to 23:58:58.156218; duplicate entries omitted]
Jan 27 23:58:58.180046 containerd[1664]: time="2026-01-27T23:58:58.179990796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z88kf,Uid:0fab4411-fdf0-47be-92ce-e6d644a9b3b9,Namespace:calico-system,Attempt:0,}"
Jan 27 23:58:58.217124 containerd[1664]: time="2026-01-27T23:58:58.216942390Z" level=info msg="connecting to shim b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84" address="unix:///run/containerd/s/4394089d34f0151c924aa43d20700525c9478f1400ca1819b233cdc17f65b8a7" namespace=k8s.io protocol=ttrpc version=3
Jan 27 23:58:58.250185 systemd[1]: Started cri-containerd-b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84.scope - libcontainer container b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84.
[the same three FlexVolume probe errors for nodeagent~uds repeat verbatim from Jan 27 23:58:58.253461 to 23:58:58.259831; duplicate entries omitted]
Jan 27 23:58:58.260278 kubelet[2953]: E0127 23:58:58.260250 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 27 23:58:58.260278 kubelet[2953]: W0127 23:58:58.260264 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 27 23:58:58.260359 kubelet[2953]: E0127 23:58:58.260275 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jan 27 23:58:58.263000 audit: BPF prog-id=156 op=LOAD Jan 27 23:58:58.264000 audit: BPF prog-id=157 op=LOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=157 op=UNLOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=158 op=LOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=159 op=LOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=159 op=UNLOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=158 op=UNLOAD Jan 
27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.264000 audit: BPF prog-id=160 op=LOAD Jan 27 23:58:58.264000 audit[3510]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3499 pid=3510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237343431333964613863356462303364393338663032346238366333 Jan 27 23:58:58.270673 kubelet[2953]: E0127 23:58:58.270547 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:58:58.270673 kubelet[2953]: W0127 23:58:58.270564 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:58:58.270673 kubelet[2953]: E0127 23:58:58.270577 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:58:58.282356 containerd[1664]: time="2026-01-27T23:58:58.282310951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z88kf,Uid:0fab4411-fdf0-47be-92ce-e6d644a9b3b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\"" Jan 27 23:58:58.670000 audit[3564]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:58.670000 audit[3564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffeefe5ce0 a2=0 a3=1 items=0 ppid=3063 pid=3564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.670000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:58.681000 audit[3564]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:58:58.681000 audit[3564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeefe5ce0 a2=0 a3=1 items=0 ppid=3063 pid=3564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:58:58.681000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:58:59.552672 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount283161469.mount: Deactivated successfully. 
Jan 27 23:58:59.663893 kubelet[2953]: E0127 23:58:59.663849 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:00.744803 containerd[1664]: time="2026-01-27T23:59:00.744749327Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:00.745773 containerd[1664]: time="2026-01-27T23:59:00.745706930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 27 23:59:00.746639 containerd[1664]: time="2026-01-27T23:59:00.746580932Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:00.748684 containerd[1664]: time="2026-01-27T23:59:00.748641819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:00.749263 containerd[1664]: time="2026-01-27T23:59:00.749221221Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.63607431s" Jan 27 23:59:00.749263 containerd[1664]: time="2026-01-27T23:59:00.749256221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 27 23:59:00.750350 containerd[1664]: time="2026-01-27T23:59:00.750327384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 27 23:59:00.763442 containerd[1664]: time="2026-01-27T23:59:00.763387704Z" level=info msg="CreateContainer within sandbox \"eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 27 23:59:00.773507 containerd[1664]: time="2026-01-27T23:59:00.771918850Z" level=info msg="Container 8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:59:00.783227 containerd[1664]: time="2026-01-27T23:59:00.783179605Z" level=info msg="CreateContainer within sandbox \"eac5700c2118b68c0222bf0e15eb6425f04bd9262b261c1d8af403a1bd68cef5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0\"" Jan 27 23:59:00.784436 containerd[1664]: time="2026-01-27T23:59:00.784384809Z" level=info msg="StartContainer for \"8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0\"" Jan 27 23:59:00.786175 containerd[1664]: time="2026-01-27T23:59:00.786138814Z" level=info msg="connecting to shim 8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0" address="unix:///run/containerd/s/4f2539d92e3394a9da07bc299d5e70629c0e557d036a595ac7bd1e8c96787b57" protocol=ttrpc version=3 Jan 27 23:59:00.805978 systemd[1]: Started 
cri-containerd-8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0.scope - libcontainer container 8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0. Jan 27 23:59:00.817000 audit: BPF prog-id=161 op=LOAD Jan 27 23:59:00.817000 audit: BPF prog-id=162 op=LOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=162 op=UNLOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=163 op=LOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=164 op=LOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=164 op=UNLOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=163 op=UNLOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.817000 audit: BPF prog-id=165 op=LOAD Jan 27 23:59:00.817000 audit[3575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3380 pid=3575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:00.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866363663316530313661333462353938343239323936633737653735 Jan 27 23:59:00.848966 containerd[1664]: time="2026-01-27T23:59:00.848695607Z" level=info msg="StartContainer for \"8f66c1e016a34b598429296c77e753fe3c1448c8fd21cbd472815014799219b0\" returns successfully" Jan 27 23:59:01.661253 kubelet[2953]: E0127 23:59:01.661113 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:01.783053 kubelet[2953]: I0127 23:59:01.782991 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7bd7b47886-qt2wb" podStartSLOduration=2.145372526 podStartE2EDuration="4.782977361s" podCreationTimestamp="2026-01-27 23:58:57 +0000 UTC" firstStartedPulling="2026-01-27 23:58:58.112602109 +0000 UTC m=+28.544480651" lastFinishedPulling="2026-01-27 23:59:00.750206984 +0000 UTC m=+31.182085486" observedRunningTime="2026-01-27 23:59:01.782849761 +0000 UTC m=+32.214728303" watchObservedRunningTime="2026-01-27 23:59:01.782977361 +0000 UTC m=+32.214855903" Jan 27 23:59:01.856064 kubelet[2953]: E0127 23:59:01.856033 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856064 kubelet[2953]: W0127 23:59:01.856057 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.856238 kubelet[2953]: E0127 23:59:01.856077 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.856294 kubelet[2953]: E0127 23:59:01.856284 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856335 kubelet[2953]: W0127 23:59:01.856294 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.856335 kubelet[2953]: E0127 23:59:01.856333 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.856494 kubelet[2953]: E0127 23:59:01.856484 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856494 kubelet[2953]: W0127 23:59:01.856494 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.856564 kubelet[2953]: E0127 23:59:01.856502 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.856643 kubelet[2953]: E0127 23:59:01.856632 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856686 kubelet[2953]: W0127 23:59:01.856643 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.856686 kubelet[2953]: E0127 23:59:01.856653 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.856823 kubelet[2953]: E0127 23:59:01.856811 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856823 kubelet[2953]: W0127 23:59:01.856823 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.856880 kubelet[2953]: E0127 23:59:01.856831 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.856968 kubelet[2953]: E0127 23:59:01.856957 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.856968 kubelet[2953]: W0127 23:59:01.856967 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.857046 kubelet[2953]: E0127 23:59:01.856975 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.857105 kubelet[2953]: E0127 23:59:01.857096 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.857105 kubelet[2953]: W0127 23:59:01.857105 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.857180 kubelet[2953]: E0127 23:59:01.857114 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.857247 kubelet[2953]: E0127 23:59:01.857233 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.857247 kubelet[2953]: W0127 23:59:01.857244 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.857313 kubelet[2953]: E0127 23:59:01.857251 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.857395 kubelet[2953]: E0127 23:59:01.857385 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.857395 kubelet[2953]: W0127 23:59:01.857395 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.857787 kubelet[2953]: E0127 23:59:01.857402 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.857827 kubelet[2953]: E0127 23:59:01.857805 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.857827 kubelet[2953]: W0127 23:59:01.857817 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.857870 kubelet[2953]: E0127 23:59:01.857828 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.858201 kubelet[2953]: E0127 23:59:01.857994 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.858201 kubelet[2953]: W0127 23:59:01.858010 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.858201 kubelet[2953]: E0127 23:59:01.858019 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.858201 kubelet[2953]: E0127 23:59:01.858162 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.858201 kubelet[2953]: W0127 23:59:01.858169 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.858201 kubelet[2953]: E0127 23:59:01.858176 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858310 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.858919 kubelet[2953]: W0127 23:59:01.858318 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858325 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858452 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.858919 kubelet[2953]: W0127 23:59:01.858458 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858466 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858615 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.858919 kubelet[2953]: W0127 23:59:01.858620 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.858919 kubelet[2953]: E0127 23:59:01.858627 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.878303 kubelet[2953]: E0127 23:59:01.878264 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.878303 kubelet[2953]: W0127 23:59:01.878291 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.878303 kubelet[2953]: E0127 23:59:01.878311 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.878637 kubelet[2953]: E0127 23:59:01.878564 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.878637 kubelet[2953]: W0127 23:59:01.878574 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.878637 kubelet[2953]: E0127 23:59:01.878585 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.878932 kubelet[2953]: E0127 23:59:01.878914 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.878932 kubelet[2953]: W0127 23:59:01.878930 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.878993 kubelet[2953]: E0127 23:59:01.878957 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.879295 kubelet[2953]: E0127 23:59:01.879277 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.879295 kubelet[2953]: W0127 23:59:01.879293 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.879360 kubelet[2953]: E0127 23:59:01.879307 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.879737 kubelet[2953]: E0127 23:59:01.879701 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.879814 kubelet[2953]: W0127 23:59:01.879743 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.879850 kubelet[2953]: E0127 23:59:01.879820 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.880124 kubelet[2953]: E0127 23:59:01.880105 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.880124 kubelet[2953]: W0127 23:59:01.880122 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.880218 kubelet[2953]: E0127 23:59:01.880133 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.880380 kubelet[2953]: E0127 23:59:01.880362 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.880380 kubelet[2953]: W0127 23:59:01.880375 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.880440 kubelet[2953]: E0127 23:59:01.880385 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.880644 kubelet[2953]: E0127 23:59:01.880606 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.880644 kubelet[2953]: W0127 23:59:01.880631 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.880644 kubelet[2953]: E0127 23:59:01.880644 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.881045 kubelet[2953]: E0127 23:59:01.880891 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.881045 kubelet[2953]: W0127 23:59:01.880900 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.881045 kubelet[2953]: E0127 23:59:01.880909 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.881423 kubelet[2953]: E0127 23:59:01.881297 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.881474 kubelet[2953]: W0127 23:59:01.881426 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.881474 kubelet[2953]: E0127 23:59:01.881448 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.881756 kubelet[2953]: E0127 23:59:01.881738 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.881756 kubelet[2953]: W0127 23:59:01.881755 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.881863 kubelet[2953]: E0127 23:59:01.881767 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.881973 kubelet[2953]: E0127 23:59:01.881958 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.881973 kubelet[2953]: W0127 23:59:01.881972 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.882042 kubelet[2953]: E0127 23:59:01.881981 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.882868 kubelet[2953]: E0127 23:59:01.882811 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.882868 kubelet[2953]: W0127 23:59:01.882832 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.883428 kubelet[2953]: E0127 23:59:01.883405 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.883887 kubelet[2953]: E0127 23:59:01.883833 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.883887 kubelet[2953]: W0127 23:59:01.883862 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.883887 kubelet[2953]: E0127 23:59:01.883877 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.884155 kubelet[2953]: E0127 23:59:01.884139 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.884155 kubelet[2953]: W0127 23:59:01.884155 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.884256 kubelet[2953]: E0127 23:59:01.884167 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.884539 kubelet[2953]: E0127 23:59:01.884516 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.884539 kubelet[2953]: W0127 23:59:01.884537 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.884589 kubelet[2953]: E0127 23:59:01.884548 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 23:59:01.884827 kubelet[2953]: E0127 23:59:01.884811 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.884827 kubelet[2953]: W0127 23:59:01.884824 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.885221 kubelet[2953]: E0127 23:59:01.884834 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:01.885295 kubelet[2953]: E0127 23:59:01.885275 2953 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 23:59:01.885323 kubelet[2953]: W0127 23:59:01.885295 2953 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 23:59:01.885323 kubelet[2953]: E0127 23:59:01.885309 2953 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 23:59:02.294320 containerd[1664]: time="2026-01-27T23:59:02.294264134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:02.295420 containerd[1664]: time="2026-01-27T23:59:02.295376977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:02.297330 containerd[1664]: time="2026-01-27T23:59:02.296967022Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:02.299616 containerd[1664]: time="2026-01-27T23:59:02.299580110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:02.300442 containerd[1664]: time="2026-01-27T23:59:02.300407193Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.550048969s" Jan 27 23:59:02.300442 containerd[1664]: time="2026-01-27T23:59:02.300441073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 27 23:59:02.305549 containerd[1664]: time="2026-01-27T23:59:02.305524969Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 27 23:59:02.317047 containerd[1664]: time="2026-01-27T23:59:02.317011644Z" level=info msg="Container eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447: CDI 
devices from CRI Config.CDIDevices: []" Jan 27 23:59:02.326088 containerd[1664]: time="2026-01-27T23:59:02.326039672Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447\"" Jan 27 23:59:02.326623 containerd[1664]: time="2026-01-27T23:59:02.326596553Z" level=info msg="StartContainer for \"eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447\"" Jan 27 23:59:02.329316 containerd[1664]: time="2026-01-27T23:59:02.329248162Z" level=info msg="connecting to shim eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447" address="unix:///run/containerd/s/4394089d34f0151c924aa43d20700525c9478f1400ca1819b233cdc17f65b8a7" protocol=ttrpc version=3 Jan 27 23:59:02.352976 systemd[1]: Started cri-containerd-eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447.scope - libcontainer container eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447. Jan 27 23:59:02.397909 kernel: kauditd_printk_skb: 74 callbacks suppressed Jan 27 23:59:02.398030 kernel: audit: type=1334 audit(1769558342.396:563): prog-id=166 op=LOAD Jan 27 23:59:02.396000 audit: BPF prog-id=166 op=LOAD Jan 27 23:59:02.396000 audit[3651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.396000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.405515 kernel: audit: type=1300 audit(1769558342.396:563): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.405662 kernel: audit: type=1327 audit(1769558342.396:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.405684 kernel: audit: type=1334 audit(1769558342.396:564): prog-id=167 op=LOAD Jan 27 23:59:02.396000 audit: BPF prog-id=167 op=LOAD Jan 27 23:59:02.406450 kernel: audit: type=1300 audit(1769558342.396:564): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.396000 audit[3651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.396000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.412873 kernel: audit: type=1327 audit(1769558342.396:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.397000 audit: BPF prog-id=167 op=UNLOAD Jan 27 23:59:02.414148 kernel: audit: type=1334 audit(1769558342.397:565): prog-id=167 op=UNLOAD Jan 27 23:59:02.397000 audit[3651]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.417576 kernel: audit: type=1300 audit(1769558342.397:565): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.397000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.420865 kernel: audit: type=1327 audit(1769558342.397:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.421020 kernel: audit: type=1334 audit(1769558342.397:566): prog-id=166 op=UNLOAD Jan 27 23:59:02.397000 audit: BPF prog-id=166 op=UNLOAD Jan 27 23:59:02.397000 audit[3651]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.397000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.397000 audit: BPF prog-id=168 op=LOAD Jan 27 23:59:02.397000 audit[3651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3499 pid=3651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:02.397000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561623638376136643466363836393566343962663563303638343933 Jan 27 23:59:02.441712 
containerd[1664]: time="2026-01-27T23:59:02.441666547Z" level=info msg="StartContainer for \"eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447\" returns successfully" Jan 27 23:59:02.452390 systemd[1]: cri-containerd-eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447.scope: Deactivated successfully. Jan 27 23:59:02.455000 audit: BPF prog-id=168 op=UNLOAD Jan 27 23:59:02.456167 containerd[1664]: time="2026-01-27T23:59:02.456083592Z" level=info msg="received container exit event container_id:\"eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447\" id:\"eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447\" pid:3663 exited_at:{seconds:1769558342 nanos:455283869}" Jan 27 23:59:02.478742 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eab687a6d4f68695f49bf5c0684937eea0a9eac42bff5432c86b31e49afec447-rootfs.mount: Deactivated successfully. Jan 27 23:59:02.774743 kubelet[2953]: I0127 23:59:02.774525 2953 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 23:59:02.775645 containerd[1664]: time="2026-01-27T23:59:02.775613455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 27 23:59:03.662483 kubelet[2953]: E0127 23:59:03.662377 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:05.661473 kubelet[2953]: E0127 23:59:05.661428 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:05.828768 containerd[1664]: time="2026-01-27T23:59:05.828703048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:05.829907 containerd[1664]: time="2026-01-27T23:59:05.829857771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 27 23:59:05.831754 containerd[1664]: time="2026-01-27T23:59:05.831071895Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:05.833391 containerd[1664]: time="2026-01-27T23:59:05.833365382Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:05.834763 containerd[1664]: time="2026-01-27T23:59:05.834715746Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.059060811s" Jan 27 23:59:05.834878 containerd[1664]: time="2026-01-27T23:59:05.834863787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference 
\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 27 23:59:05.838620 containerd[1664]: time="2026-01-27T23:59:05.838585758Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 27 23:59:05.850034 containerd[1664]: time="2026-01-27T23:59:05.849979873Z" level=info msg="Container e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:59:05.858621 containerd[1664]: time="2026-01-27T23:59:05.858552820Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f\"" Jan 27 23:59:05.859604 containerd[1664]: time="2026-01-27T23:59:05.859575983Z" level=info msg="StartContainer for \"e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f\"" Jan 27 23:59:05.861239 containerd[1664]: time="2026-01-27T23:59:05.861185268Z" level=info msg="connecting to shim e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f" address="unix:///run/containerd/s/4394089d34f0151c924aa43d20700525c9478f1400ca1819b233cdc17f65b8a7" protocol=ttrpc version=3 Jan 27 23:59:05.882980 systemd[1]: Started cri-containerd-e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f.scope - libcontainer container e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f. Jan 27 23:59:05.947000 audit: BPF prog-id=169 op=LOAD Jan 27 23:59:05.947000 audit[3712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3499 pid=3712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:05.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531346162386230333862303139363738313934653134386362333534 Jan 27 23:59:05.947000 audit: BPF prog-id=170 op=LOAD Jan 27 23:59:05.947000 audit[3712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3499 pid=3712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:05.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531346162386230333862303139363738313934653134386362333534 Jan 27 23:59:05.947000 audit: BPF prog-id=170 op=UNLOAD Jan 27 23:59:05.947000 audit[3712]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:05.947000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531346162386230333862303139363738313934653134386362333534 Jan 27 23:59:05.947000 audit: BPF prog-id=169 op=UNLOAD Jan 27 23:59:05.947000 audit[3712]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=3712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:05.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531346162386230333862303139363738313934653134386362333534 Jan 27 23:59:05.947000 audit: BPF prog-id=171 op=LOAD Jan 27 23:59:05.947000 audit[3712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3499 pid=3712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:05.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531346162386230333862303139363738313934653134386362333534 Jan 27 23:59:05.967998 containerd[1664]: time="2026-01-27T23:59:05.967955236Z" level=info msg="StartContainer for \"e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f\" returns successfully" Jan 27 23:59:06.372911 containerd[1664]: time="2026-01-27T23:59:06.372861162Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 27 23:59:06.374632 systemd[1]: cri-containerd-e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f.scope: Deactivated successfully. Jan 27 23:59:06.374963 systemd[1]: cri-containerd-e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f.scope: Consumed 476ms CPU time, 188.5M memory peak, 165.9M written to disk. Jan 27 23:59:06.377191 containerd[1664]: time="2026-01-27T23:59:06.377142535Z" level=info msg="received container exit event container_id:\"e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f\" id:\"e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f\" pid:3725 exited_at:{seconds:1769558346 nanos:376952694}" Jan 27 23:59:06.380000 audit: BPF prog-id=171 op=UNLOAD Jan 27 23:59:06.399322 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e14ab8b038b019678194e148cb354f1c0a72ec84cbbabded1f886a56585c8e0f-rootfs.mount: Deactivated successfully. Jan 27 23:59:06.432408 kubelet[2953]: I0127 23:59:06.431956 2953 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 27 23:59:06.479858 systemd[1]: Created slice kubepods-burstable-pod43426337_f017_4b99_a432_b642f2eafaaa.slice - libcontainer container kubepods-burstable-pod43426337_f017_4b99_a432_b642f2eafaaa.slice. 
Jan 27 23:59:06.492406 systemd[1]: Created slice kubepods-burstable-podc3beb58c_7880_46f0_a29c_255fb717766b.slice - libcontainer container kubepods-burstable-podc3beb58c_7880_46f0_a29c_255fb717766b.slice. Jan 27 23:59:06.494951 kubelet[2953]: I0127 23:59:06.494919 2953 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 23:59:06.502871 systemd[1]: Created slice kubepods-besteffort-pod7ec7f1f1_9073_42e6_9a8a_d6f3b105d21c.slice - libcontainer container kubepods-besteffort-pod7ec7f1f1_9073_42e6_9a8a_d6f3b105d21c.slice. Jan 27 23:59:06.507981 systemd[1]: Created slice kubepods-besteffort-podcca7bd93_1a7d_448c_ad36_6b956cecc82e.slice - libcontainer container kubepods-besteffort-podcca7bd93_1a7d_448c_ad36_6b956cecc82e.slice. Jan 27 23:59:06.509587 kubelet[2953]: I0127 23:59:06.509171 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9wc\" (UniqueName: \"kubernetes.io/projected/43426337-f017-4b99-a432-b642f2eafaaa-kube-api-access-ch9wc\") pod \"coredns-66bc5c9577-wfxbj\" (UID: \"43426337-f017-4b99-a432-b642f2eafaaa\") " pod="kube-system/coredns-66bc5c9577-wfxbj" Jan 27 23:59:06.509587 kubelet[2953]: I0127 23:59:06.509218 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e61938e0-7077-4d81-9b34-6430d54d8b9f-goldmane-key-pair\") pod \"goldmane-7c778bb748-sv2v9\" (UID: \"e61938e0-7077-4d81-9b34-6430d54d8b9f\") " pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.509587 kubelet[2953]: I0127 23:59:06.509235 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8s2\" (UniqueName: \"kubernetes.io/projected/e61938e0-7077-4d81-9b34-6430d54d8b9f-kube-api-access-qn8s2\") pod \"goldmane-7c778bb748-sv2v9\" (UID: \"e61938e0-7077-4d81-9b34-6430d54d8b9f\") " pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.509587 kubelet[2953]: I0127 23:59:06.509252 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddq6h\" (UniqueName: \"kubernetes.io/projected/7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c-kube-api-access-ddq6h\") pod \"calico-apiserver-bb4448d88-2hkvv\" (UID: \"7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c\") " pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" Jan 27 23:59:06.509587 kubelet[2953]: I0127 23:59:06.509268 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61938e0-7077-4d81-9b34-6430d54d8b9f-config\") pod \"goldmane-7c778bb748-sv2v9\" (UID: \"e61938e0-7077-4d81-9b34-6430d54d8b9f\") " pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.509801 kubelet[2953]: I0127 23:59:06.509285 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e0df84e-16dc-494b-a3d3-1071788a0777-tigera-ca-bundle\") pod \"calico-kube-controllers-fcc6fc97b-npt4s\" (UID: \"2e0df84e-16dc-494b-a3d3-1071788a0777\") " pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" Jan 27 23:59:06.509801 kubelet[2953]: I0127 23:59:06.509300 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5k8l\" (UniqueName: \"kubernetes.io/projected/2e0df84e-16dc-494b-a3d3-1071788a0777-kube-api-access-g5k8l\") pod 
\"calico-kube-controllers-fcc6fc97b-npt4s\" (UID: \"2e0df84e-16dc-494b-a3d3-1071788a0777\") " pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" Jan 27 23:59:06.509801 kubelet[2953]: I0127 23:59:06.509317 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3beb58c-7880-46f0-a29c-255fb717766b-config-volume\") pod \"coredns-66bc5c9577-mp25r\" (UID: \"c3beb58c-7880-46f0-a29c-255fb717766b\") " pod="kube-system/coredns-66bc5c9577-mp25r" Jan 27 23:59:06.509801 kubelet[2953]: I0127 23:59:06.509331 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cca7bd93-1a7d-448c-ad36-6b956cecc82e-calico-apiserver-certs\") pod \"calico-apiserver-bb4448d88-jlz2t\" (UID: \"cca7bd93-1a7d-448c-ad36-6b956cecc82e\") " pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" Jan 27 23:59:06.509801 kubelet[2953]: I0127 23:59:06.509348 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bplpj\" (UniqueName: \"kubernetes.io/projected/cca7bd93-1a7d-448c-ad36-6b956cecc82e-kube-api-access-bplpj\") pod \"calico-apiserver-bb4448d88-jlz2t\" (UID: \"cca7bd93-1a7d-448c-ad36-6b956cecc82e\") " pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" Jan 27 23:59:06.510769 kubelet[2953]: I0127 23:59:06.510041 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-ca-bundle\") pod \"whisker-5bbf6f57f7-vprfw\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " pod="calico-system/whisker-5bbf6f57f7-vprfw" Jan 27 23:59:06.510878 kubelet[2953]: I0127 23:59:06.510803 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l9cp\" (UniqueName: \"kubernetes.io/projected/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-kube-api-access-7l9cp\") pod \"whisker-5bbf6f57f7-vprfw\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " pod="calico-system/whisker-5bbf6f57f7-vprfw" Jan 27 23:59:06.510878 kubelet[2953]: I0127 23:59:06.510828 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wvhp\" (UniqueName: \"kubernetes.io/projected/c3beb58c-7880-46f0-a29c-255fb717766b-kube-api-access-6wvhp\") pod \"coredns-66bc5c9577-mp25r\" (UID: \"c3beb58c-7880-46f0-a29c-255fb717766b\") " pod="kube-system/coredns-66bc5c9577-mp25r" Jan 27 23:59:06.510878 kubelet[2953]: I0127 23:59:06.510846 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c-calico-apiserver-certs\") pod \"calico-apiserver-bb4448d88-2hkvv\" (UID: \"7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c\") " pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" Jan 27 23:59:06.510878 kubelet[2953]: I0127 23:59:06.510862 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-backend-key-pair\") pod \"whisker-5bbf6f57f7-vprfw\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " pod="calico-system/whisker-5bbf6f57f7-vprfw" Jan 27 23:59:06.511023 kubelet[2953]: I0127 
23:59:06.510881 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43426337-f017-4b99-a432-b642f2eafaaa-config-volume\") pod \"coredns-66bc5c9577-wfxbj\" (UID: \"43426337-f017-4b99-a432-b642f2eafaaa\") " pod="kube-system/coredns-66bc5c9577-wfxbj" Jan 27 23:59:06.511023 kubelet[2953]: I0127 23:59:06.510897 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61938e0-7077-4d81-9b34-6430d54d8b9f-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-sv2v9\" (UID: \"e61938e0-7077-4d81-9b34-6430d54d8b9f\") " pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.515027 systemd[1]: Created slice kubepods-besteffort-pod2e0df84e_16dc_494b_a3d3_1071788a0777.slice - libcontainer container kubepods-besteffort-pod2e0df84e_16dc_494b_a3d3_1071788a0777.slice. Jan 27 23:59:06.520633 systemd[1]: Created slice kubepods-besteffort-podcbf32e17_cd7c_4af0_a1fc_ed07dd768807.slice - libcontainer container kubepods-besteffort-podcbf32e17_cd7c_4af0_a1fc_ed07dd768807.slice. Jan 27 23:59:06.528017 systemd[1]: Created slice kubepods-besteffort-pode61938e0_7077_4d81_9b34_6430d54d8b9f.slice - libcontainer container kubepods-besteffort-pode61938e0_7077_4d81_9b34_6430d54d8b9f.slice. Jan 27 23:59:06.547000 audit[3760]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3760 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:06.547000 audit[3760]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe5fc6940 a2=0 a3=1 items=0 ppid=3063 pid=3760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:06.547000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:06.552000 audit[3760]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3760 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:06.552000 audit[3760]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe5fc6940 a2=0 a3=1 items=0 ppid=3063 pid=3760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:06.552000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:06.788217 containerd[1664]: time="2026-01-27T23:59:06.788175200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 27 23:59:06.793012 containerd[1664]: time="2026-01-27T23:59:06.792972974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wfxbj,Uid:43426337-f017-4b99-a432-b642f2eafaaa,Namespace:kube-system,Attempt:0,}" Jan 27 23:59:06.809844 containerd[1664]: time="2026-01-27T23:59:06.809759746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mp25r,Uid:c3beb58c-7880-46f0-a29c-255fb717766b,Namespace:kube-system,Attempt:0,}" Jan 27 23:59:06.811628 containerd[1664]: time="2026-01-27T23:59:06.811600832Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-bb4448d88-2hkvv,Uid:7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c,Namespace:calico-apiserver,Attempt:0,}" Jan 27 23:59:06.815754 containerd[1664]: time="2026-01-27T23:59:06.815681164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-jlz2t,Uid:cca7bd93-1a7d-448c-ad36-6b956cecc82e,Namespace:calico-apiserver,Attempt:0,}" Jan 27 23:59:06.823036 containerd[1664]: time="2026-01-27T23:59:06.822999387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fcc6fc97b-npt4s,Uid:2e0df84e-16dc-494b-a3d3-1071788a0777,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:06.826834 containerd[1664]: time="2026-01-27T23:59:06.826777878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbf6f57f7-vprfw,Uid:cbf32e17-cd7c-4af0-a1fc-ed07dd768807,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:06.836422 containerd[1664]: time="2026-01-27T23:59:06.836322628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sv2v9,Uid:e61938e0-7077-4d81-9b34-6430d54d8b9f,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:06.915089 containerd[1664]: time="2026-01-27T23:59:06.915001710Z" level=error msg="Failed to destroy network for sandbox \"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.918060 systemd[1]: run-netns-cni\x2d191e1235\x2df7b4\x2d8486\x2d65a6\x2d7ef8e2c7a242.mount: Deactivated successfully. Jan 27 23:59:06.922905 containerd[1664]: time="2026-01-27T23:59:06.922846294Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wfxbj,Uid:43426337-f017-4b99-a432-b642f2eafaaa,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.923127 kubelet[2953]: E0127 23:59:06.923079 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.923393 kubelet[2953]: E0127 23:59:06.923161 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wfxbj" Jan 27 23:59:06.923393 kubelet[2953]: E0127 23:59:06.923180 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wfxbj" Jan 27 23:59:06.923393 kubelet[2953]: E0127 23:59:06.923227 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-wfxbj_kube-system(43426337-f017-4b99-a432-b642f2eafaaa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-wfxbj_kube-system(43426337-f017-4b99-a432-b642f2eafaaa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"71ff86fd7b36963ebfe4a11fafd20a5812d32a55a6d76f76dfea22141b685266\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-wfxbj" podUID="43426337-f017-4b99-a432-b642f2eafaaa" Jan 27 23:59:06.926941 containerd[1664]: time="2026-01-27T23:59:06.926595065Z" level=error msg="Failed to destroy network for sandbox \"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.928224 containerd[1664]: time="2026-01-27T23:59:06.928174150Z" level=error msg="Failed to destroy network for sandbox \"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.929154 systemd[1]: run-netns-cni\x2dc5d67a0a\x2dc7e6\x2d9591\x2dedf1\x2d17e1b0adf25f.mount: Deactivated successfully. Jan 27 23:59:06.932023 systemd[1]: run-netns-cni\x2d58f1478b\x2dfc82\x2db4c3\x2d2ea3\x2de487b72afa28.mount: Deactivated successfully. 
Jan 27 23:59:06.933763 containerd[1664]: time="2026-01-27T23:59:06.933695207Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mp25r,Uid:c3beb58c-7880-46f0-a29c-255fb717766b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.934003 kubelet[2953]: E0127 23:59:06.933962 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.934071 kubelet[2953]: E0127 23:59:06.934022 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mp25r" Jan 27 23:59:06.934071 kubelet[2953]: E0127 23:59:06.934045 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mp25r" Jan 27 23:59:06.934234 kubelet[2953]: E0127 23:59:06.934097 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-mp25r_kube-system(c3beb58c-7880-46f0-a29c-255fb717766b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-mp25r_kube-system(c3beb58c-7880-46f0-a29c-255fb717766b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e224152f9cf3c782141833f4debf380b2907befa85ed5feabb8f7deb187bb04d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-mp25r" podUID="c3beb58c-7880-46f0-a29c-255fb717766b" Jan 27 23:59:06.938026 containerd[1664]: time="2026-01-27T23:59:06.937979260Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-2hkvv,Uid:7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.938239 kubelet[2953]: E0127 23:59:06.938203 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.938293 kubelet[2953]: E0127 23:59:06.938260 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" Jan 27 23:59:06.938293 kubelet[2953]: E0127 23:59:06.938280 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" Jan 27 23:59:06.938359 kubelet[2953]: E0127 23:59:06.938325 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a8acd0742df4ba7018a9414c62c8cee054e4546fa44e787519d1ab482eeabee4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:06.943066 containerd[1664]: time="2026-01-27T23:59:06.942951396Z" level=error msg="Failed to destroy network for sandbox \"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.945738 systemd[1]: run-netns-cni\x2ddddc4cd9\x2dc1ba\x2d72b8\x2d9791\x2deb5a718e8477.mount: Deactivated successfully. 
Jan 27 23:59:06.951692 containerd[1664]: time="2026-01-27T23:59:06.951648703Z" level=error msg="Failed to destroy network for sandbox \"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.953928 containerd[1664]: time="2026-01-27T23:59:06.953812589Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-jlz2t,Uid:cca7bd93-1a7d-448c-ad36-6b956cecc82e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.954280 kubelet[2953]: E0127 23:59:06.954064 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.954343 kubelet[2953]: E0127 23:59:06.954296 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" Jan 27 23:59:06.954343 kubelet[2953]: E0127 23:59:06.954318 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" Jan 27 23:59:06.954660 kubelet[2953]: E0127 23:59:06.954375 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e510e49ef97276be2833e4545222d04b238ea4bd9c86e4ccad8da2c4c3063e3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:06.955675 containerd[1664]: time="2026-01-27T23:59:06.955630475Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbf6f57f7-vprfw,Uid:cbf32e17-cd7c-4af0-a1fc-ed07dd768807,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.956009 kubelet[2953]: E0127 23:59:06.955981 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.956052 kubelet[2953]: E0127 23:59:06.956023 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbf6f57f7-vprfw" Jan 27 23:59:06.956052 kubelet[2953]: E0127 23:59:06.956045 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbf6f57f7-vprfw" Jan 27 23:59:06.956113 kubelet[2953]: E0127 23:59:06.956091 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5bbf6f57f7-vprfw_calico-system(cbf32e17-cd7c-4af0-a1fc-ed07dd768807)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5bbf6f57f7-vprfw_calico-system(cbf32e17-cd7c-4af0-a1fc-ed07dd768807)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0be5990ad77352a778be0553f2ce2ca18f583a2d7f83b7ed18c0ea3ccefec3d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bbf6f57f7-vprfw" podUID="cbf32e17-cd7c-4af0-a1fc-ed07dd768807" Jan 27 23:59:06.956268 containerd[1664]: time="2026-01-27T23:59:06.956241197Z" level=error msg="Failed to destroy network for sandbox \"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.959451 containerd[1664]: time="2026-01-27T23:59:06.959412046Z" level=error msg="Failed to destroy network for sandbox \"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.959817 containerd[1664]: time="2026-01-27T23:59:06.959787008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sv2v9,Uid:e61938e0-7077-4d81-9b34-6430d54d8b9f,Namespace:calico-system,Attempt:0,} 
failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.960016 kubelet[2953]: E0127 23:59:06.959988 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.960083 kubelet[2953]: E0127 23:59:06.960051 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.960120 kubelet[2953]: E0127 23:59:06.960086 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sv2v9" Jan 27 23:59:06.960155 kubelet[2953]: E0127 23:59:06.960134 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"31dc17035fd1e47a88afe56125ac897a6323fae37a7b258a361cf387c8f2c313\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:06.963481 containerd[1664]: time="2026-01-27T23:59:06.963440659Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fcc6fc97b-npt4s,Uid:2e0df84e-16dc-494b-a3d3-1071788a0777,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.963733 kubelet[2953]: E0127 23:59:06.963675 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 27 23:59:06.963895 kubelet[2953]: E0127 23:59:06.963810 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" Jan 27 23:59:06.963895 kubelet[2953]: E0127 23:59:06.963831 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" Jan 27 23:59:06.964066 kubelet[2953]: E0127 23:59:06.964023 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48256df2c0dc930182397714eee430ff5c80a6503f48a2b1cc715efd85615e74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:07.669846 systemd[1]: Created slice kubepods-besteffort-pod1d6f938d_8e51_4e63_b408_0de368dbd7d7.slice - libcontainer container kubepods-besteffort-pod1d6f938d_8e51_4e63_b408_0de368dbd7d7.slice. 
Jan 27 23:59:07.673771 containerd[1664]: time="2026-01-27T23:59:07.673713204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bmscm,Uid:1d6f938d-8e51-4e63-b408-0de368dbd7d7,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:07.718835 containerd[1664]: time="2026-01-27T23:59:07.718783183Z" level=error msg="Failed to destroy network for sandbox \"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:07.722203 containerd[1664]: time="2026-01-27T23:59:07.722102673Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bmscm,Uid:1d6f938d-8e51-4e63-b408-0de368dbd7d7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:07.722659 kubelet[2953]: E0127 23:59:07.722496 2953 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 23:59:07.722659 kubelet[2953]: E0127 23:59:07.722619 2953 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bmscm" Jan 27 23:59:07.722659 kubelet[2953]: E0127 23:59:07.722638 2953 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bmscm" Jan 27 23:59:07.722930 kubelet[2953]: E0127 23:59:07.722870 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f8c777ffd5b351e9700917d8e0f566c7aeca91fc5991dc60291cb1fb9982a27a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:07.851050 systemd[1]: run-netns-cni\x2d13abe96d\x2dca6a\x2da5f2\x2d684f\x2d8079635d1ebc.mount: Deactivated successfully. 
Jan 27 23:59:07.851147 systemd[1]: run-netns-cni\x2d2bbddb15\x2d83a9\x2d6cc3\x2d3c7f\x2de938af454396.mount: Deactivated successfully. Jan 27 23:59:07.851194 systemd[1]: run-netns-cni\x2de98d233d\x2d27d4\x2d554d\x2d1e89\x2d2e73d4207be2.mount: Deactivated successfully. Jan 27 23:59:14.213254 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2740848308.mount: Deactivated successfully. Jan 27 23:59:14.233432 containerd[1664]: time="2026-01-27T23:59:14.233364785Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:14.234084 containerd[1664]: time="2026-01-27T23:59:14.233999987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 27 23:59:14.235131 containerd[1664]: time="2026-01-27T23:59:14.235071190Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:14.236974 containerd[1664]: time="2026-01-27T23:59:14.236922316Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 23:59:14.237506 containerd[1664]: time="2026-01-27T23:59:14.237463037Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 7.449241197s" Jan 27 23:59:14.237506 containerd[1664]: time="2026-01-27T23:59:14.237501518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 27 23:59:14.258240 containerd[1664]: time="2026-01-27T23:59:14.258185901Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 27 23:59:14.266943 containerd[1664]: time="2026-01-27T23:59:14.266889568Z" level=info msg="Container a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:59:14.277830 containerd[1664]: time="2026-01-27T23:59:14.277787202Z" level=info msg="CreateContainer within sandbox \"b744139da8c5db03d938f024b86c3e138e92a25e069902faa2774e98d8524f84\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76\"" Jan 27 23:59:14.278937 containerd[1664]: time="2026-01-27T23:59:14.278905685Z" level=info msg="StartContainer for \"a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76\"" Jan 27 23:59:14.280501 containerd[1664]: time="2026-01-27T23:59:14.280428410Z" level=info msg="connecting to shim a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76" address="unix:///run/containerd/s/4394089d34f0151c924aa43d20700525c9478f1400ca1819b233cdc17f65b8a7" protocol=ttrpc version=3 Jan 27 23:59:14.301961 systemd[1]: Started cri-containerd-a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76.scope - libcontainer container a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76. 
Jan 27 23:59:14.360000 audit: BPF prog-id=172 op=LOAD Jan 27 23:59:14.362036 kernel: kauditd_printk_skb: 28 callbacks suppressed Jan 27 23:59:14.362093 kernel: audit: type=1334 audit(1769558354.360:577): prog-id=172 op=LOAD Jan 27 23:59:14.360000 audit[4039]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.366462 kernel: audit: type=1300 audit(1769558354.360:577): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.366542 kernel: audit: type=1327 audit(1769558354.360:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.360000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.360000 audit: BPF prog-id=173 op=LOAD Jan 27 23:59:14.370797 kernel: audit: type=1334 audit(1769558354.360:578): prog-id=173 op=LOAD Jan 27 23:59:14.370873 kernel: audit: type=1300 audit(1769558354.360:578): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.360000 audit[4039]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.360000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.377908 kernel: audit: type=1327 audit(1769558354.360:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.361000 audit: BPF prog-id=173 op=UNLOAD Jan 27 23:59:14.378744 kernel: audit: type=1334 audit(1769558354.361:579): prog-id=173 op=UNLOAD Jan 27 23:59:14.361000 audit[4039]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.382473 kernel: audit: type=1300 
audit(1769558354.361:579): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.386364 kernel: audit: type=1327 audit(1769558354.361:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.386435 kernel: audit: type=1334 audit(1769558354.361:580): prog-id=172 op=UNLOAD Jan 27 23:59:14.361000 audit: BPF prog-id=172 op=UNLOAD Jan 27 23:59:14.361000 audit[4039]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.361000 audit: BPF prog-id=174 op=LOAD Jan 27 23:59:14.361000 audit[4039]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3499 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:14.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134306161363737383237326466613238376461383932323033316637 Jan 27 23:59:14.403584 containerd[1664]: time="2026-01-27T23:59:14.403544548Z" level=info msg="StartContainer for \"a40aa6778272dfa287da8922031f75e319382e1da93588b82de0e9335f908a76\" returns successfully" Jan 27 23:59:14.567498 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 27 23:59:14.567643 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
Jan 27 23:59:14.769204 kubelet[2953]: I0127 23:59:14.769152 2953 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-ca-bundle\") pod \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " Jan 27 23:59:14.769204 kubelet[2953]: I0127 23:59:14.769208 2953 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-backend-key-pair\") pod \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " Jan 27 23:59:14.769591 kubelet[2953]: I0127 23:59:14.769251 2953 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l9cp\" (UniqueName: \"kubernetes.io/projected/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-kube-api-access-7l9cp\") pod \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\" (UID: \"cbf32e17-cd7c-4af0-a1fc-ed07dd768807\") " Jan 27 23:59:14.770002 kubelet[2953]: I0127 23:59:14.769875 2953 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cbf32e17-cd7c-4af0-a1fc-ed07dd768807" (UID: "cbf32e17-cd7c-4af0-a1fc-ed07dd768807"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 27 23:59:14.772927 kubelet[2953]: I0127 23:59:14.772859 2953 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cbf32e17-cd7c-4af0-a1fc-ed07dd768807" (UID: "cbf32e17-cd7c-4af0-a1fc-ed07dd768807"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 27 23:59:14.773289 kubelet[2953]: I0127 23:59:14.773243 2953 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-kube-api-access-7l9cp" (OuterVolumeSpecName: "kube-api-access-7l9cp") pod "cbf32e17-cd7c-4af0-a1fc-ed07dd768807" (UID: "cbf32e17-cd7c-4af0-a1fc-ed07dd768807"). InnerVolumeSpecName "kube-api-access-7l9cp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 27 23:59:14.819136 systemd[1]: Removed slice kubepods-besteffort-podcbf32e17_cd7c_4af0_a1fc_ed07dd768807.slice - libcontainer container kubepods-besteffort-podcbf32e17_cd7c_4af0_a1fc_ed07dd768807.slice. 
Jan 27 23:59:14.832673 kubelet[2953]: I0127 23:59:14.832465 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-z88kf" podStartSLOduration=1.877726983 podStartE2EDuration="17.832447028s" podCreationTimestamp="2026-01-27 23:58:57 +0000 UTC" firstStartedPulling="2026-01-27 23:58:58.283539075 +0000 UTC m=+28.715417577" lastFinishedPulling="2026-01-27 23:59:14.23825908 +0000 UTC m=+44.670137622" observedRunningTime="2026-01-27 23:59:14.830710463 +0000 UTC m=+45.262589005" watchObservedRunningTime="2026-01-27 23:59:14.832447028 +0000 UTC m=+45.264325610" Jan 27 23:59:14.871220 kubelet[2953]: I0127 23:59:14.871163 2953 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7l9cp\" (UniqueName: \"kubernetes.io/projected/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-kube-api-access-7l9cp\") on node \"ci-4593-0-0-n-485d202ac1\" DevicePath \"\"" Jan 27 23:59:14.871220 kubelet[2953]: I0127 23:59:14.871212 2953 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-ca-bundle\") on node \"ci-4593-0-0-n-485d202ac1\" DevicePath \"\"" Jan 27 23:59:14.871220 kubelet[2953]: I0127 23:59:14.871223 2953 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cbf32e17-cd7c-4af0-a1fc-ed07dd768807-whisker-backend-key-pair\") on node \"ci-4593-0-0-n-485d202ac1\" DevicePath \"\"" Jan 27 23:59:14.891911 systemd[1]: Created slice kubepods-besteffort-poda85ad95c_92af_4836_ae16_c3e124882e38.slice - libcontainer container kubepods-besteffort-poda85ad95c_92af_4836_ae16_c3e124882e38.slice. Jan 27 23:59:14.972654 kubelet[2953]: I0127 23:59:14.972385 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a85ad95c-92af-4836-ae16-c3e124882e38-whisker-backend-key-pair\") pod \"whisker-79c89656f-55nfw\" (UID: \"a85ad95c-92af-4836-ae16-c3e124882e38\") " pod="calico-system/whisker-79c89656f-55nfw" Jan 27 23:59:14.972654 kubelet[2953]: I0127 23:59:14.972557 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcdr\" (UniqueName: \"kubernetes.io/projected/a85ad95c-92af-4836-ae16-c3e124882e38-kube-api-access-zmcdr\") pod \"whisker-79c89656f-55nfw\" (UID: \"a85ad95c-92af-4836-ae16-c3e124882e38\") " pod="calico-system/whisker-79c89656f-55nfw" Jan 27 23:59:14.972654 kubelet[2953]: I0127 23:59:14.972578 2953 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85ad95c-92af-4836-ae16-c3e124882e38-whisker-ca-bundle\") pod \"whisker-79c89656f-55nfw\" (UID: \"a85ad95c-92af-4836-ae16-c3e124882e38\") " pod="calico-system/whisker-79c89656f-55nfw" Jan 27 23:59:15.198100 containerd[1664]: time="2026-01-27T23:59:15.197949472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79c89656f-55nfw,Uid:a85ad95c-92af-4836-ae16-c3e124882e38,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:15.217415 systemd[1]: var-lib-kubelet-pods-cbf32e17\x2dcd7c\x2d4af0\x2da1fc\x2ded07dd768807-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7l9cp.mount: Deactivated successfully. 
Jan 27 23:59:15.217515 systemd[1]: var-lib-kubelet-pods-cbf32e17\x2dcd7c\x2d4af0\x2da1fc\x2ded07dd768807-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 27 23:59:15.340281 systemd-networkd[1575]: cali5db10503b96: Link UP Jan 27 23:59:15.340436 systemd-networkd[1575]: cali5db10503b96: Gained carrier Jan 27 23:59:15.353144 containerd[1664]: 2026-01-27 23:59:15.226 [INFO][4104] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 27 23:59:15.353144 containerd[1664]: 2026-01-27 23:59:15.245 [INFO][4104] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0 whisker-79c89656f- calico-system a85ad95c-92af-4836-ae16-c3e124882e38 886 0 2026-01-27 23:59:14 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79c89656f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 whisker-79c89656f-55nfw eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5db10503b96 [] [] }} ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-" Jan 27 23:59:15.353144 containerd[1664]: 2026-01-27 23:59:15.245 [INFO][4104] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.353144 containerd[1664]: 2026-01-27 23:59:15.292 [INFO][4119] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" HandleID="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Workload="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.292 [INFO][4119] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" HandleID="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Workload="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323b50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"whisker-79c89656f-55nfw", "timestamp":"2026-01-27 23:59:15.292701204 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.292 [INFO][4119] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.293 [INFO][4119] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.293 [INFO][4119] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.303 [INFO][4119] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.309 [INFO][4119] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.314 [INFO][4119] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.316 [INFO][4119] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353562 containerd[1664]: 2026-01-27 23:59:15.318 [INFO][4119] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.318 [INFO][4119] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.320 [INFO][4119] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4 Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.325 [INFO][4119] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.330 [INFO][4119] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.1/26] block=192.168.91.0/26 handle="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.330 [INFO][4119] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.1/26] handle="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.330 [INFO][4119] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 23:59:15.353788 containerd[1664]: 2026-01-27 23:59:15.330 [INFO][4119] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.1/26] IPv6=[] ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" HandleID="k8s-pod-network.e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Workload="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.353914 containerd[1664]: 2026-01-27 23:59:15.332 [INFO][4104] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0", GenerateName:"whisker-79c89656f-", Namespace:"calico-system", SelfLink:"", UID:"a85ad95c-92af-4836-ae16-c3e124882e38", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 59, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79c89656f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"whisker-79c89656f-55nfw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5db10503b96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:15.353914 containerd[1664]: 2026-01-27 23:59:15.332 [INFO][4104] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.1/32] ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.353984 containerd[1664]: 2026-01-27 23:59:15.333 [INFO][4104] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5db10503b96 ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.353984 containerd[1664]: 2026-01-27 23:59:15.340 [INFO][4104] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.354024 containerd[1664]: 2026-01-27 23:59:15.340 [INFO][4104] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" 
Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0", GenerateName:"whisker-79c89656f-", Namespace:"calico-system", SelfLink:"", UID:"a85ad95c-92af-4836-ae16-c3e124882e38", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 59, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79c89656f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4", Pod:"whisker-79c89656f-55nfw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5db10503b96", MAC:"7e:0c:07:2d:b0:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:15.354068 containerd[1664]: 2026-01-27 23:59:15.350 [INFO][4104] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" Namespace="calico-system" Pod="whisker-79c89656f-55nfw" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-whisker--79c89656f--55nfw-eth0" Jan 27 23:59:15.374280 containerd[1664]: time="2026-01-27T23:59:15.374220455Z" level=info msg="connecting to shim e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4" address="unix:///run/containerd/s/81374c2cdb46dd3784b580a3c579eefd25b06dacb5181f549c297eff60a7155f" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:15.401981 systemd[1]: Started cri-containerd-e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4.scope - libcontainer container e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4. 
Jan 27 23:59:15.411000 audit: BPF prog-id=175 op=LOAD Jan 27 23:59:15.411000 audit: BPF prog-id=176 op=LOAD Jan 27 23:59:15.411000 audit[4153]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.412000 audit: BPF prog-id=176 op=UNLOAD Jan 27 23:59:15.412000 audit[4153]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.412000 audit: BPF prog-id=177 op=LOAD Jan 27 23:59:15.412000 audit[4153]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.412000 audit: BPF prog-id=178 op=LOAD Jan 27 23:59:15.412000 audit[4153]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.412000 audit: BPF prog-id=178 op=UNLOAD Jan 27 23:59:15.412000 audit[4153]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.413000 audit: BPF prog-id=177 op=UNLOAD Jan 27 23:59:15.413000 audit[4153]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.413000 audit: BPF prog-id=179 op=LOAD Jan 27 23:59:15.413000 audit[4153]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4142 pid=4153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:15.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531613431383135383337656566643361313765323933616363316536 Jan 27 23:59:15.435314 containerd[1664]: time="2026-01-27T23:59:15.435272123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79c89656f-55nfw,Uid:a85ad95c-92af-4836-ae16-c3e124882e38,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1a41815837eefd3a17e293acc1e6c7e25c72dbb10ceb08c50a50703ab3f69c4\"" Jan 27 23:59:15.437105 containerd[1664]: time="2026-01-27T23:59:15.436886048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 23:59:15.663506 kubelet[2953]: I0127 23:59:15.663450 2953 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf32e17-cd7c-4af0-a1fc-ed07dd768807" path="/var/lib/kubelet/pods/cbf32e17-cd7c-4af0-a1fc-ed07dd768807/volumes" Jan 27 23:59:15.760503 containerd[1664]: time="2026-01-27T23:59:15.760442843Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:15.762121 containerd[1664]: time="2026-01-27T23:59:15.762060488Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 23:59:15.762197 containerd[1664]: time="2026-01-27T23:59:15.762117688Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:15.762389 kubelet[2953]: E0127 23:59:15.762355 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:15.762484 kubelet[2953]: E0127 23:59:15.762471 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:15.762624 kubelet[2953]: E0127 23:59:15.762604 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod 
whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:15.765021 containerd[1664]: time="2026-01-27T23:59:15.764961657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 23:59:16.097305 containerd[1664]: time="2026-01-27T23:59:16.096539237Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:16.097751 containerd[1664]: time="2026-01-27T23:59:16.097696921Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 23:59:16.097873 containerd[1664]: time="2026-01-27T23:59:16.097743601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:16.098129 kubelet[2953]: E0127 23:59:16.098086 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:16.098394 kubelet[2953]: E0127 23:59:16.098135 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:16.098394 kubelet[2953]: E0127 23:59:16.098213 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:16.098394 kubelet[2953]: E0127 23:59:16.098255 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 27 23:59:16.100000 audit: BPF prog-id=180 op=LOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd616fbe8 a2=98 a3=ffffd616fbd8 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.100000 audit: BPF prog-id=180 op=UNLOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd616fbb8 a3=0 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.100000 audit: BPF prog-id=181 op=LOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd616fa98 a2=74 a3=95 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.100000 audit: BPF prog-id=181 op=UNLOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.100000 audit: BPF prog-id=182 op=LOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd616fac8 a2=40 a3=ffffd616faf8 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.100000 audit: BPF prog-id=182 op=UNLOAD Jan 27 23:59:16.100000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd616faf8 items=0 ppid=4236 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.100000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 23:59:16.102000 audit: BPF prog-id=183 op=LOAD Jan 27 23:59:16.102000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffec7f7ff8 a2=98 a3=ffffec7f7fe8 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.102000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.102000 audit: BPF prog-id=183 op=UNLOAD Jan 27 23:59:16.102000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffec7f7fc8 a3=0 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.102000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.103000 audit: BPF prog-id=184 op=LOAD Jan 27 23:59:16.103000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffec7f7c88 a2=74 a3=95 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.103000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.103000 audit: BPF prog-id=184 op=UNLOAD Jan 27 23:59:16.103000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.103000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.103000 audit: BPF prog-id=185 op=LOAD Jan 27 23:59:16.103000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffec7f7ce8 a2=94 a3=2 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.103000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.103000 audit: BPF prog-id=185 op=UNLOAD Jan 27 23:59:16.103000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.103000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.206000 audit: BPF prog-id=186 op=LOAD Jan 27 23:59:16.206000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffec7f7ca8 a2=40 a3=ffffec7f7cd8 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 27 23:59:16.206000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.208000 audit: BPF prog-id=186 op=UNLOAD Jan 27 23:59:16.208000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffec7f7cd8 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.208000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.219000 audit: BPF prog-id=187 op=LOAD Jan 27 23:59:16.219000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffec7f7cb8 a2=94 a3=4 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.219000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.219000 audit: BPF prog-id=187 op=UNLOAD Jan 27 23:59:16.219000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.219000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.219000 audit: BPF prog-id=188 op=LOAD Jan 27 23:59:16.219000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffec7f7af8 a2=94 a3=5 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.219000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.219000 audit: BPF prog-id=188 op=UNLOAD Jan 27 23:59:16.219000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.219000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.220000 audit: BPF prog-id=189 op=LOAD Jan 27 23:59:16.220000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffec7f7d28 a2=94 a3=6 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.220000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.220000 audit: BPF prog-id=189 op=UNLOAD Jan 27 23:59:16.220000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.220000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.220000 audit: BPF prog-id=190 op=LOAD Jan 27 23:59:16.220000 audit[4335]: SYSCALL 
arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffec7f74f8 a2=94 a3=83 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.220000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.220000 audit: BPF prog-id=191 op=LOAD Jan 27 23:59:16.220000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffec7f72b8 a2=94 a3=2 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.220000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.220000 audit: BPF prog-id=191 op=UNLOAD Jan 27 23:59:16.220000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.220000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.221000 audit: BPF prog-id=190 op=UNLOAD Jan 27 23:59:16.221000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=8256620 a3=8249b00 items=0 ppid=4236 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.221000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 23:59:16.231000 audit: BPF prog-id=192 op=LOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc2dfeb28 a2=98 a3=ffffc2dfeb18 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.231000 audit: BPF prog-id=192 op=UNLOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc2dfeaf8 a3=0 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.231000 audit: BPF prog-id=193 op=LOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc2dfe9d8 a2=74 a3=95 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.231000 audit: BPF prog-id=193 op=UNLOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.231000 audit: BPF prog-id=194 op=LOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc2dfea08 a2=40 a3=ffffc2dfea38 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.231000 audit: BPF prog-id=194 op=UNLOAD Jan 27 23:59:16.231000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffc2dfea38 items=0 ppid=4236 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.231000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 23:59:16.293284 systemd-networkd[1575]: vxlan.calico: Link UP Jan 27 23:59:16.293296 systemd-networkd[1575]: vxlan.calico: Gained carrier Jan 27 23:59:16.309000 audit: BPF prog-id=195 op=LOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff7c93f38 a2=98 a3=fffff7c93f28 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=195 op=UNLOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff7c93f08 a3=0 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=196 op=LOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff7c93c18 a2=74 a3=95 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=196 op=UNLOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=197 op=LOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff7c93c78 a2=94 a3=2 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=197 op=UNLOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=198 op=LOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7c93af8 a2=40 a3=fffff7c93b28 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=198 op=UNLOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=fffff7c93b28 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=199 op=LOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7c93c48 a2=94 a3=b7 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.309000 audit: BPF prog-id=199 op=UNLOAD Jan 27 23:59:16.309000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.309000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.310000 audit: BPF prog-id=200 op=LOAD Jan 27 23:59:16.310000 audit[4364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7c932f8 a2=94 a3=2 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.310000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.310000 audit: BPF prog-id=200 op=UNLOAD Jan 27 23:59:16.310000 audit[4364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.310000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.310000 audit: BPF prog-id=201 op=LOAD Jan 27 23:59:16.310000 audit[4364]: SYSCALL arch=c00000b7 
syscall=280 success=yes exit=6 a0=5 a1=fffff7c93488 a2=94 a3=30 items=0 ppid=4236 pid=4364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.310000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 23:59:16.313000 audit: BPF prog-id=202 op=LOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff9156fa8 a2=98 a3=fffff9156f98 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.313000 audit: BPF prog-id=202 op=UNLOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff9156f78 a3=0 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.313000 audit: BPF prog-id=203 op=LOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff9156c38 a2=74 a3=95 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.313000 audit: BPF prog-id=203 op=UNLOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.313000 audit: BPF prog-id=204 op=LOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff9156c98 a2=94 a3=2 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.313000 audit: BPF prog-id=204 op=UNLOAD Jan 27 23:59:16.313000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.313000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.415000 audit: BPF prog-id=205 op=LOAD Jan 27 23:59:16.415000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff9156c58 a2=40 a3=fffff9156c88 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.415000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.415000 audit: BPF prog-id=205 op=UNLOAD Jan 27 23:59:16.415000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffff9156c88 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.415000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=206 op=LOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff9156c68 a2=94 a3=4 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=206 op=UNLOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=207 op=LOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff9156aa8 a2=94 a3=5 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=207 op=UNLOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=208 op=LOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff9156cd8 a2=94 a3=6 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=208 op=UNLOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.427000 audit: BPF prog-id=209 op=LOAD Jan 27 23:59:16.427000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff91564a8 a2=94 a3=83 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.427000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.428000 audit: BPF prog-id=210 op=LOAD Jan 27 23:59:16.428000 audit[4366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffff9156268 a2=94 a3=2 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.428000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.428000 audit: BPF prog-id=210 op=UNLOAD Jan 27 23:59:16.428000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 
items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.428000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.428000 audit: BPF prog-id=209 op=UNLOAD Jan 27 23:59:16.428000 audit[4366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=27bea620 a3=27bddb00 items=0 ppid=4236 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.428000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 23:59:16.449000 audit: BPF prog-id=201 op=UNLOAD Jan 27 23:59:16.449000 audit[4236]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000ceb040 a2=0 a3=0 items=0 ppid=4212 pid=4236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.449000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 27 23:59:16.499000 audit[4391]: NETFILTER_CFG table=mangle:119 family=2 entries=16 op=nft_register_chain pid=4391 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:16.499000 audit[4391]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffffd2eddc0 a2=0 a3=ffffbf29dfa8 items=0 ppid=4236 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.499000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:16.503000 audit[4395]: NETFILTER_CFG table=nat:120 family=2 entries=15 op=nft_register_chain pid=4395 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:16.503000 audit[4395]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=fffff2cb5f50 a2=0 a3=ffffa61defa8 items=0 ppid=4236 pid=4395 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.503000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:16.512000 audit[4394]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4394 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:16.512000 audit[4394]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffd30bdaa0 a2=0 a3=ffffa971dfa8 items=0 ppid=4236 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
27 23:59:16.512000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:16.513000 audit[4397]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4397 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:16.513000 audit[4397]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffea7cd020 a2=0 a3=ffffa7b03fa8 items=0 ppid=4236 pid=4397 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.513000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:16.818923 kubelet[2953]: E0127 23:59:16.818741 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 27 23:59:16.841000 audit[4425]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4425 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:16.841000 audit[4425]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffee7f4d00 a2=0 a3=1 items=0 ppid=3063 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.841000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:16.851000 audit[4425]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4425 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:16.851000 audit[4425]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffee7f4d00 a2=0 a3=1 items=0 ppid=3063 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:16.851000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:17.048385 systemd-networkd[1575]: cali5db10503b96: Gained IPv6LL Jan 27 23:59:17.665582 containerd[1664]: time="2026-01-27T23:59:17.665542904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mp25r,Uid:c3beb58c-7880-46f0-a29c-255fb717766b,Namespace:kube-system,Attempt:0,}" Jan 27 23:59:17.667432 containerd[1664]: 
time="2026-01-27T23:59:17.667397470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sv2v9,Uid:e61938e0-7077-4d81-9b34-6430d54d8b9f,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:17.668908 containerd[1664]: time="2026-01-27T23:59:17.668876154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fcc6fc97b-npt4s,Uid:2e0df84e-16dc-494b-a3d3-1071788a0777,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:17.795468 systemd-networkd[1575]: cali79da3fbf755: Link UP Jan 27 23:59:17.796070 systemd-networkd[1575]: cali79da3fbf755: Gained carrier Jan 27 23:59:17.814522 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4449] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0 calico-kube-controllers-fcc6fc97b- calico-system 2e0df84e-16dc-494b-a3d3-1071788a0777 820 0 2026-01-27 23:58:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:fcc6fc97b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 calico-kube-controllers-fcc6fc97b-npt4s eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali79da3fbf755 [] [] }} ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-" Jan 27 23:59:17.814522 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4449] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.814522 containerd[1664]: 2026-01-27 23:59:17.752 [INFO][4478] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" HandleID="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.752 [INFO][4478] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" HandleID="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c510), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"calico-kube-controllers-fcc6fc97b-npt4s", "timestamp":"2026-01-27 23:59:17.752208691 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.752 [INFO][4478] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.752 [INFO][4478] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.752 [INFO][4478] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.764 [INFO][4478] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.769 [INFO][4478] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.774 [INFO][4478] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.776 [INFO][4478] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.814777 containerd[1664]: 2026-01-27 23:59:17.778 [INFO][4478] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.778 [INFO][4478] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.780 [INFO][4478] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2 Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.785 [INFO][4478] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.790 [INFO][4478] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.2/26] block=192.168.91.0/26 handle="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.790 [INFO][4478] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.2/26] handle="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.791 [INFO][4478] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 23:59:17.815033 containerd[1664]: 2026-01-27 23:59:17.791 [INFO][4478] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.2/26] IPv6=[] ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" HandleID="k8s-pod-network.ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.815149 containerd[1664]: 2026-01-27 23:59:17.792 [INFO][4449] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0", GenerateName:"calico-kube-controllers-fcc6fc97b-", Namespace:"calico-system", SelfLink:"", UID:"2e0df84e-16dc-494b-a3d3-1071788a0777", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"fcc6fc97b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"calico-kube-controllers-fcc6fc97b-npt4s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79da3fbf755", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:17.815197 containerd[1664]: 2026-01-27 23:59:17.793 [INFO][4449] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.2/32] ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.815197 containerd[1664]: 2026-01-27 23:59:17.793 [INFO][4449] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79da3fbf755 ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.815197 containerd[1664]: 2026-01-27 23:59:17.796 [INFO][4449] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" 
WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.815263 containerd[1664]: 2026-01-27 23:59:17.797 [INFO][4449] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0", GenerateName:"calico-kube-controllers-fcc6fc97b-", Namespace:"calico-system", SelfLink:"", UID:"2e0df84e-16dc-494b-a3d3-1071788a0777", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"fcc6fc97b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2", Pod:"calico-kube-controllers-fcc6fc97b-npt4s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79da3fbf755", MAC:"4a:cf:91:95:c3:24", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:17.815327 containerd[1664]: 2026-01-27 23:59:17.811 [INFO][4449] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" Namespace="calico-system" Pod="calico-kube-controllers-fcc6fc97b-npt4s" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--kube--controllers--fcc6fc97b--npt4s-eth0" Jan 27 23:59:17.824000 audit[4511]: NETFILTER_CFG table=filter:125 family=2 entries=36 op=nft_register_chain pid=4511 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:17.824000 audit[4511]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19576 a0=3 a1=ffffe699bbe0 a2=0 a3=ffff8a811fa8 items=0 ppid=4236 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.824000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:17.838388 containerd[1664]: time="2026-01-27T23:59:17.838322676Z" level=info msg="connecting to shim ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2" address="unix:///run/containerd/s/562dbae0590ae603af3d510f46eab79c7c9291a83bb57895234ea9432d455318" 
namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:17.863961 systemd[1]: Started cri-containerd-ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2.scope - libcontainer container ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2. Jan 27 23:59:17.874000 audit: BPF prog-id=211 op=LOAD Jan 27 23:59:17.875000 audit: BPF prog-id=212 op=LOAD Jan 27 23:59:17.875000 audit[4532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.875000 audit: BPF prog-id=212 op=UNLOAD Jan 27 23:59:17.875000 audit[4532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.875000 audit: BPF prog-id=213 op=LOAD Jan 27 23:59:17.875000 audit[4532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.876000 audit: BPF prog-id=214 op=LOAD Jan 27 23:59:17.876000 audit[4532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.876000 audit: BPF prog-id=214 op=UNLOAD Jan 27 23:59:17.876000 audit[4532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.876000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.876000 audit: BPF prog-id=213 op=UNLOAD Jan 27 23:59:17.876000 audit[4532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.876000 audit: BPF prog-id=215 op=LOAD Jan 27 23:59:17.876000 audit[4532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4520 pid=4532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363656336366437343062383532643134353537393435353635663339 Jan 27 23:59:17.927787 systemd-networkd[1575]: cali66aca675b2c: Link UP Jan 27 23:59:17.934518 systemd-networkd[1575]: cali66aca675b2c: Gained carrier Jan 27 23:59:17.957636 containerd[1664]: time="2026-01-27T23:59:17.957590203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fcc6fc97b-npt4s,Uid:2e0df84e-16dc-494b-a3d3-1071788a0777,Namespace:calico-system,Attempt:0,} returns sandbox id \"ccec66d740b852d14557945565f39a125a83300f6b483b63f4938b827db506b2\"" Jan 27 23:59:17.957869 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4451] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0 goldmane-7c778bb748- calico-system e61938e0-7077-4d81-9b34-6430d54d8b9f 819 0 2026-01-27 23:58:55 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 goldmane-7c778bb748-sv2v9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali66aca675b2c [] [] }} ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-" Jan 27 23:59:17.957869 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4451] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.957869 containerd[1664]: 2026-01-27 23:59:17.753 [INFO][4477] ipam/ipam_plugin.go 227: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" HandleID="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Workload="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.753 [INFO][4477] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" HandleID="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Workload="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"goldmane-7c778bb748-sv2v9", "timestamp":"2026-01-27 23:59:17.753850656 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.754 [INFO][4477] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.791 [INFO][4477] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.791 [INFO][4477] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.866 [INFO][4477] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.871 [INFO][4477] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.879 [INFO][4477] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.882 [INFO][4477] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958090 containerd[1664]: 2026-01-27 23:59:17.885 [INFO][4477] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.885 [INFO][4477] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.887 [INFO][4477] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8 Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.893 [INFO][4477] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.910 [INFO][4477] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.3/26] block=192.168.91.0/26 handle="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" 
host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.910 [INFO][4477] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.3/26] handle="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.911 [INFO][4477] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 23:59:17.958647 containerd[1664]: 2026-01-27 23:59:17.911 [INFO][4477] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.3/26] IPv6=[] ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" HandleID="k8s-pod-network.4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Workload="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.958828 containerd[1664]: 2026-01-27 23:59:17.919 [INFO][4451] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e61938e0-7077-4d81-9b34-6430d54d8b9f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"goldmane-7c778bb748-sv2v9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali66aca675b2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:17.958888 containerd[1664]: 2026-01-27 23:59:17.919 [INFO][4451] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.3/32] ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.958888 containerd[1664]: 2026-01-27 23:59:17.919 [INFO][4451] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66aca675b2c ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.958888 containerd[1664]: 2026-01-27 23:59:17.934 [INFO][4451] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.958945 containerd[1664]: 2026-01-27 23:59:17.937 [INFO][4451] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e61938e0-7077-4d81-9b34-6430d54d8b9f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8", Pod:"goldmane-7c778bb748-sv2v9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali66aca675b2c", MAC:"da:f0:9b:42:29:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:17.959029 containerd[1664]: 2026-01-27 23:59:17.953 [INFO][4451] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" Namespace="calico-system" Pod="goldmane-7c778bb748-sv2v9" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-goldmane--7c778bb748--sv2v9-eth0" Jan 27 23:59:17.960798 containerd[1664]: time="2026-01-27T23:59:17.960342331Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 23:59:17.982000 audit[4569]: NETFILTER_CFG table=filter:126 family=2 entries=48 op=nft_register_chain pid=4569 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:17.982000 audit[4569]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26368 a0=3 a1=ffffff900320 a2=0 a3=ffff8f774fa8 items=0 ppid=4236 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:17.982000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:17.996072 containerd[1664]: time="2026-01-27T23:59:17.995938681Z" level=info msg="connecting to shim 4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8" 
address="unix:///run/containerd/s/5b35b96f682302a0097fb84dc98f04a01e8d958f1078ac138c6a6601ed15d5cb" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:18.011223 systemd-networkd[1575]: cali34c84bd662c: Link UP Jan 27 23:59:18.011614 systemd-networkd[1575]: cali34c84bd662c: Gained carrier Jan 27 23:59:18.028997 systemd[1]: Started cri-containerd-4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8.scope - libcontainer container 4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8. Jan 27 23:59:18.030651 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4435] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0 coredns-66bc5c9577- kube-system c3beb58c-7880-46f0-a29c-255fb717766b 810 0 2026-01-27 23:58:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 coredns-66bc5c9577-mp25r eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali34c84bd662c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-" Jan 27 23:59:18.030651 containerd[1664]: 2026-01-27 23:59:17.726 [INFO][4435] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.030651 containerd[1664]: 2026-01-27 23:59:17.755 [INFO][4480] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" HandleID="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.755 [INFO][4480] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" HandleID="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400021e320), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"coredns-66bc5c9577-mp25r", "timestamp":"2026-01-27 23:59:17.755028739 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.755 [INFO][4480] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.911 [INFO][4480] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.911 [INFO][4480] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.966 [INFO][4480] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.972 [INFO][4480] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.981 [INFO][4480] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.988 [INFO][4480] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.030813 containerd[1664]: 2026-01-27 23:59:17.992 [INFO][4480] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:17.992 [INFO][4480] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:17.994 [INFO][4480] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:18.000 [INFO][4480] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:18.006 [INFO][4480] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.4/26] block=192.168.91.0/26 handle="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:18.006 [INFO][4480] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.4/26] handle="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:18.006 [INFO][4480] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 23:59:18.031006 containerd[1664]: 2026-01-27 23:59:18.006 [INFO][4480] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.4/26] IPv6=[] ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" HandleID="k8s-pod-network.38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.031285 containerd[1664]: 2026-01-27 23:59:18.008 [INFO][4435] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"c3beb58c-7880-46f0-a29c-255fb717766b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"coredns-66bc5c9577-mp25r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali34c84bd662c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:18.031285 containerd[1664]: 2026-01-27 23:59:18.009 [INFO][4435] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.4/32] ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.031285 containerd[1664]: 2026-01-27 23:59:18.009 [INFO][4435] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali34c84bd662c ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" 
WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.031285 containerd[1664]: 2026-01-27 23:59:18.011 [INFO][4435] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.031285 containerd[1664]: 2026-01-27 23:59:18.012 [INFO][4435] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"c3beb58c-7880-46f0-a29c-255fb717766b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d", Pod:"coredns-66bc5c9577-mp25r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali34c84bd662c", MAC:"5e:b6:26:35:73:80", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:18.031974 containerd[1664]: 2026-01-27 23:59:18.028 [INFO][4435] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" Namespace="kube-system" Pod="coredns-66bc5c9577-mp25r" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--mp25r-eth0" Jan 27 23:59:18.042000 audit: BPF prog-id=216 op=LOAD Jan 27 23:59:18.043000 audit: BPF prog-id=217 op=LOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=217 op=UNLOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=218 op=LOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=219 op=LOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=219 op=UNLOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=218 op=UNLOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.043000 audit: BPF prog-id=220 op=LOAD Jan 27 23:59:18.043000 audit[4591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4579 pid=4591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464363163636461386161383930333263636632376536386531653033 Jan 27 23:59:18.045000 audit[4618]: NETFILTER_CFG table=filter:127 family=2 entries=50 op=nft_register_chain pid=4618 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:18.045000 audit[4618]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24928 a0=3 a1=fffffc760390 a2=0 a3=ffff82940fa8 items=0 ppid=4236 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.045000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:18.059595 containerd[1664]: time="2026-01-27T23:59:18.059548556Z" level=info msg="connecting to shim 38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d" address="unix:///run/containerd/s/91a32f2fe18f97f041056e90c42c1703660bd5579795c1c302ae57cabbc40c86" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:18.071188 containerd[1664]: time="2026-01-27T23:59:18.071149032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sv2v9,Uid:e61938e0-7077-4d81-9b34-6430d54d8b9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d61ccda8aa89032ccf27e68e1e03205264277ebcb0aff9f8d6056fe8722c3f8\"" Jan 27 23:59:18.071881 systemd-networkd[1575]: vxlan.calico: Gained IPv6LL Jan 27 23:59:18.090975 systemd[1]: Started cri-containerd-38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d.scope - libcontainer container 38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d. 
Jan 27 23:59:18.100000 audit: BPF prog-id=221 op=LOAD Jan 27 23:59:18.101000 audit: BPF prog-id=222 op=LOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=222 op=UNLOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=223 op=LOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=224 op=LOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=224 op=UNLOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=223 op=UNLOAD Jan 27 23:59:18.101000 audit[4643]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.101000 audit: BPF prog-id=225 op=LOAD Jan 27 23:59:18.101000 audit[4643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4627 pid=4643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.101000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338643237663565373937363762623632646661643031313738343066 Jan 27 23:59:18.124482 containerd[1664]: time="2026-01-27T23:59:18.124440156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mp25r,Uid:c3beb58c-7880-46f0-a29c-255fb717766b,Namespace:kube-system,Attempt:0,} returns sandbox id \"38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d\"" Jan 27 23:59:18.130148 containerd[1664]: time="2026-01-27T23:59:18.130098413Z" level=info msg="CreateContainer within sandbox \"38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 23:59:18.138085 containerd[1664]: time="2026-01-27T23:59:18.138044718Z" level=info msg="Container fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:59:18.145614 containerd[1664]: time="2026-01-27T23:59:18.145573021Z" level=info msg="CreateContainer within sandbox \"38d27f5e79767bb62dfad0117840f5508a86c2f0a88e5f603f361d9344fd2b4d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0\"" Jan 27 23:59:18.146139 containerd[1664]: time="2026-01-27T23:59:18.146108623Z" level=info msg="StartContainer for \"fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0\"" Jan 27 23:59:18.147509 containerd[1664]: time="2026-01-27T23:59:18.147480627Z" level=info msg="connecting to shim fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0" address="unix:///run/containerd/s/91a32f2fe18f97f041056e90c42c1703660bd5579795c1c302ae57cabbc40c86" protocol=ttrpc version=3 Jan 27 23:59:18.168964 systemd[1]: Started cri-containerd-fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0.scope - libcontainer container fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0. 
Jan 27 23:59:18.178000 audit: BPF prog-id=226 op=LOAD Jan 27 23:59:18.180000 audit: BPF prog-id=227 op=LOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=227 op=UNLOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=228 op=LOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=229 op=LOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=229 op=UNLOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=228 op=UNLOAD Jan 27 23:59:18.180000 audit[4670]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.180000 audit: BPF prog-id=230 op=LOAD Jan 27 23:59:18.180000 audit[4670]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4627 pid=4670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664343562636330386635393064633032333733613433333437363539 Jan 27 23:59:18.197910 containerd[1664]: time="2026-01-27T23:59:18.197870382Z" level=info msg="StartContainer for \"fd45bcc08f590dc02373a4334765962f8973d8400b63689045b108908db441a0\" returns successfully" Jan 27 23:59:18.305329 containerd[1664]: time="2026-01-27T23:59:18.305177912Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:18.306782 containerd[1664]: time="2026-01-27T23:59:18.306742517Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 23:59:18.306988 containerd[1664]: time="2026-01-27T23:59:18.306824197Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:18.307134 kubelet[2953]: E0127 23:59:18.307095 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 23:59:18.307511 kubelet[2953]: E0127 23:59:18.307143 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 23:59:18.307511 kubelet[2953]: E0127 23:59:18.307311 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:18.307511 kubelet[2953]: E0127 23:59:18.307343 2953 pod_workers.go:1324] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:18.308064 containerd[1664]: time="2026-01-27T23:59:18.307834440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 23:59:18.657571 containerd[1664]: time="2026-01-27T23:59:18.657364395Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:18.658964 containerd[1664]: time="2026-01-27T23:59:18.658923960Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 23:59:18.659136 containerd[1664]: time="2026-01-27T23:59:18.658984960Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:18.659362 kubelet[2953]: E0127 23:59:18.659300 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 23:59:18.659362 kubelet[2953]: E0127 23:59:18.659359 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 23:59:18.659471 kubelet[2953]: E0127 23:59:18.659446 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:18.659620 kubelet[2953]: E0127 23:59:18.659583 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:18.826762 kubelet[2953]: E0127 23:59:18.826255 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:18.827922 kubelet[2953]: E0127 23:59:18.827874 2953 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:18.848000 audit[4705]: NETFILTER_CFG table=filter:128 family=2 entries=20 op=nft_register_rule pid=4705 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:18.848000 audit[4705]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc8dc2e60 a2=0 a3=1 items=0 ppid=3063 pid=4705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.848000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:18.853410 kubelet[2953]: I0127 23:59:18.853352 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-mp25r" podStartSLOduration=42.853335798 podStartE2EDuration="42.853335798s" podCreationTimestamp="2026-01-27 23:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:59:18.83751811 +0000 UTC m=+49.269396732" watchObservedRunningTime="2026-01-27 23:59:18.853335798 +0000 UTC m=+49.285214340" Jan 27 23:59:18.853000 audit[4705]: NETFILTER_CFG table=nat:129 family=2 entries=14 op=nft_register_rule pid=4705 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:18.853000 audit[4705]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc8dc2e60 a2=0 a3=1 items=0 ppid=3063 pid=4705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:18.853000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:19.479940 systemd-networkd[1575]: cali79da3fbf755: Gained IPv6LL Jan 27 23:59:19.544077 systemd-networkd[1575]: cali34c84bd662c: Gained IPv6LL Jan 27 23:59:19.608024 systemd-networkd[1575]: cali66aca675b2c: Gained IPv6LL Jan 27 23:59:19.668750 containerd[1664]: time="2026-01-27T23:59:19.668675427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-jlz2t,Uid:cca7bd93-1a7d-448c-ad36-6b956cecc82e,Namespace:calico-apiserver,Attempt:0,}" Jan 27 23:59:19.780016 systemd-networkd[1575]: calia8f19bc085e: Link UP Jan 27 23:59:19.780415 systemd-networkd[1575]: calia8f19bc085e: Gained carrier Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.708 [INFO][4706] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0 calico-apiserver-bb4448d88- calico-apiserver cca7bd93-1a7d-448c-ad36-6b956cecc82e 811 0 2026-01-27 23:58:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:bb4448d88 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 calico-apiserver-bb4448d88-jlz2t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia8f19bc085e [] [] }} ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.709 [INFO][4706] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.732 [INFO][4721] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" HandleID="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.732 [INFO][4721] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" HandleID="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000502a80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-485d202ac1", "pod":"calico-apiserver-bb4448d88-jlz2t", "timestamp":"2026-01-27 23:59:19.732430743 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.732 [INFO][4721] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.732 [INFO][4721] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.732 [INFO][4721] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.743 [INFO][4721] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.749 [INFO][4721] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.755 [INFO][4721] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.757 [INFO][4721] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.760 [INFO][4721] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.760 [INFO][4721] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.762 [INFO][4721] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.766 [INFO][4721] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.775 [INFO][4721] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.5/26] block=192.168.91.0/26 handle="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.775 [INFO][4721] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.5/26] handle="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.776 [INFO][4721] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 23:59:19.796826 containerd[1664]: 2026-01-27 23:59:19.776 [INFO][4721] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.5/26] IPv6=[] ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" HandleID="k8s-pod-network.6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.777 [INFO][4706] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0", GenerateName:"calico-apiserver-bb4448d88-", Namespace:"calico-apiserver", SelfLink:"", UID:"cca7bd93-1a7d-448c-ad36-6b956cecc82e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bb4448d88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"calico-apiserver-bb4448d88-jlz2t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8f19bc085e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.777 [INFO][4706] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.5/32] ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.778 [INFO][4706] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8f19bc085e ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.780 [INFO][4706] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.780 [INFO][4706] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0", GenerateName:"calico-apiserver-bb4448d88-", Namespace:"calico-apiserver", SelfLink:"", UID:"cca7bd93-1a7d-448c-ad36-6b956cecc82e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bb4448d88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc", Pod:"calico-apiserver-bb4448d88-jlz2t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia8f19bc085e", MAC:"ee:51:95:09:75:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:19.797629 containerd[1664]: 2026-01-27 23:59:19.794 [INFO][4706] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-jlz2t" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--jlz2t-eth0" Jan 27 23:59:19.813000 audit[4738]: NETFILTER_CFG table=filter:130 family=2 entries=62 op=nft_register_chain pid=4738 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:19.815274 kernel: kauditd_printk_skb: 334 callbacks suppressed Jan 27 23:59:19.815327 kernel: audit: type=1325 audit(1769558359.813:695): table=filter:130 family=2 entries=62 op=nft_register_chain pid=4738 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:19.813000 audit[4738]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=31772 a0=3 a1=ffffcb6ac6a0 a2=0 a3=ffffb7bb2fa8 items=0 ppid=4236 pid=4738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.823246 kernel: audit: type=1300 audit(1769558359.813:695): arch=c00000b7 syscall=211 success=yes exit=31772 a0=3 a1=ffffcb6ac6a0 a2=0 a3=ffffb7bb2fa8 items=0 ppid=4236 pid=4738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.823340 kernel: 
audit: type=1327 audit(1769558359.813:695): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:19.813000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:19.826140 containerd[1664]: time="2026-01-27T23:59:19.826101711Z" level=info msg="connecting to shim 6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc" address="unix:///run/containerd/s/06e823b7357e8642637fce83975ac4dac1eed03b660d6b41a25ea13e708b684a" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:19.829662 kubelet[2953]: E0127 23:59:19.829591 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:19.829662 kubelet[2953]: E0127 23:59:19.829588 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:19.859989 systemd[1]: Started cri-containerd-6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc.scope - libcontainer container 6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc. 
Jan 27 23:59:19.871000 audit: BPF prog-id=231 op=LOAD Jan 27 23:59:19.873826 kernel: audit: type=1334 audit(1769558359.871:696): prog-id=231 op=LOAD Jan 27 23:59:19.873000 audit: BPF prog-id=232 op=LOAD Jan 27 23:59:19.873000 audit[4760]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.878002 kernel: audit: type=1334 audit(1769558359.873:697): prog-id=232 op=LOAD Jan 27 23:59:19.878137 kernel: audit: type=1300 audit(1769558359.873:697): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.878200 kernel: audit: type=1327 audit(1769558359.873:697): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.873000 audit: BPF prog-id=232 op=UNLOAD Jan 27 23:59:19.882220 kernel: audit: type=1334 audit(1769558359.873:698): prog-id=232 op=UNLOAD Jan 27 23:59:19.873000 audit[4760]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.885414 kernel: audit: type=1300 audit(1769558359.873:698): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.888743 kernel: audit: type=1327 audit(1769558359.873:698): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.874000 audit[4780]: NETFILTER_CFG table=filter:131 family=2 entries=17 op=nft_register_rule pid=4780 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:19.874000 audit[4780]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe5fb9b40 a2=0 a3=1 items=0 ppid=3063 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.874000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:19.874000 audit: BPF prog-id=233 op=LOAD Jan 27 23:59:19.874000 audit[4760]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.880000 audit: BPF prog-id=234 op=LOAD Jan 27 23:59:19.880000 audit[4760]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.880000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.881000 audit: BPF prog-id=234 op=UNLOAD Jan 27 23:59:19.881000 audit[4760]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.881000 audit: BPF prog-id=233 op=UNLOAD Jan 27 23:59:19.881000 audit[4760]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.881000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.884000 audit: BPF prog-id=235 op=LOAD Jan 27 23:59:19.884000 audit[4760]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4748 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.884000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631353564373639626433643537666266643965653365316663323139 Jan 27 23:59:19.887000 audit[4780]: NETFILTER_CFG table=nat:132 family=2 entries=35 op=nft_register_chain pid=4780 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:19.887000 audit[4780]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffe5fb9b40 a2=0 a3=1 items=0 ppid=3063 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:19.887000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:19.915084 containerd[1664]: time="2026-01-27T23:59:19.915044865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-jlz2t,Uid:cca7bd93-1a7d-448c-ad36-6b956cecc82e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6155d769bd3d57fbfd9ee3e1fc219b6492de3afb2192a332eaafc0cfe9db30dc\"" Jan 27 23:59:19.916762 containerd[1664]: time="2026-01-27T23:59:19.916706190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 23:59:20.251524 containerd[1664]: time="2026-01-27T23:59:20.251392259Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:20.252795 containerd[1664]: time="2026-01-27T23:59:20.252751664Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 23:59:20.252877 containerd[1664]: time="2026-01-27T23:59:20.252823544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:20.253044 kubelet[2953]: E0127 23:59:20.253007 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:20.253088 kubelet[2953]: E0127 23:59:20.253053 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:20.253141 kubelet[2953]: E0127 23:59:20.253120 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:20.253177 kubelet[2953]: E0127 23:59:20.253158 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:20.664330 containerd[1664]: time="2026-01-27T23:59:20.664277970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wfxbj,Uid:43426337-f017-4b99-a432-b642f2eafaaa,Namespace:kube-system,Attempt:0,}" Jan 27 23:59:20.769138 systemd-networkd[1575]: cali856d6619fcd: Link UP Jan 27 23:59:20.769913 systemd-networkd[1575]: cali856d6619fcd: Gained carrier Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.702 [INFO][4789] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0 coredns-66bc5c9577- kube-system 43426337-f017-4b99-a432-b642f2eafaaa 804 0 2026-01-27 23:58:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 coredns-66bc5c9577-wfxbj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali856d6619fcd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.702 [INFO][4789] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.725 [INFO][4803] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" HandleID="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.725 [INFO][4803] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" HandleID="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d470), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"coredns-66bc5c9577-wfxbj", "timestamp":"2026-01-27 23:59:20.725338118 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.725 [INFO][4803] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.725 [INFO][4803] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.725 [INFO][4803] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.735 [INFO][4803] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.740 [INFO][4803] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.745 [INFO][4803] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.748 [INFO][4803] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.750 [INFO][4803] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.750 [INFO][4803] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.752 [INFO][4803] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94 Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.757 [INFO][4803] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.763 [INFO][4803] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.6/26] block=192.168.91.0/26 handle="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.763 [INFO][4803] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.6/26] handle="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.763 [INFO][4803] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 23:59:20.784942 containerd[1664]: 2026-01-27 23:59:20.764 [INFO][4803] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.6/26] IPv6=[] ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" HandleID="k8s-pod-network.1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Workload="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.785683 containerd[1664]: 2026-01-27 23:59:20.766 [INFO][4789] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"43426337-f017-4b99-a432-b642f2eafaaa", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"coredns-66bc5c9577-wfxbj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali856d6619fcd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:20.785683 containerd[1664]: 2026-01-27 23:59:20.766 [INFO][4789] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.6/32] ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.785683 containerd[1664]: 2026-01-27 23:59:20.766 [INFO][4789] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali856d6619fcd ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" 
WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.785683 containerd[1664]: 2026-01-27 23:59:20.769 [INFO][4789] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.785683 containerd[1664]: 2026-01-27 23:59:20.769 [INFO][4789] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"43426337-f017-4b99-a432-b642f2eafaaa", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94", Pod:"coredns-66bc5c9577-wfxbj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali856d6619fcd", MAC:"ee:6a:c4:d5:f4:47", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:20.786346 containerd[1664]: 2026-01-27 23:59:20.782 [INFO][4789] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" Namespace="kube-system" Pod="coredns-66bc5c9577-wfxbj" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-coredns--66bc5c9577--wfxbj-eth0" Jan 27 23:59:20.797000 audit[4821]: NETFILTER_CFG table=filter:133 family=2 entries=48 op=nft_register_chain pid=4821 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 
23:59:20.797000 audit[4821]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22720 a0=3 a1=ffffdc60cfc0 a2=0 a3=ffffbd802fa8 items=0 ppid=4236 pid=4821 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.797000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:20.811342 containerd[1664]: time="2026-01-27T23:59:20.811269382Z" level=info msg="connecting to shim 1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94" address="unix:///run/containerd/s/5f5049a9ed4274981671f92b0a983dce4092a0474c36d868d334c9743f2e369c" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:20.836425 kubelet[2953]: E0127 23:59:20.836379 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:20.836984 systemd[1]: Started cri-containerd-1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94.scope - libcontainer container 1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94. Jan 27 23:59:20.849000 audit: BPF prog-id=236 op=LOAD Jan 27 23:59:20.850000 audit: BPF prog-id=237 op=LOAD Jan 27 23:59:20.850000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.850000 audit: BPF prog-id=237 op=UNLOAD Jan 27 23:59:20.850000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.851000 audit: BPF prog-id=238 op=LOAD Jan 27 23:59:20.851000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.851000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.851000 audit: BPF prog-id=239 op=LOAD Jan 27 23:59:20.851000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.851000 audit: BPF prog-id=239 op=UNLOAD Jan 27 23:59:20.851000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.851000 audit: BPF prog-id=238 op=UNLOAD Jan 27 23:59:20.851000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.851000 audit: BPF prog-id=240 op=LOAD Jan 27 23:59:20.851000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4831 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164613932636233643161303462303466616233326334343931396663 Jan 27 23:59:20.878742 containerd[1664]: time="2026-01-27T23:59:20.878687669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wfxbj,Uid:43426337-f017-4b99-a432-b642f2eafaaa,Namespace:kube-system,Attempt:0,} returns sandbox id \"1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94\"" Jan 27 23:59:20.885093 containerd[1664]: time="2026-01-27T23:59:20.885045849Z" level=info msg="CreateContainer within sandbox \"1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 23:59:20.899585 containerd[1664]: time="2026-01-27T23:59:20.898921212Z" level=info msg="Container 580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4: CDI devices from CRI Config.CDIDevices: []" Jan 27 23:59:20.908000 audit[4868]: NETFILTER_CFG table=filter:134 family=2 entries=14 op=nft_register_rule pid=4868 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:20.908000 audit[4868]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc9aff620 a2=0 a3=1 items=0 ppid=3063 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.908000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:20.912357 containerd[1664]: time="2026-01-27T23:59:20.912270053Z" level=info msg="CreateContainer within sandbox \"1da92cb3d1a04b04fab32c44919fc28c48d2749dd3ef24a2a8cb4f2fec544f94\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4\"" Jan 27 23:59:20.913014 containerd[1664]: time="2026-01-27T23:59:20.912935215Z" level=info msg="StartContainer for \"580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4\"" Jan 27 23:59:20.914317 containerd[1664]: time="2026-01-27T23:59:20.914161098Z" level=info msg="connecting to shim 580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4" address="unix:///run/containerd/s/5f5049a9ed4274981671f92b0a983dce4092a0474c36d868d334c9743f2e369c" protocol=ttrpc version=3 Jan 27 23:59:20.917000 audit[4868]: NETFILTER_CFG table=nat:135 family=2 entries=20 op=nft_register_rule pid=4868 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:20.917000 audit[4868]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc9aff620 a2=0 a3=1 items=0 ppid=3063 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.917000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:20.934158 systemd[1]: Started cri-containerd-580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4.scope - libcontainer container 580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4. 
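The audit records above carry each iptables-restore invocation as a PROCTITLE field, which is the process command line hex-encoded with NUL separators. A minimal Python sketch for decoding one of the values shown above (an illustrative helper, not part of any tooling referenced in this log):

    def decode_proctitle(hex_str: str) -> list[str]:
        # auditd hex-encodes /proc/<pid>/cmdline; arguments are NUL-separated
        raw = bytes.fromhex(hex_str)
        return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00") if arg]

    # PROCTITLE value taken from the iptables-restore audit record above
    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # ['iptables-restore', '-w', '5', '--noflush', '--counters']
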
Jan 27 23:59:20.944000 audit: BPF prog-id=241 op=LOAD Jan 27 23:59:20.944000 audit: BPF prog-id=242 op=LOAD Jan 27 23:59:20.944000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.944000 audit: BPF prog-id=242 op=UNLOAD Jan 27 23:59:20.944000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.945000 audit: BPF prog-id=243 op=LOAD Jan 27 23:59:20.945000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.945000 audit: BPF prog-id=244 op=LOAD Jan 27 23:59:20.945000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.945000 audit: BPF prog-id=244 op=UNLOAD Jan 27 23:59:20.945000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.945000 audit: BPF prog-id=243 op=UNLOAD Jan 27 23:59:20.945000 audit[4870]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.945000 audit: BPF prog-id=245 op=LOAD Jan 27 23:59:20.945000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4831 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:20.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538306164636432363334663036326162623363386438393765313661 Jan 27 23:59:20.951948 systemd-networkd[1575]: calia8f19bc085e: Gained IPv6LL Jan 27 23:59:20.984610 containerd[1664]: time="2026-01-27T23:59:20.984568755Z" level=info msg="StartContainer for \"580adcd2634f062abb3c8d897e16acbb2359a75391c664808b2fe0a9737d7dc4\" returns successfully" Jan 27 23:59:21.663522 containerd[1664]: time="2026-01-27T23:59:21.663443644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-2hkvv,Uid:7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c,Namespace:calico-apiserver,Attempt:0,}" Jan 27 23:59:21.778545 systemd-networkd[1575]: cali5a3b8676894: Link UP Jan 27 23:59:21.779948 systemd-networkd[1575]: cali5a3b8676894: Gained carrier Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.704 [INFO][4911] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0 calico-apiserver-bb4448d88- calico-apiserver 7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c 808 0 2026-01-27 23:58:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bb4448d88 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 calico-apiserver-bb4448d88-2hkvv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5a3b8676894 [] [] }} ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.705 [INFO][4911] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.728 [INFO][4924] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" HandleID="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.729 [INFO][4924] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" HandleID="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001373b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-485d202ac1", "pod":"calico-apiserver-bb4448d88-2hkvv", "timestamp":"2026-01-27 23:59:21.728927085 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.729 [INFO][4924] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.729 [INFO][4924] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.729 [INFO][4924] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.740 [INFO][4924] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.745 [INFO][4924] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.752 [INFO][4924] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.754 [INFO][4924] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.757 [INFO][4924] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.757 [INFO][4924] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.759 [INFO][4924] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8 Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.764 [INFO][4924] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.770 [INFO][4924] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.7/26] block=192.168.91.0/26 
handle="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.770 [INFO][4924] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.7/26] handle="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.771 [INFO][4924] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 23:59:21.792056 containerd[1664]: 2026-01-27 23:59:21.771 [INFO][4924] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.7/26] IPv6=[] ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" HandleID="k8s-pod-network.46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Workload="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.774 [INFO][4911] cni-plugin/k8s.go 418: Populated endpoint ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0", GenerateName:"calico-apiserver-bb4448d88-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bb4448d88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"calico-apiserver-bb4448d88-2hkvv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5a3b8676894", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.775 [INFO][4911] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.7/32] ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.775 [INFO][4911] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a3b8676894 ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" 
WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.777 [INFO][4911] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.777 [INFO][4911] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0", GenerateName:"calico-apiserver-bb4448d88-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bb4448d88", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8", Pod:"calico-apiserver-bb4448d88-2hkvv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5a3b8676894", MAC:"ee:7e:cf:28:81:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:21.792821 containerd[1664]: 2026-01-27 23:59:21.789 [INFO][4911] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" Namespace="calico-apiserver" Pod="calico-apiserver-bb4448d88-2hkvv" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-calico--apiserver--bb4448d88--2hkvv-eth0" Jan 27 23:59:21.808000 audit[4942]: NETFILTER_CFG table=filter:136 family=2 entries=63 op=nft_register_chain pid=4942 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:21.808000 audit[4942]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=30680 a0=3 a1=ffffc0c0aee0 a2=0 a3=ffffa9673fa8 items=0 ppid=4236 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.808000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:21.829406 containerd[1664]: time="2026-01-27T23:59:21.829229634Z" level=info msg="connecting to shim 46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8" address="unix:///run/containerd/s/5ed5353137e1f8bfb7506369a98cb22de54a21cd92207d0aa08b30f5731777e0" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:21.844004 kubelet[2953]: E0127 23:59:21.843668 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:21.866986 systemd[1]: Started cri-containerd-46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8.scope - libcontainer container 46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8. Jan 27 23:59:21.871166 kubelet[2953]: I0127 23:59:21.870059 2953 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-wfxbj" podStartSLOduration=45.870040199 podStartE2EDuration="45.870040199s" podCreationTimestamp="2026-01-27 23:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 23:59:21.869385677 +0000 UTC m=+52.301264219" watchObservedRunningTime="2026-01-27 23:59:21.870040199 +0000 UTC m=+52.301918741" Jan 27 23:59:21.885000 audit: BPF prog-id=246 op=LOAD Jan 27 23:59:21.885000 audit: BPF prog-id=247 op=LOAD Jan 27 23:59:21.885000 audit[4962]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.885000 audit: BPF prog-id=247 op=UNLOAD Jan 27 23:59:21.885000 audit[4962]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.886000 audit: BPF prog-id=248 op=LOAD Jan 27 23:59:21.886000 audit[4962]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.886000 audit: BPF prog-id=249 op=LOAD Jan 27 23:59:21.886000 audit[4962]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.886000 audit: BPF prog-id=249 op=UNLOAD Jan 27 23:59:21.886000 audit[4962]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.886000 audit: BPF prog-id=248 op=UNLOAD Jan 27 23:59:21.886000 audit[4962]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.886000 audit: BPF prog-id=250 op=LOAD Jan 27 23:59:21.886000 audit[4962]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4951 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436633131616163633634343535313839613039613832316138303532 Jan 27 23:59:21.911594 containerd[1664]: time="2026-01-27T23:59:21.911493367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bb4448d88-2hkvv,Uid:7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"46c11aacc64455189a09a821a805244d841c6af55ea8133427b147cdbf5ca6e8\"" Jan 27 23:59:21.914039 containerd[1664]: time="2026-01-27T23:59:21.913914654Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 23:59:21.931000 audit[4988]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=4988 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:21.931000 audit[4988]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffeb988c70 a2=0 a3=1 items=0 ppid=3063 pid=4988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.931000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:21.945000 audit[4988]: NETFILTER_CFG table=nat:138 family=2 entries=56 op=nft_register_chain pid=4988 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:21.945000 audit[4988]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffeb988c70 a2=0 a3=1 items=0 ppid=3063 pid=4988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:21.945000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:22.244509 containerd[1664]: time="2026-01-27T23:59:22.244285351Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:22.246739 containerd[1664]: time="2026-01-27T23:59:22.246681598Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 23:59:22.246887 containerd[1664]: time="2026-01-27T23:59:22.246769398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:22.247160 kubelet[2953]: E0127 23:59:22.247089 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:22.247305 kubelet[2953]: E0127 23:59:22.247175 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:22.247373 kubelet[2953]: E0127 23:59:22.247329 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:22.247414 kubelet[2953]: E0127 23:59:22.247385 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:22.551974 systemd-networkd[1575]: cali856d6619fcd: Gained IPv6LL Jan 27 23:59:22.848304 kubelet[2953]: E0127 23:59:22.848038 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:22.966000 audit[4991]: NETFILTER_CFG table=filter:139 family=2 entries=14 op=nft_register_rule pid=4991 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:22.966000 audit[4991]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd5fa7e90 a2=0 a3=1 items=0 ppid=3063 pid=4991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:22.966000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:22.974000 audit[4991]: NETFILTER_CFG table=nat:140 family=2 entries=20 op=nft_register_rule pid=4991 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 23:59:22.974000 audit[4991]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd5fa7e90 a2=0 a3=1 items=0 ppid=3063 pid=4991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:22.974000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 23:59:23.640005 systemd-networkd[1575]: cali5a3b8676894: Gained IPv6LL Jan 27 23:59:23.666621 containerd[1664]: time="2026-01-27T23:59:23.666569286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bmscm,Uid:1d6f938d-8e51-4e63-b408-0de368dbd7d7,Namespace:calico-system,Attempt:0,}" Jan 27 23:59:23.768314 systemd-networkd[1575]: cali0d7c188b0f7: Link UP Jan 27 23:59:23.769220 systemd-networkd[1575]: cali0d7c188b0f7: Gained carrier Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.703 [INFO][4992] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0 csi-node-driver- calico-system 1d6f938d-8e51-4e63-b408-0de368dbd7d7 709 0 2026-01-27 23:58:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593-0-0-n-485d202ac1 csi-node-driver-bmscm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0d7c188b0f7 [] [] }} 
ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.703 [INFO][4992] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.725 [INFO][5007] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" HandleID="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Workload="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.726 [INFO][5007] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" HandleID="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Workload="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c470), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-485d202ac1", "pod":"csi-node-driver-bmscm", "timestamp":"2026-01-27 23:59:23.725941309 +0000 UTC"}, Hostname:"ci-4593-0-0-n-485d202ac1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.726 [INFO][5007] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.726 [INFO][5007] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.726 [INFO][5007] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-485d202ac1' Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.736 [INFO][5007] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.741 [INFO][5007] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.747 [INFO][5007] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.749 [INFO][5007] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.751 [INFO][5007] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.751 [INFO][5007] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.753 [INFO][5007] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.757 [INFO][5007] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.764 [INFO][5007] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.8/26] block=192.168.91.0/26 handle="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.764 [INFO][5007] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.8/26] handle="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" host="ci-4593-0-0-n-485d202ac1" Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.764 [INFO][5007] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
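For context on the addresses in the IPAM messages above: Calico assigns from the 192.168.91.0/26 block it holds an affinity for, and a /26 block spans 64 addresses. A quick check with Python's standard ipaddress module (illustrative only, not part of the CNI plugin):

    import ipaddress

    block = ipaddress.ip_network("192.168.91.0/26")       # block named in the log above
    print(block.num_addresses)                            # 64 addresses per /26 block
    print(ipaddress.ip_address("192.168.91.8") in block)  # True: the IP claimed above
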
Jan 27 23:59:23.780812 containerd[1664]: 2026-01-27 23:59:23.764 [INFO][5007] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.8/26] IPv6=[] ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" HandleID="k8s-pod-network.cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Workload="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.766 [INFO][4992] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1d6f938d-8e51-4e63-b408-0de368dbd7d7", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"", Pod:"csi-node-driver-bmscm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0d7c188b0f7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.766 [INFO][4992] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.8/32] ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.766 [INFO][4992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0d7c188b0f7 ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.768 [INFO][4992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.768 [INFO][4992] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1d6f938d-8e51-4e63-b408-0de368dbd7d7", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 23, 58, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-485d202ac1", ContainerID:"cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade", Pod:"csi-node-driver-bmscm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0d7c188b0f7", MAC:"9a:1c:8e:c8:46:47", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 23:59:23.781455 containerd[1664]: 2026-01-27 23:59:23.778 [INFO][4992] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" Namespace="calico-system" Pod="csi-node-driver-bmscm" WorkloadEndpoint="ci--4593--0--0--n--485d202ac1-k8s-csi--node--driver--bmscm-eth0" Jan 27 23:59:23.796000 audit[5023]: NETFILTER_CFG table=filter:141 family=2 entries=56 op=nft_register_chain pid=5023 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 23:59:23.796000 audit[5023]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=25500 a0=3 a1=fffff0b3e540 a2=0 a3=ffff9c9c3fa8 items=0 ppid=4236 pid=5023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.796000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 23:59:23.809927 containerd[1664]: time="2026-01-27T23:59:23.809885607Z" level=info msg="connecting to shim cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade" address="unix:///run/containerd/s/0a2b069ea368a0897875184927bece38187c822236f1bc3f65bdb373610dcd3a" namespace=k8s.io protocol=ttrpc version=3 Jan 27 23:59:23.848002 systemd[1]: Started cri-containerd-cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade.scope - libcontainer container cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade. 
Jan 27 23:59:23.857745 kubelet[2953]: E0127 23:59:23.856828 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:23.870000 audit: BPF prog-id=251 op=LOAD Jan 27 23:59:23.873000 audit: BPF prog-id=252 op=LOAD Jan 27 23:59:23.873000 audit[5044]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.873000 audit: BPF prog-id=252 op=UNLOAD Jan 27 23:59:23.873000 audit[5044]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.874000 audit: BPF prog-id=253 op=LOAD Jan 27 23:59:23.874000 audit[5044]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.874000 audit: BPF prog-id=254 op=LOAD Jan 27 23:59:23.874000 audit[5044]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.874000 audit: BPF prog-id=254 op=UNLOAD Jan 27 23:59:23.874000 audit[5044]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5033 
pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.874000 audit: BPF prog-id=253 op=UNLOAD Jan 27 23:59:23.874000 audit[5044]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.874000 audit: BPF prog-id=255 op=LOAD Jan 27 23:59:23.874000 audit[5044]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5033 pid=5044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 23:59:23.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363343535356165326237396366663464663333646362326263656639 Jan 27 23:59:23.893686 containerd[1664]: time="2026-01-27T23:59:23.893583025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bmscm,Uid:1d6f938d-8e51-4e63-b408-0de368dbd7d7,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc4555ae2b79cff4df33dcb2bcef99710a417df0ed809006ab24835006248ade\"" Jan 27 23:59:23.896230 containerd[1664]: time="2026-01-27T23:59:23.896165553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 23:59:24.221906 containerd[1664]: time="2026-01-27T23:59:24.221713674Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:24.223495 containerd[1664]: time="2026-01-27T23:59:24.223437600Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 23:59:24.223581 containerd[1664]: time="2026-01-27T23:59:24.223521960Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:24.223706 kubelet[2953]: E0127 23:59:24.223660 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 23:59:24.223763 kubelet[2953]: E0127 23:59:24.223709 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 23:59:24.223827 kubelet[2953]: E0127 23:59:24.223808 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:24.224580 containerd[1664]: time="2026-01-27T23:59:24.224542163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 23:59:24.568562 containerd[1664]: time="2026-01-27T23:59:24.568388861Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:24.569927 containerd[1664]: time="2026-01-27T23:59:24.569878305Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 23:59:24.570048 containerd[1664]: time="2026-01-27T23:59:24.569957946Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:24.570201 kubelet[2953]: E0127 23:59:24.570140 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 23:59:24.570201 kubelet[2953]: E0127 23:59:24.570189 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 23:59:24.570400 kubelet[2953]: E0127 23:59:24.570267 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:24.570400 kubelet[2953]: E0127 23:59:24.570308 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" 
podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:24.857224 kubelet[2953]: E0127 23:59:24.857101 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:25.623936 systemd-networkd[1575]: cali0d7c188b0f7: Gained IPv6LL Jan 27 23:59:25.858816 kubelet[2953]: E0127 23:59:25.858766 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:28.663048 containerd[1664]: time="2026-01-27T23:59:28.662997058Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 23:59:28.992605 containerd[1664]: time="2026-01-27T23:59:28.991569749Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:28.993148 containerd[1664]: time="2026-01-27T23:59:28.993108434Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 23:59:28.993279 containerd[1664]: time="2026-01-27T23:59:28.993198714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:28.993336 kubelet[2953]: E0127 23:59:28.993297 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:28.993810 kubelet[2953]: E0127 23:59:28.993342 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:28.993810 kubelet[2953]: E0127 23:59:28.993408 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:28.994717 containerd[1664]: time="2026-01-27T23:59:28.994678478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 23:59:29.329294 containerd[1664]: time="2026-01-27T23:59:29.329240388Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:29.331165 containerd[1664]: time="2026-01-27T23:59:29.331082153Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 23:59:29.331289 containerd[1664]: time="2026-01-27T23:59:29.331185834Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:29.331474 kubelet[2953]: E0127 23:59:29.331418 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:29.331474 kubelet[2953]: E0127 23:59:29.331466 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:29.331562 kubelet[2953]: E0127 23:59:29.331532 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:29.331602 kubelet[2953]: E0127 23:59:29.331569 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 27 23:59:32.662286 containerd[1664]: time="2026-01-27T23:59:32.662236682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 23:59:33.023855 containerd[1664]: 
time="2026-01-27T23:59:33.023806794Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:33.025124 containerd[1664]: time="2026-01-27T23:59:33.025086478Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 23:59:33.025214 containerd[1664]: time="2026-01-27T23:59:33.025113558Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:33.025357 kubelet[2953]: E0127 23:59:33.025308 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 23:59:33.025613 kubelet[2953]: E0127 23:59:33.025368 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 23:59:33.025613 kubelet[2953]: E0127 23:59:33.025444 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:33.025613 kubelet[2953]: E0127 23:59:33.025478 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:34.662502 containerd[1664]: time="2026-01-27T23:59:34.662465675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 23:59:35.005088 containerd[1664]: time="2026-01-27T23:59:35.004896009Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:35.006577 containerd[1664]: time="2026-01-27T23:59:35.006527174Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 23:59:35.006694 containerd[1664]: time="2026-01-27T23:59:35.006624614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:35.006899 kubelet[2953]: E0127 23:59:35.006861 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" 
Jan 27 23:59:35.007380 kubelet[2953]: E0127 23:59:35.007203 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 23:59:35.007380 kubelet[2953]: E0127 23:59:35.007304 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:35.007380 kubelet[2953]: E0127 23:59:35.007348 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:36.662668 containerd[1664]: time="2026-01-27T23:59:36.662561029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 23:59:36.997425 containerd[1664]: time="2026-01-27T23:59:36.997254338Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:37.000011 containerd[1664]: time="2026-01-27T23:59:36.999923347Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 23:59:37.000090 containerd[1664]: time="2026-01-27T23:59:37.000012747Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:37.000300 kubelet[2953]: E0127 23:59:37.000254 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:37.000618 kubelet[2953]: E0127 23:59:37.000302 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:37.000618 kubelet[2953]: E0127 23:59:37.000375 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:37.000618 kubelet[2953]: E0127 23:59:37.000411 2953 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:37.662478 containerd[1664]: time="2026-01-27T23:59:37.662441745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 23:59:37.996432 containerd[1664]: time="2026-01-27T23:59:37.996321252Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:37.997716 containerd[1664]: time="2026-01-27T23:59:37.997657336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 23:59:37.997795 containerd[1664]: time="2026-01-27T23:59:37.997761377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:37.997958 kubelet[2953]: E0127 23:59:37.997922 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 23:59:37.998015 kubelet[2953]: E0127 23:59:37.997970 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 23:59:37.998185 kubelet[2953]: E0127 23:59:37.998044 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:37.998994 containerd[1664]: time="2026-01-27T23:59:37.998963660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 23:59:38.361619 containerd[1664]: time="2026-01-27T23:59:38.361546456Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:38.363008 containerd[1664]: time="2026-01-27T23:59:38.362956300Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 23:59:38.363067 containerd[1664]: time="2026-01-27T23:59:38.363008060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:38.363281 kubelet[2953]: E0127 23:59:38.363230 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 23:59:38.363281 kubelet[2953]: E0127 23:59:38.363277 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 23:59:38.363574 kubelet[2953]: E0127 23:59:38.363356 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:38.363574 kubelet[2953]: E0127 23:59:38.363403 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:38.662066 containerd[1664]: time="2026-01-27T23:59:38.661951420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 23:59:39.037161 containerd[1664]: time="2026-01-27T23:59:39.037073974Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:39.038917 containerd[1664]: time="2026-01-27T23:59:39.038878100Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 23:59:39.039024 containerd[1664]: time="2026-01-27T23:59:39.038965780Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:39.039158 kubelet[2953]: E0127 23:59:39.039123 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:39.039199 kubelet[2953]: E0127 23:59:39.039169 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 23:59:39.039256 kubelet[2953]: E0127 23:59:39.039240 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:39.039297 kubelet[2953]: E0127 23:59:39.039274 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:43.662644 kubelet[2953]: E0127 23:59:43.662558 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 27 23:59:48.662372 kubelet[2953]: E0127 23:59:48.662311 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 27 23:59:49.666947 kubelet[2953]: E0127 23:59:49.666905 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 27 23:59:50.662299 kubelet[2953]: E0127 23:59:50.661887 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 27 23:59:52.662767 kubelet[2953]: 
E0127 23:59:52.662314 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 27 23:59:53.662889 kubelet[2953]: E0127 23:59:53.662794 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 27 23:59:57.664120 containerd[1664]: time="2026-01-27T23:59:57.663376638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 23:59:58.008545 containerd[1664]: time="2026-01-27T23:59:58.008254579Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:58.010160 containerd[1664]: time="2026-01-27T23:59:58.010106825Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 23:59:58.010279 containerd[1664]: time="2026-01-27T23:59:58.010191585Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:58.010434 kubelet[2953]: E0127 23:59:58.010364 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:58.010434 kubelet[2953]: E0127 23:59:58.010431 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 23:59:58.010762 kubelet[2953]: E0127 23:59:58.010507 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:58.011488 containerd[1664]: 
time="2026-01-27T23:59:58.011458789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 23:59:58.340626 containerd[1664]: time="2026-01-27T23:59:58.340567522Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 23:59:58.341789 containerd[1664]: time="2026-01-27T23:59:58.341720285Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 23:59:58.341912 containerd[1664]: time="2026-01-27T23:59:58.341821646Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 23:59:58.342061 kubelet[2953]: E0127 23:59:58.342003 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:58.342061 kubelet[2953]: E0127 23:59:58.342054 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 23:59:58.342135 kubelet[2953]: E0127 23:59:58.342116 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 23:59:58.342187 kubelet[2953]: E0127 23:59:58.342152 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:00:00.663660 containerd[1664]: time="2026-01-28T00:00:00.663535148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:00:01.525843 containerd[1664]: time="2026-01-28T00:00:01.525637801Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:01.559895 containerd[1664]: time="2026-01-28T00:00:01.559756506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:00:01.559895 containerd[1664]: 
time="2026-01-28T00:00:01.559830626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:01.560103 kubelet[2953]: E0128 00:00:01.560033 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:00:01.560103 kubelet[2953]: E0128 00:00:01.560088 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:00:01.560648 kubelet[2953]: E0128 00:00:01.560156 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:01.560648 kubelet[2953]: E0128 00:00:01.560210 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:00:01.662267 containerd[1664]: time="2026-01-28T00:00:01.662224461Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:00:04.180251 containerd[1664]: time="2026-01-28T00:00:04.179853406Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:04.247394 containerd[1664]: time="2026-01-28T00:00:04.247204134Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:00:04.247394 containerd[1664]: time="2026-01-28T00:00:04.247343054Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:04.247709 kubelet[2953]: E0128 00:00:04.247654 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:00:04.247709 kubelet[2953]: E0128 00:00:04.247694 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:00:04.248300 kubelet[2953]: E0128 00:00:04.247880 2953 
kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:04.248300 kubelet[2953]: E0128 00:00:04.247923 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:00:04.248360 containerd[1664]: time="2026-01-28T00:00:04.248028576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:00:08.026370 containerd[1664]: time="2026-01-28T00:00:08.026260480Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:08.145828 containerd[1664]: time="2026-01-28T00:00:08.145764648Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:00:08.145970 containerd[1664]: time="2026-01-28T00:00:08.145808248Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:08.146352 kubelet[2953]: E0128 00:00:08.146028 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:08.146352 kubelet[2953]: E0128 00:00:08.146079 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:08.146352 kubelet[2953]: E0128 00:00:08.146258 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:08.146352 kubelet[2953]: E0128 00:00:08.146288 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:00:08.148203 containerd[1664]: time="2026-01-28T00:00:08.147856374Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:00:12.663518 kubelet[2953]: E0128 00:00:12.663407 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:00:13.663150 kubelet[2953]: E0128 00:00:13.662994 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:00:16.449507 containerd[1664]: time="2026-01-28T00:00:16.449372954Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:16.461751 containerd[1664]: time="2026-01-28T00:00:16.461657592Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:00:16.461922 containerd[1664]: time="2026-01-28T00:00:16.461716272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:16.462192 kubelet[2953]: E0128 00:00:16.462130 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:16.462192 kubelet[2953]: E0128 00:00:16.462183 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:16.462899 kubelet[2953]: E0128 00:00:16.462349 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:16.462899 kubelet[2953]: E0128 00:00:16.462390 
2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:00:16.462951 containerd[1664]: time="2026-01-28T00:00:16.462479954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:00:17.662909 kubelet[2953]: E0128 00:00:17.662666 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:00:18.662821 kubelet[2953]: E0128 00:00:18.662766 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:00:19.624090 containerd[1664]: time="2026-01-28T00:00:19.624028841Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:19.746914 containerd[1664]: time="2026-01-28T00:00:19.746811179Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:00:19.746914 containerd[1664]: time="2026-01-28T00:00:19.746863699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:19.747225 kubelet[2953]: E0128 00:00:19.747179 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:00:19.747470 kubelet[2953]: E0128 00:00:19.747235 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:00:19.747470 kubelet[2953]: E0128 00:00:19.747327 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:19.748339 
containerd[1664]: time="2026-01-28T00:00:19.748295423Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:00:24.228207 containerd[1664]: time="2026-01-28T00:00:24.227959445Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:24.244132 containerd[1664]: time="2026-01-28T00:00:24.243954974Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:24.244132 containerd[1664]: time="2026-01-28T00:00:24.244021254Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:00:24.244295 kubelet[2953]: E0128 00:00:24.244247 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:00:24.244546 kubelet[2953]: E0128 00:00:24.244293 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:00:24.244546 kubelet[2953]: E0128 00:00:24.244364 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:24.244546 kubelet[2953]: E0128 00:00:24.244404 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:00:25.666332 kubelet[2953]: E0128 00:00:25.666275 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:00:26.662668 kubelet[2953]: E0128 00:00:26.662616 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:00:28.662838 kubelet[2953]: E0128 00:00:28.662714 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:00:30.662241 kubelet[2953]: E0128 00:00:30.662190 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:00:30.662606 kubelet[2953]: E0128 00:00:30.662254 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:00:37.662381 kubelet[2953]: E0128 00:00:37.662329 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:00:38.662012 kubelet[2953]: E0128 00:00:38.661959 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:00:39.663664 containerd[1664]: time="2026-01-28T00:00:39.663611813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:00:41.475007 containerd[1664]: time="2026-01-28T00:00:41.474919066Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:41.476670 containerd[1664]: time="2026-01-28T00:00:41.476535551Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:00:41.476670 containerd[1664]: time="2026-01-28T00:00:41.476595391Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:41.477254 kubelet[2953]: E0128 00:00:41.477215 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:00:41.477864 kubelet[2953]: E0128 00:00:41.477401 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:00:41.477864 kubelet[2953]: E0128 00:00:41.477497 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:41.480143 containerd[1664]: time="2026-01-28T00:00:41.480111962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:00:41.662685 kubelet[2953]: E0128 00:00:41.662617 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:00:42.084860 containerd[1664]: 
time="2026-01-28T00:00:42.084814182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:42.086468 containerd[1664]: time="2026-01-28T00:00:42.086426787Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:00:42.086552 containerd[1664]: time="2026-01-28T00:00:42.086518707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:42.086788 kubelet[2953]: E0128 00:00:42.086743 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:00:42.087163 kubelet[2953]: E0128 00:00:42.086886 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:00:42.087163 kubelet[2953]: E0128 00:00:42.086983 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:42.087163 kubelet[2953]: E0128 00:00:42.087025 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:00:43.662840 kubelet[2953]: E0128 00:00:43.662778 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:00:45.664106 kubelet[2953]: E0128 00:00:45.663658 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:00:49.664696 kubelet[2953]: E0128 00:00:49.664617 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:00:51.661780 containerd[1664]: time="2026-01-28T00:00:51.661735166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:00:52.384674 containerd[1664]: time="2026-01-28T00:00:52.384594990Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:52.387552 containerd[1664]: time="2026-01-28T00:00:52.387506039Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:00:52.387662 containerd[1664]: time="2026-01-28T00:00:52.387569239Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:52.387979 kubelet[2953]: E0128 00:00:52.387899 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:00:52.387979 kubelet[2953]: E0128 00:00:52.387948 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:00:52.388300 kubelet[2953]: E0128 00:00:52.388026 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:52.388300 kubelet[2953]: E0128 00:00:52.388058 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:00:53.662900 kubelet[2953]: E0128 00:00:53.662850 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:00:56.662613 containerd[1664]: time="2026-01-28T00:00:56.662533951Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:00:57.076770 containerd[1664]: time="2026-01-28T00:00:57.076712985Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:57.078536 containerd[1664]: time="2026-01-28T00:00:57.078484071Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:00:57.078615 containerd[1664]: time="2026-01-28T00:00:57.078578791Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:57.078898 kubelet[2953]: E0128 00:00:57.078835 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:00:57.079509 kubelet[2953]: E0128 00:00:57.078886 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:00:57.079509 kubelet[2953]: E0128 00:00:57.079273 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:57.079964 kubelet[2953]: E0128 00:00:57.079597 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:00:58.663498 containerd[1664]: time="2026-01-28T00:00:58.663449947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:00:59.034382 containerd[1664]: time="2026-01-28T00:00:59.034338568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:00:59.036137 containerd[1664]: time="2026-01-28T00:00:59.036097213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:00:59.036224 containerd[1664]: time="2026-01-28T00:00:59.036163373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:00:59.036681 kubelet[2953]: E0128 00:00:59.036380 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:59.036681 kubelet[2953]: E0128 00:00:59.036429 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:00:59.037717 kubelet[2953]: E0128 00:00:59.036656 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:00:59.037849 kubelet[2953]: E0128 00:00:59.037689 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:00:59.039085 containerd[1664]: time="2026-01-28T00:00:59.038816422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:01:05.663455 kubelet[2953]: E0128 00:01:05.663380 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:01:06.661861 kubelet[2953]: E0128 00:01:06.661794 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:01:07.068000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.6.5:22-4.153.228.146:52908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:07.070430 kernel: kauditd_printk_skb: 136 callbacks suppressed Jan 28 00:01:07.070482 kernel: audit: type=1130 audit(1769558467.068:747): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.6.5:22-4.153.228.146:52908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:07.069568 systemd[1]: Started sshd@11-10.0.6.5:22-4.153.228.146:52908.service - OpenSSH per-connection server daemon (4.153.228.146:52908). Jan 28 00:01:07.589598 sshd[5230]: Accepted publickey for core from 4.153.228.146 port 52908 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:07.588000 audit[5230]: USER_ACCT pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.592000 audit[5230]: CRED_ACQ pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.594455 sshd-session[5230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:07.596938 kernel: audit: type=1101 audit(1769558467.588:748): pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.597021 kernel: audit: type=1103 audit(1769558467.592:749): pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.599076 kernel: audit: type=1006 audit(1769558467.592:750): pid=5230 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 28 00:01:07.592000 audit[5230]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 
a1=ffffc06b80e0 a2=3 a3=0 items=0 ppid=1 pid=5230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:07.603163 kernel: audit: type=1300 audit(1769558467.592:750): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc06b80e0 a2=3 a3=0 items=0 ppid=1 pid=5230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:07.592000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:07.604828 kernel: audit: type=1327 audit(1769558467.592:750): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:07.605553 systemd-logind[1642]: New session 13 of user core. Jan 28 00:01:07.612091 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 28 00:01:07.613000 audit[5230]: USER_START pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.618000 audit[5234]: CRED_ACQ pid=5234 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.622758 kernel: audit: type=1105 audit(1769558467.613:751): pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.622849 kernel: audit: type=1103 audit(1769558467.618:752): pid=5234 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.983825 sshd[5234]: Connection closed by 4.153.228.146 port 52908 Jan 28 00:01:07.983987 sshd-session[5230]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:07.985000 audit[5230]: USER_END pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.988772 systemd[1]: sshd@11-10.0.6.5:22-4.153.228.146:52908.service: Deactivated successfully. Jan 28 00:01:07.985000 audit[5230]: CRED_DISP pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.990884 systemd[1]: session-13.scope: Deactivated successfully. Jan 28 00:01:07.992643 systemd-logind[1642]: Session 13 logged out. Waiting for processes to exit. 
Jan 28 00:01:07.992960 kernel: audit: type=1106 audit(1769558467.985:753): pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.993018 kernel: audit: type=1104 audit(1769558467.985:754): pid=5230 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:07.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.6.5:22-4.153.228.146:52908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:07.996329 systemd-logind[1642]: Removed session 13. Jan 28 00:01:08.663204 kubelet[2953]: E0128 00:01:08.662954 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:01:09.190545 containerd[1664]: time="2026-01-28T00:01:09.190500454Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:01:09.192196 containerd[1664]: time="2026-01-28T00:01:09.192158779Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:01:09.192269 containerd[1664]: time="2026-01-28T00:01:09.192217419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:01:09.192516 kubelet[2953]: E0128 00:01:09.192477 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:01:09.192559 kubelet[2953]: E0128 00:01:09.192527 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:01:09.192924 containerd[1664]: time="2026-01-28T00:01:09.192888701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:01:09.193064 kubelet[2953]: E0128 00:01:09.193017 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:01:09.193127 kubelet[2953]: E0128 00:01:09.193086 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:01:13.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.6.5:22-4.153.228.146:52920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:13.095765 systemd[1]: Started sshd@12-10.0.6.5:22-4.153.228.146:52920.service - OpenSSH per-connection server daemon (4.153.228.146:52920). Jan 28 00:01:13.099266 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:13.099350 kernel: audit: type=1130 audit(1769558473.095:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.6.5:22-4.153.228.146:52920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:13.621000 audit[5249]: USER_ACCT pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.622942 sshd[5249]: Accepted publickey for core from 4.153.228.146 port 52920 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:13.625000 audit[5249]: CRED_ACQ pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.627542 sshd-session[5249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:13.629112 kernel: audit: type=1101 audit(1769558473.621:757): pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.629176 kernel: audit: type=1103 audit(1769558473.625:758): pid=5249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.629198 kernel: audit: type=1006 audit(1769558473.625:759): pid=5249 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 28 00:01:13.625000 audit[5249]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd7253540 a2=3 a3=0 items=0 ppid=1 pid=5249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:13.634442 kernel: audit: type=1300 audit(1769558473.625:759): arch=c00000b7 syscall=64 success=yes 
exit=3 a0=8 a1=ffffd7253540 a2=3 a3=0 items=0 ppid=1 pid=5249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:13.625000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:13.636186 kernel: audit: type=1327 audit(1769558473.625:759): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:13.640791 systemd-logind[1642]: New session 14 of user core. Jan 28 00:01:13.648975 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 28 00:01:13.655000 audit[5249]: USER_START pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.664321 kubelet[2953]: E0128 00:01:13.664267 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:01:13.661000 audit[5253]: CRED_ACQ pid=5253 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.667520 kernel: audit: type=1105 audit(1769558473.655:760): pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.667778 kernel: audit: type=1103 audit(1769558473.661:761): pid=5253 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.977266 sshd[5253]: Connection closed by 4.153.228.146 port 52920 Jan 28 00:01:13.978458 sshd-session[5249]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:13.979000 audit[5249]: USER_END pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.984909 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 28 00:01:13.979000 audit[5249]: CRED_DISP pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.988420 kernel: audit: type=1106 audit(1769558473.979:762): pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.988503 kernel: audit: type=1104 audit(1769558473.979:763): pid=5249 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:13.985720 systemd[1]: sshd@12-10.0.6.5:22-4.153.228.146:52920.service: Deactivated successfully. Jan 28 00:01:13.985000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.6.5:22-4.153.228.146:52920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:13.989188 systemd-logind[1642]: Session 14 logged out. Waiting for processes to exit. Jan 28 00:01:13.990324 systemd-logind[1642]: Removed session 14. Jan 28 00:01:15.695642 containerd[1664]: time="2026-01-28T00:01:15.695582387Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:01:15.698325 containerd[1664]: time="2026-01-28T00:01:15.698261075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:01:15.698398 containerd[1664]: time="2026-01-28T00:01:15.698327195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:01:15.698600 kubelet[2953]: E0128 00:01:15.698561 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:01:15.699173 kubelet[2953]: E0128 00:01:15.698611 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:01:15.699173 kubelet[2953]: E0128 00:01:15.698702 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:01:15.700188 containerd[1664]: time="2026-01-28T00:01:15.699798240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:01:16.114029 containerd[1664]: 
time="2026-01-28T00:01:16.113978994Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:01:16.116102 containerd[1664]: time="2026-01-28T00:01:16.116030040Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:01:16.116173 containerd[1664]: time="2026-01-28T00:01:16.116063560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:01:16.116253 kubelet[2953]: E0128 00:01:16.116212 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:01:16.116301 kubelet[2953]: E0128 00:01:16.116265 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:01:16.116374 kubelet[2953]: E0128 00:01:16.116351 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:01:16.116608 kubelet[2953]: E0128 00:01:16.116396 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:01:18.662803 kubelet[2953]: E0128 00:01:18.662756 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:01:19.084460 systemd[1]: Started sshd@13-10.0.6.5:22-4.153.228.146:54302.service - OpenSSH per-connection server daemon (4.153.228.146:54302). Jan 28 00:01:19.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.6.5:22-4.153.228.146:54302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:19.088191 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:19.088257 kernel: audit: type=1130 audit(1769558479.083:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.6.5:22-4.153.228.146:54302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:19.603000 audit[5292]: USER_ACCT pid=5292 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.604958 sshd[5292]: Accepted publickey for core from 4.153.228.146 port 54302 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:19.607852 kernel: audit: type=1101 audit(1769558479.603:766): pid=5292 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.607912 kernel: audit: type=1103 audit(1769558479.607:767): pid=5292 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.607000 audit[5292]: CRED_ACQ pid=5292 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.608928 sshd-session[5292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:19.612473 kernel: audit: type=1006 audit(1769558479.607:768): pid=5292 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 28 00:01:19.607000 audit[5292]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdee2f520 a2=3 a3=0 items=0 ppid=1 pid=5292 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:19.616666 kernel: audit: type=1300 audit(1769558479.607:768): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdee2f520 a2=3 a3=0 items=0 ppid=1 pid=5292 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:19.616769 kernel: audit: type=1327 audit(1769558479.607:768): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:19.607000 audit: PROCTITLE 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:19.616296 systemd-logind[1642]: New session 15 of user core. Jan 28 00:01:19.618924 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 28 00:01:19.620000 audit[5292]: USER_START pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.625749 kernel: audit: type=1105 audit(1769558479.620:769): pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.625812 kernel: audit: type=1103 audit(1769558479.624:770): pid=5296 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.624000 audit[5296]: CRED_ACQ pid=5296 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.952546 sshd[5296]: Connection closed by 4.153.228.146 port 54302 Jan 28 00:01:19.952946 sshd-session[5292]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:19.954000 audit[5292]: USER_END pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.957598 systemd[1]: sshd@13-10.0.6.5:22-4.153.228.146:54302.service: Deactivated successfully. Jan 28 00:01:19.954000 audit[5292]: CRED_DISP pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.960584 systemd[1]: session-15.scope: Deactivated successfully. 
Jan 28 00:01:19.962331 kernel: audit: type=1106 audit(1769558479.954:771): pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.962423 kernel: audit: type=1104 audit(1769558479.954:772): pid=5292 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:19.958000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.6.5:22-4.153.228.146:54302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:19.962124 systemd-logind[1642]: Session 15 logged out. Waiting for processes to exit. Jan 28 00:01:19.964149 systemd-logind[1642]: Removed session 15. Jan 28 00:01:20.067000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.6.5:22-4.153.228.146:54306 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:20.068013 systemd[1]: Started sshd@14-10.0.6.5:22-4.153.228.146:54306.service - OpenSSH per-connection server daemon (4.153.228.146:54306). Jan 28 00:01:20.599000 audit[5310]: USER_ACCT pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.600845 sshd[5310]: Accepted publickey for core from 4.153.228.146 port 54306 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:20.601000 audit[5310]: CRED_ACQ pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.601000 audit[5310]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc7d5c0a0 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:20.601000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:20.602619 sshd-session[5310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:20.607547 systemd-logind[1642]: New session 16 of user core. Jan 28 00:01:20.614121 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 28 00:01:20.616000 audit[5310]: USER_START pid=5310 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.618000 audit[5314]: CRED_ACQ pid=5314 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.662866 kubelet[2953]: E0128 00:01:20.662815 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:01:20.988099 sshd[5314]: Connection closed by 4.153.228.146 port 54306 Jan 28 00:01:20.987787 sshd-session[5310]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:20.989000 audit[5310]: USER_END pid=5310 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.989000 audit[5310]: CRED_DISP pid=5310 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:20.993144 systemd[1]: sshd@14-10.0.6.5:22-4.153.228.146:54306.service: Deactivated successfully. Jan 28 00:01:20.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.6.5:22-4.153.228.146:54306 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:20.997249 systemd[1]: session-16.scope: Deactivated successfully. Jan 28 00:01:21.000083 systemd-logind[1642]: Session 16 logged out. Waiting for processes to exit. Jan 28 00:01:21.001446 systemd-logind[1642]: Removed session 16. Jan 28 00:01:21.095296 systemd[1]: Started sshd@15-10.0.6.5:22-4.153.228.146:54308.service - OpenSSH per-connection server daemon (4.153.228.146:54308). Jan 28 00:01:21.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.6.5:22-4.153.228.146:54308 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:01:21.625000 audit[5325]: USER_ACCT pid=5325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:21.625941 sshd[5325]: Accepted publickey for core from 4.153.228.146 port 54308 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:21.626000 audit[5325]: CRED_ACQ pid=5325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:21.626000 audit[5325]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe204b9d0 a2=3 a3=0 items=0 ppid=1 pid=5325 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:21.626000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:21.628070 sshd-session[5325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:21.632828 systemd-logind[1642]: New session 17 of user core. Jan 28 00:01:21.645968 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 28 00:01:21.647000 audit[5325]: USER_START pid=5325 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:21.649000 audit[5329]: CRED_ACQ pid=5329 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:21.662458 kubelet[2953]: E0128 00:01:21.662420 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:01:21.995594 sshd[5329]: Connection closed by 4.153.228.146 port 54308 Jan 28 00:01:21.996912 sshd-session[5325]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:21.999000 audit[5325]: USER_END pid=5325 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:21.999000 audit[5325]: CRED_DISP pid=5325 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 
00:01:22.002991 systemd[1]: sshd@15-10.0.6.5:22-4.153.228.146:54308.service: Deactivated successfully. Jan 28 00:01:22.002000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.6.5:22-4.153.228.146:54308 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:22.005147 systemd[1]: session-17.scope: Deactivated successfully. Jan 28 00:01:22.007375 systemd-logind[1642]: Session 17 logged out. Waiting for processes to exit. Jan 28 00:01:22.009001 systemd-logind[1642]: Removed session 17. Jan 28 00:01:24.661578 kubelet[2953]: E0128 00:01:24.661529 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:01:27.102436 systemd[1]: Started sshd@16-10.0.6.5:22-4.153.228.146:35094.service - OpenSSH per-connection server daemon (4.153.228.146:35094). Jan 28 00:01:27.101000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.6.5:22-4.153.228.146:35094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:27.103323 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 28 00:01:27.103373 kernel: audit: type=1130 audit(1769558487.101:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.6.5:22-4.153.228.146:35094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:01:27.621000 audit[5342]: USER_ACCT pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.622204 sshd[5342]: Accepted publickey for core from 4.153.228.146 port 35094 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:27.624000 audit[5342]: CRED_ACQ pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.626319 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:27.628470 kernel: audit: type=1101 audit(1769558487.621:793): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.628582 kernel: audit: type=1103 audit(1769558487.624:794): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.630562 kernel: audit: type=1006 audit(1769558487.624:795): pid=5342 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 28 00:01:27.630643 kernel: audit: type=1300 audit(1769558487.624:795): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0a46970 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:27.624000 audit[5342]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0a46970 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:27.634503 systemd-logind[1642]: New session 18 of user core. Jan 28 00:01:27.624000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:27.636518 kernel: audit: type=1327 audit(1769558487.624:795): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:27.638017 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 28 00:01:27.640000 audit[5342]: USER_START pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.642000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.648265 kernel: audit: type=1105 audit(1769558487.640:796): pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.648398 kernel: audit: type=1103 audit(1769558487.642:797): pid=5346 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.665917 kubelet[2953]: E0128 00:01:27.665067 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:01:27.665917 kubelet[2953]: E0128 00:01:27.665217 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:01:27.969935 sshd[5346]: Connection closed by 4.153.228.146 port 35094 Jan 28 00:01:27.970292 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:27.972000 audit[5342]: USER_END pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.975453 systemd[1]: sshd@16-10.0.6.5:22-4.153.228.146:35094.service: 
Deactivated successfully. Jan 28 00:01:27.972000 audit[5342]: CRED_DISP pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.977235 systemd[1]: session-18.scope: Deactivated successfully. Jan 28 00:01:27.978096 systemd-logind[1642]: Session 18 logged out. Waiting for processes to exit. Jan 28 00:01:27.979209 systemd-logind[1642]: Removed session 18. Jan 28 00:01:27.980011 kernel: audit: type=1106 audit(1769558487.972:798): pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.980059 kernel: audit: type=1104 audit(1769558487.972:799): pid=5342 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:27.974000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.6.5:22-4.153.228.146:35094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:29.663379 kubelet[2953]: E0128 00:01:29.663181 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:01:31.662830 kubelet[2953]: E0128 00:01:31.662707 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:01:33.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.6.5:22-4.153.228.146:35102 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:33.075152 systemd[1]: Started sshd@17-10.0.6.5:22-4.153.228.146:35102.service - OpenSSH per-connection server daemon (4.153.228.146:35102). 
Jan 28 00:01:33.075912 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:33.075972 kernel: audit: type=1130 audit(1769558493.074:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.6.5:22-4.153.228.146:35102 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:33.601000 audit[5361]: USER_ACCT pid=5361 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.604751 sshd-session[5361]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:33.605202 sshd[5361]: Accepted publickey for core from 4.153.228.146 port 35102 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:33.602000 audit[5361]: CRED_ACQ pid=5361 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.610702 kernel: audit: type=1101 audit(1769558493.601:802): pid=5361 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.610781 kernel: audit: type=1103 audit(1769558493.602:803): pid=5361 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.613472 kernel: audit: type=1006 audit(1769558493.602:804): pid=5361 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 28 00:01:33.613567 kernel: audit: type=1300 audit(1769558493.602:804): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd4b5a00 a2=3 a3=0 items=0 ppid=1 pid=5361 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:33.602000 audit[5361]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd4b5a00 a2=3 a3=0 items=0 ppid=1 pid=5361 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:33.615919 systemd-logind[1642]: New session 19 of user core. Jan 28 00:01:33.602000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:33.620960 kernel: audit: type=1327 audit(1769558493.602:804): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:33.621395 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 28 00:01:33.624000 audit[5361]: USER_START pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.624000 audit[5365]: CRED_ACQ pid=5365 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.633747 kernel: audit: type=1105 audit(1769558493.624:805): pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.633863 kernel: audit: type=1103 audit(1769558493.624:806): pid=5365 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.975182 sshd[5365]: Connection closed by 4.153.228.146 port 35102 Jan 28 00:01:33.975983 sshd-session[5361]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:33.976000 audit[5361]: USER_END pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.977000 audit[5361]: CRED_DISP pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.982201 systemd[1]: sshd@17-10.0.6.5:22-4.153.228.146:35102.service: Deactivated successfully. Jan 28 00:01:33.984674 systemd[1]: session-19.scope: Deactivated successfully. Jan 28 00:01:33.985245 kernel: audit: type=1106 audit(1769558493.976:807): pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.985313 kernel: audit: type=1104 audit(1769558493.977:808): pid=5361 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:33.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.6.5:22-4.153.228.146:35102 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:33.986387 systemd-logind[1642]: Session 19 logged out. Waiting for processes to exit. Jan 28 00:01:33.987557 systemd-logind[1642]: Removed session 19. 
Jan 28 00:01:34.661769 kubelet[2953]: E0128 00:01:34.661708 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:01:38.663176 kubelet[2953]: E0128 00:01:38.663131 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:01:39.079000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.6.5:22-4.153.228.146:46020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:39.080255 systemd[1]: Started sshd@18-10.0.6.5:22-4.153.228.146:46020.service - OpenSSH per-connection server daemon (4.153.228.146:46020). Jan 28 00:01:39.081227 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:39.081256 kernel: audit: type=1130 audit(1769558499.079:810): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.6.5:22-4.153.228.146:46020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:01:39.596000 audit[5382]: USER_ACCT pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.597869 sshd[5382]: Accepted publickey for core from 4.153.228.146 port 46020 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:39.600000 audit[5382]: CRED_ACQ pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.602058 sshd-session[5382]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:39.605620 kernel: audit: type=1101 audit(1769558499.596:811): pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.605715 kernel: audit: type=1103 audit(1769558499.600:812): pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.605752 kernel: audit: type=1006 audit(1769558499.600:813): pid=5382 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 28 00:01:39.600000 audit[5382]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2beba50 a2=3 a3=0 items=0 ppid=1 pid=5382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:39.610036 kernel: audit: type=1300 audit(1769558499.600:813): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2beba50 a2=3 a3=0 items=0 ppid=1 pid=5382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:39.600000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:39.611588 kernel: audit: type=1327 audit(1769558499.600:813): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:39.615633 systemd-logind[1642]: New session 20 of user core. Jan 28 00:01:39.620915 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 28 00:01:39.622000 audit[5382]: USER_START pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.626000 audit[5386]: CRED_ACQ pid=5386 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.630402 kernel: audit: type=1105 audit(1769558499.622:814): pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.630465 kernel: audit: type=1103 audit(1769558499.626:815): pid=5386 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.968585 sshd[5386]: Connection closed by 4.153.228.146 port 46020 Jan 28 00:01:39.969282 sshd-session[5382]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:39.970000 audit[5382]: USER_END pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.976314 systemd[1]: sshd@18-10.0.6.5:22-4.153.228.146:46020.service: Deactivated successfully. Jan 28 00:01:39.970000 audit[5382]: CRED_DISP pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.978577 systemd[1]: session-20.scope: Deactivated successfully. Jan 28 00:01:39.979552 systemd-logind[1642]: Session 20 logged out. Waiting for processes to exit. Jan 28 00:01:39.981082 kernel: audit: type=1106 audit(1769558499.970:816): pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.981526 kernel: audit: type=1104 audit(1769558499.970:817): pid=5382 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:39.975000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.6.5:22-4.153.228.146:46020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:39.982062 systemd-logind[1642]: Removed session 20. 
Jan 28 00:01:40.663645 kubelet[2953]: E0128 00:01:40.663594 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:01:41.662083 kubelet[2953]: E0128 00:01:41.661929 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:01:42.664230 kubelet[2953]: E0128 00:01:42.664180 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:01:43.664311 kubelet[2953]: E0128 00:01:43.664227 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:01:45.073641 systemd[1]: Started sshd@19-10.0.6.5:22-4.153.228.146:35766.service - OpenSSH per-connection server daemon (4.153.228.146:35766). Jan 28 00:01:45.073000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.6.5:22-4.153.228.146:35766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:01:45.075214 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:45.075269 kernel: audit: type=1130 audit(1769558505.073:819): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.6.5:22-4.153.228.146:35766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:45.591000 audit[5403]: USER_ACCT pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.592783 sshd[5403]: Accepted publickey for core from 4.153.228.146 port 35766 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:45.594000 audit[5403]: CRED_ACQ pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.596595 sshd-session[5403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:45.599122 kernel: audit: type=1101 audit(1769558505.591:820): pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.599190 kernel: audit: type=1103 audit(1769558505.594:821): pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.601213 kernel: audit: type=1006 audit(1769558505.594:822): pid=5403 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 28 00:01:45.594000 audit[5403]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffcdd0a40 a2=3 a3=0 items=0 ppid=1 pid=5403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:45.606171 kernel: audit: type=1300 audit(1769558505.594:822): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffcdd0a40 a2=3 a3=0 items=0 ppid=1 pid=5403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:45.606268 kernel: audit: type=1327 audit(1769558505.594:822): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:45.594000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:45.604953 systemd-logind[1642]: New session 21 of user core. Jan 28 00:01:45.611960 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 28 00:01:45.614000 audit[5403]: USER_START pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.618000 audit[5407]: CRED_ACQ pid=5407 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.622572 kernel: audit: type=1105 audit(1769558505.614:823): pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.622654 kernel: audit: type=1103 audit(1769558505.618:824): pid=5407 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.947021 sshd[5407]: Connection closed by 4.153.228.146 port 35766 Jan 28 00:01:45.947143 sshd-session[5403]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:45.948000 audit[5403]: USER_END pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.951837 systemd[1]: sshd@19-10.0.6.5:22-4.153.228.146:35766.service: Deactivated successfully. Jan 28 00:01:45.948000 audit[5403]: CRED_DISP pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.954031 systemd[1]: session-21.scope: Deactivated successfully. Jan 28 00:01:45.956316 kernel: audit: type=1106 audit(1769558505.948:825): pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.956394 kernel: audit: type=1104 audit(1769558505.948:826): pid=5403 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:45.956645 systemd-logind[1642]: Session 21 logged out. Waiting for processes to exit. Jan 28 00:01:45.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.6.5:22-4.153.228.146:35766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:45.958434 systemd-logind[1642]: Removed session 21. 
Jan 28 00:01:48.661948 kubelet[2953]: E0128 00:01:48.661892 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:01:51.053857 systemd[1]: Started sshd@20-10.0.6.5:22-4.153.228.146:35772.service - OpenSSH per-connection server daemon (4.153.228.146:35772). Jan 28 00:01:51.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.6.5:22-4.153.228.146:35772 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:51.055042 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:51.055131 kernel: audit: type=1130 audit(1769558511.053:828): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.6.5:22-4.153.228.146:35772 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:51.572000 audit[5446]: USER_ACCT pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.576818 sshd[5446]: Accepted publickey for core from 4.153.228.146 port 35772 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:51.576000 audit[5446]: CRED_ACQ pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.580169 kernel: audit: type=1101 audit(1769558511.572:829): pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.580247 kernel: audit: type=1103 audit(1769558511.576:830): pid=5446 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.577663 sshd-session[5446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:51.576000 audit[5446]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1fb0d30 a2=3 a3=0 items=0 ppid=1 pid=5446 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:51.582837 kernel: audit: type=1006 audit(1769558511.576:831): pid=5446 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 28 00:01:51.576000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:51.587930 kernel: audit: 
type=1300 audit(1769558511.576:831): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1fb0d30 a2=3 a3=0 items=0 ppid=1 pid=5446 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:51.588012 kernel: audit: type=1327 audit(1769558511.576:831): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:51.590963 systemd-logind[1642]: New session 22 of user core. Jan 28 00:01:51.598956 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 28 00:01:51.601000 audit[5446]: USER_START pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.606761 kernel: audit: type=1105 audit(1769558511.601:832): pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.606000 audit[5450]: CRED_ACQ pid=5450 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.610770 kernel: audit: type=1103 audit(1769558511.606:833): pid=5450 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.666935 kubelet[2953]: E0128 00:01:51.666871 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:01:51.921098 sshd[5450]: Connection closed by 4.153.228.146 port 35772 Jan 28 00:01:51.920925 sshd-session[5446]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:51.924000 audit[5446]: USER_END pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.931682 systemd[1]: sshd@20-10.0.6.5:22-4.153.228.146:35772.service: Deactivated successfully. 
Jan 28 00:01:51.924000 audit[5446]: CRED_DISP pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.935476 kernel: audit: type=1106 audit(1769558511.924:834): pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.935558 kernel: audit: type=1104 audit(1769558511.924:835): pid=5446 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:51.934080 systemd[1]: session-22.scope: Deactivated successfully. Jan 28 00:01:51.931000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.6.5:22-4.153.228.146:35772 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:51.937436 systemd-logind[1642]: Session 22 logged out. Waiting for processes to exit. Jan 28 00:01:51.938836 systemd-logind[1642]: Removed session 22. Jan 28 00:01:52.662338 kubelet[2953]: E0128 00:01:52.662283 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:01:53.662896 kubelet[2953]: E0128 00:01:53.662850 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:01:54.661790 kubelet[2953]: E0128 00:01:54.661721 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:01:56.662172 kubelet[2953]: E0128 00:01:56.662123 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:01:57.025675 systemd[1]: Started sshd@21-10.0.6.5:22-4.153.228.146:35010.service - OpenSSH per-connection server daemon (4.153.228.146:35010). Jan 28 00:01:57.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.6.5:22-4.153.228.146:35010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:57.029389 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:01:57.029474 kernel: audit: type=1130 audit(1769558517.025:837): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.6.5:22-4.153.228.146:35010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:01:57.540637 sshd[5470]: Accepted publickey for core from 4.153.228.146 port 35010 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:57.539000 audit[5470]: USER_ACCT pid=5470 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.543000 audit[5470]: CRED_ACQ pid=5470 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.544873 sshd-session[5470]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:57.546995 kernel: audit: type=1101 audit(1769558517.539:838): pid=5470 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.547089 kernel: audit: type=1103 audit(1769558517.543:839): pid=5470 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.549105 kernel: audit: type=1006 audit(1769558517.543:840): pid=5470 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 28 00:01:57.543000 audit[5470]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe20544e0 a2=3 a3=0 items=0 ppid=1 pid=5470 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:57.552911 kernel: audit: type=1300 audit(1769558517.543:840): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe20544e0 a2=3 a3=0 items=0 ppid=1 pid=5470 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:57.543000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:57.554223 kernel: audit: type=1327 audit(1769558517.543:840): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:57.553229 systemd-logind[1642]: New session 23 of user core. Jan 28 00:01:57.563954 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 28 00:01:57.566000 audit[5470]: USER_START pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.570000 audit[5474]: CRED_ACQ pid=5474 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.574573 kernel: audit: type=1105 audit(1769558517.566:841): pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.574673 kernel: audit: type=1103 audit(1769558517.570:842): pid=5474 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.901419 sshd[5474]: Connection closed by 4.153.228.146 port 35010 Jan 28 00:01:57.900692 sshd-session[5470]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:57.901000 audit[5470]: USER_END pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.901000 audit[5470]: CRED_DISP pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.907701 systemd[1]: sshd@21-10.0.6.5:22-4.153.228.146:35010.service: Deactivated successfully. Jan 28 00:01:57.910572 kernel: audit: type=1106 audit(1769558517.901:843): pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.910649 kernel: audit: type=1104 audit(1769558517.901:844): pid=5470 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:57.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.6.5:22-4.153.228.146:35010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:57.910486 systemd[1]: session-23.scope: Deactivated successfully. Jan 28 00:01:57.912168 systemd-logind[1642]: Session 23 logged out. Waiting for processes to exit. Jan 28 00:01:57.915957 systemd-logind[1642]: Removed session 23. 
Jan 28 00:01:58.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.6.5:22-4.153.228.146:35014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:58.010257 systemd[1]: Started sshd@22-10.0.6.5:22-4.153.228.146:35014.service - OpenSSH per-connection server daemon (4.153.228.146:35014). Jan 28 00:01:58.534000 audit[5487]: USER_ACCT pid=5487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.535294 sshd[5487]: Accepted publickey for core from 4.153.228.146 port 35014 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:58.535000 audit[5487]: CRED_ACQ pid=5487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.535000 audit[5487]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf47a460 a2=3 a3=0 items=0 ppid=1 pid=5487 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:58.535000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:58.537396 sshd-session[5487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:58.542850 systemd-logind[1642]: New session 24 of user core. Jan 28 00:01:58.553007 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 28 00:01:58.555000 audit[5487]: USER_START pid=5487 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.557000 audit[5491]: CRED_ACQ pid=5491 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.974469 sshd[5491]: Connection closed by 4.153.228.146 port 35014 Jan 28 00:01:58.974320 sshd-session[5487]: pam_unix(sshd:session): session closed for user core Jan 28 00:01:58.975000 audit[5487]: USER_END pid=5487 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.975000 audit[5487]: CRED_DISP pid=5487 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:58.980067 systemd[1]: sshd@22-10.0.6.5:22-4.153.228.146:35014.service: Deactivated successfully. 
Jan 28 00:01:58.979000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.6.5:22-4.153.228.146:35014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:58.982346 systemd[1]: session-24.scope: Deactivated successfully. Jan 28 00:01:58.983367 systemd-logind[1642]: Session 24 logged out. Waiting for processes to exit. Jan 28 00:01:58.984586 systemd-logind[1642]: Removed session 24. Jan 28 00:01:59.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.6.5:22-4.153.228.146:35024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:01:59.081178 systemd[1]: Started sshd@23-10.0.6.5:22-4.153.228.146:35024.service - OpenSSH per-connection server daemon (4.153.228.146:35024). Jan 28 00:01:59.601000 audit[5502]: USER_ACCT pid=5502 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:59.602487 sshd[5502]: Accepted publickey for core from 4.153.228.146 port 35024 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:01:59.602000 audit[5502]: CRED_ACQ pid=5502 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:59.602000 audit[5502]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff59cb050 a2=3 a3=0 items=0 ppid=1 pid=5502 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:01:59.602000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:01:59.604092 sshd-session[5502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:01:59.608237 systemd-logind[1642]: New session 25 of user core. Jan 28 00:01:59.618225 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 28 00:01:59.620000 audit[5502]: USER_START pid=5502 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:01:59.621000 audit[5506]: CRED_ACQ pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:00.426000 audit[5518]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5518 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:00.426000 audit[5518]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffda77dbc0 a2=0 a3=1 items=0 ppid=3063 pid=5518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:00.426000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:00.432000 audit[5518]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5518 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:00.432000 audit[5518]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffda77dbc0 a2=0 a3=1 items=0 ppid=3063 pid=5518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:00.432000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:00.528164 sshd[5506]: Connection closed by 4.153.228.146 port 35024 Jan 28 00:02:00.528466 sshd-session[5502]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:00.530000 audit[5502]: USER_END pid=5502 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:00.530000 audit[5502]: CRED_DISP pid=5502 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:00.535493 systemd[1]: sshd@23-10.0.6.5:22-4.153.228.146:35024.service: Deactivated successfully. Jan 28 00:02:00.537000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.6.5:22-4.153.228.146:35024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:00.540697 systemd[1]: session-25.scope: Deactivated successfully. Jan 28 00:02:00.542963 systemd-logind[1642]: Session 25 logged out. Waiting for processes to exit. Jan 28 00:02:00.546492 systemd-logind[1642]: Removed session 25. 
Jan 28 00:02:00.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.6.5:22-4.153.228.146:35032 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:00.638054 systemd[1]: Started sshd@24-10.0.6.5:22-4.153.228.146:35032.service - OpenSSH per-connection server daemon (4.153.228.146:35032). Jan 28 00:02:00.661709 kubelet[2953]: E0128 00:02:00.661654 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:02:01.170000 audit[5523]: USER_ACCT pid=5523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.171482 sshd[5523]: Accepted publickey for core from 4.153.228.146 port 35032 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:01.171000 audit[5523]: CRED_ACQ pid=5523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.171000 audit[5523]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe276fa50 a2=3 a3=0 items=0 ppid=1 pid=5523 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:01.171000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:01.173355 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:01.181104 systemd-logind[1642]: New session 26 of user core. Jan 28 00:02:01.186204 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 28 00:02:01.189000 audit[5523]: USER_START pid=5523 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.191000 audit[5527]: CRED_ACQ pid=5527 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.451000 audit[5535]: NETFILTER_CFG table=filter:144 family=2 entries=38 op=nft_register_rule pid=5535 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:01.451000 audit[5535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc0e7b130 a2=0 a3=1 items=0 ppid=3063 pid=5535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:01.451000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:01.455000 audit[5535]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5535 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:01.455000 audit[5535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc0e7b130 a2=0 a3=1 items=0 ppid=3063 pid=5535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:01.455000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:01.643438 sshd[5527]: Connection closed by 4.153.228.146 port 35032 Jan 28 00:02:01.643782 sshd-session[5523]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:01.644000 audit[5523]: USER_END pid=5523 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.644000 audit[5523]: CRED_DISP pid=5523 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:01.647917 systemd[1]: sshd@24-10.0.6.5:22-4.153.228.146:35032.service: Deactivated successfully. Jan 28 00:02:01.647000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.6.5:22-4.153.228.146:35032 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:01.650057 systemd[1]: session-26.scope: Deactivated successfully. Jan 28 00:02:01.650843 systemd-logind[1642]: Session 26 logged out. Waiting for processes to exit. Jan 28 00:02:01.652270 systemd-logind[1642]: Removed session 26. 
Jan 28 00:02:01.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.6.5:22-4.153.228.146:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:01.750021 systemd[1]: Started sshd@25-10.0.6.5:22-4.153.228.146:35042.service - OpenSSH per-connection server daemon (4.153.228.146:35042). Jan 28 00:02:02.261000 audit[5541]: USER_ACCT pid=5541 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.262668 sshd[5541]: Accepted publickey for core from 4.153.228.146 port 35042 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:02.264176 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 28 00:02:02.264246 kernel: audit: type=1101 audit(1769558522.261:878): pid=5541 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.264508 sshd-session[5541]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:02.263000 audit[5541]: CRED_ACQ pid=5541 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.269614 kernel: audit: type=1103 audit(1769558522.263:879): pid=5541 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.272068 kernel: audit: type=1006 audit(1769558522.263:880): pid=5541 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 28 00:02:02.263000 audit[5541]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe751c0b0 a2=3 a3=0 items=0 ppid=1 pid=5541 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:02.275488 kernel: audit: type=1300 audit(1769558522.263:880): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe751c0b0 a2=3 a3=0 items=0 ppid=1 pid=5541 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:02.263000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:02.277157 kernel: audit: type=1327 audit(1769558522.263:880): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:02.277790 systemd-logind[1642]: New session 27 of user core. Jan 28 00:02:02.284065 systemd[1]: Started session-27.scope - Session 27 of User core. 
Jan 28 00:02:02.286000 audit[5541]: USER_START pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.290000 audit[5545]: CRED_ACQ pid=5545 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.293974 kernel: audit: type=1105 audit(1769558522.286:881): pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.294043 kernel: audit: type=1103 audit(1769558522.290:882): pid=5545 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.635225 sshd[5545]: Connection closed by 4.153.228.146 port 35042 Jan 28 00:02:02.635938 sshd-session[5541]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:02.636000 audit[5541]: USER_END pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.640618 systemd[1]: sshd@25-10.0.6.5:22-4.153.228.146:35042.service: Deactivated successfully. Jan 28 00:02:02.636000 audit[5541]: CRED_DISP pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.643007 systemd[1]: session-27.scope: Deactivated successfully. Jan 28 00:02:02.644585 kernel: audit: type=1106 audit(1769558522.636:883): pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.644694 kernel: audit: type=1104 audit(1769558522.636:884): pid=5541 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:02.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.6.5:22-4.153.228.146:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:02:02.647940 kernel: audit: type=1131 audit(1769558522.640:885): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.6.5:22-4.153.228.146:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:02.648483 systemd-logind[1642]: Session 27 logged out. Waiting for processes to exit. Jan 28 00:02:02.649780 systemd-logind[1642]: Removed session 27. Jan 28 00:02:04.662520 kubelet[2953]: E0128 00:02:04.662461 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:02:04.856000 audit[5561]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5561 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:04.856000 audit[5561]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd65114e0 a2=0 a3=1 items=0 ppid=3063 pid=5561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:04.856000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:04.865000 audit[5561]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5561 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:02:04.865000 audit[5561]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffd65114e0 a2=0 a3=1 items=0 ppid=3063 pid=5561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:04.865000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:02:05.665705 kubelet[2953]: E0128 00:02:05.665659 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:02:06.661812 kubelet[2953]: E0128 00:02:06.661751 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:02:07.661711 kubelet[2953]: E0128 00:02:07.661640 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:02:07.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.6.5:22-4.153.228.146:38536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:07.740154 systemd[1]: Started sshd@26-10.0.6.5:22-4.153.228.146:38536.service - OpenSSH per-connection server daemon (4.153.228.146:38536). Jan 28 00:02:07.740899 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 28 00:02:07.740930 kernel: audit: type=1130 audit(1769558527.739:888): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.6.5:22-4.153.228.146:38536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:08.264000 audit[5565]: USER_ACCT pid=5565 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.265907 sshd[5565]: Accepted publickey for core from 4.153.228.146 port 38536 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:08.269269 sshd-session[5565]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:08.267000 audit[5565]: CRED_ACQ pid=5565 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.272435 kernel: audit: type=1101 audit(1769558528.264:889): pid=5565 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.272491 kernel: audit: type=1103 audit(1769558528.267:890): pid=5565 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.274398 kernel: audit: type=1006 audit(1769558528.267:891): pid=5565 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Jan 28 00:02:08.274441 kernel: audit: type=1300 
audit(1769558528.267:891): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3655a40 a2=3 a3=0 items=0 ppid=1 pid=5565 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:08.267000 audit[5565]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3655a40 a2=3 a3=0 items=0 ppid=1 pid=5565 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:08.267000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:08.278834 kernel: audit: type=1327 audit(1769558528.267:891): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:08.281353 systemd-logind[1642]: New session 28 of user core. Jan 28 00:02:08.287953 systemd[1]: Started session-28.scope - Session 28 of User core. Jan 28 00:02:08.289000 audit[5565]: USER_START pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.294755 kernel: audit: type=1105 audit(1769558528.289:892): pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.295583 kernel: audit: type=1103 audit(1769558528.294:893): pid=5571 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.294000 audit[5571]: CRED_ACQ pid=5571 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.625699 sshd[5571]: Connection closed by 4.153.228.146 port 38536 Jan 28 00:02:08.626606 sshd-session[5565]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:08.630000 audit[5565]: USER_END pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.636085 systemd[1]: sshd@26-10.0.6.5:22-4.153.228.146:38536.service: Deactivated successfully. Jan 28 00:02:08.630000 audit[5565]: CRED_DISP pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.638446 systemd[1]: session-28.scope: Deactivated successfully. 
Jan 28 00:02:08.640001 kernel: audit: type=1106 audit(1769558528.630:894): pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.640084 kernel: audit: type=1104 audit(1769558528.630:895): pid=5565 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:08.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.6.5:22-4.153.228.146:38536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:08.643839 systemd-logind[1642]: Session 28 logged out. Waiting for processes to exit. Jan 28 00:02:08.645091 systemd-logind[1642]: Removed session 28. Jan 28 00:02:10.662397 containerd[1664]: time="2026-01-28T00:02:10.662355493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:02:11.007992 containerd[1664]: time="2026-01-28T00:02:11.007709196Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:11.010133 containerd[1664]: time="2026-01-28T00:02:11.010084763Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:02:11.010225 containerd[1664]: time="2026-01-28T00:02:11.010178484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:11.010399 kubelet[2953]: E0128 00:02:11.010360 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:02:11.010658 kubelet[2953]: E0128 00:02:11.010412 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:02:11.010658 kubelet[2953]: E0128 00:02:11.010494 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:11.011635 containerd[1664]: time="2026-01-28T00:02:11.011430647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:02:11.344399 containerd[1664]: time="2026-01-28T00:02:11.344113551Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:11.346263 containerd[1664]: time="2026-01-28T00:02:11.346224757Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:02:11.346353 containerd[1664]: time="2026-01-28T00:02:11.346302038Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:11.346495 kubelet[2953]: E0128 00:02:11.346457 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:02:11.346556 kubelet[2953]: E0128 00:02:11.346503 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:02:11.346601 kubelet[2953]: E0128 00:02:11.346580 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-79c89656f-55nfw_calico-system(a85ad95c-92af-4836-ae16-c3e124882e38): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:11.346658 kubelet[2953]: E0128 00:02:11.346632 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:02:13.735753 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:02:13.735886 kernel: audit: type=1130 audit(1769558533.733:897): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.6.5:22-4.153.228.146:38538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:13.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.6.5:22-4.153.228.146:38538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:13.734229 systemd[1]: Started sshd@27-10.0.6.5:22-4.153.228.146:38538.service - OpenSSH per-connection server daemon (4.153.228.146:38538). 
Jan 28 00:02:14.257000 audit[5585]: USER_ACCT pid=5585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.258862 sshd[5585]: Accepted publickey for core from 4.153.228.146 port 38538 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:14.261670 sshd-session[5585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:14.260000 audit[5585]: CRED_ACQ pid=5585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.265476 kernel: audit: type=1101 audit(1769558534.257:898): pid=5585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.265545 kernel: audit: type=1103 audit(1769558534.260:899): pid=5585 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.265565 kernel: audit: type=1006 audit(1769558534.260:900): pid=5585 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Jan 28 00:02:14.266609 systemd-logind[1642]: New session 29 of user core. Jan 28 00:02:14.260000 audit[5585]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff6e8e8b0 a2=3 a3=0 items=0 ppid=1 pid=5585 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:14.271126 kernel: audit: type=1300 audit(1769558534.260:900): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff6e8e8b0 a2=3 a3=0 items=0 ppid=1 pid=5585 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:14.271182 kernel: audit: type=1327 audit(1769558534.260:900): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:14.260000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:14.277969 systemd[1]: Started session-29.scope - Session 29 of User core. 
Jan 28 00:02:14.279000 audit[5585]: USER_START pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.282000 audit[5589]: CRED_ACQ pid=5589 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.288211 kernel: audit: type=1105 audit(1769558534.279:901): pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.288305 kernel: audit: type=1103 audit(1769558534.282:902): pid=5589 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.604016 sshd[5589]: Connection closed by 4.153.228.146 port 38538 Jan 28 00:02:14.604422 sshd-session[5585]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:14.605000 audit[5585]: USER_END pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.609151 systemd-logind[1642]: Session 29 logged out. Waiting for processes to exit. Jan 28 00:02:14.609795 systemd[1]: sshd@27-10.0.6.5:22-4.153.228.146:38538.service: Deactivated successfully. Jan 28 00:02:14.605000 audit[5585]: CRED_DISP pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.612763 kernel: audit: type=1106 audit(1769558534.605:903): pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.612830 kernel: audit: type=1104 audit(1769558534.605:904): pid=5585 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:14.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.6.5:22-4.153.228.146:38538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:14.612959 systemd[1]: session-29.scope: Deactivated successfully. Jan 28 00:02:14.617125 systemd-logind[1642]: Removed session 29. 
Jan 28 00:02:15.662255 kubelet[2953]: E0128 00:02:15.662199 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:02:16.661683 kubelet[2953]: E0128 00:02:16.661614 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:02:16.662501 kubelet[2953]: E0128 00:02:16.662447 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:02:18.663751 containerd[1664]: time="2026-01-28T00:02:18.663697469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:02:19.004698 containerd[1664]: time="2026-01-28T00:02:19.004339716Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:19.006375 containerd[1664]: time="2026-01-28T00:02:19.006262282Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:02:19.006375 containerd[1664]: time="2026-01-28T00:02:19.006286082Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:19.006554 kubelet[2953]: E0128 00:02:19.006513 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:02:19.007279 kubelet[2953]: E0128 00:02:19.006570 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:02:19.007279 kubelet[2953]: E0128 00:02:19.006663 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-fcc6fc97b-npt4s_calico-system(2e0df84e-16dc-494b-a3d3-1071788a0777): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:19.007279 kubelet[2953]: E0128 00:02:19.006695 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:02:19.666305 containerd[1664]: time="2026-01-28T00:02:19.665936230Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:02:19.714447 systemd[1]: Started sshd@28-10.0.6.5:22-4.153.228.146:58410.service - OpenSSH per-connection server daemon (4.153.228.146:58410). Jan 28 00:02:19.713000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.6.5:22-4.153.228.146:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:19.717846 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:02:19.717922 kernel: audit: type=1130 audit(1769558539.713:906): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.6.5:22-4.153.228.146:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:02:20.034772 containerd[1664]: time="2026-01-28T00:02:20.034701164Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:20.036189 containerd[1664]: time="2026-01-28T00:02:20.036145568Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:02:20.036295 containerd[1664]: time="2026-01-28T00:02:20.036187689Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:20.036427 kubelet[2953]: E0128 00:02:20.036366 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:02:20.036427 kubelet[2953]: E0128 00:02:20.036414 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:02:20.036790 kubelet[2953]: E0128 00:02:20.036497 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-jlz2t_calico-apiserver(cca7bd93-1a7d-448c-ad36-6b956cecc82e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:20.036790 kubelet[2953]: E0128 00:02:20.036527 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:02:20.243000 audit[5630]: USER_ACCT pid=5630 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.243996 sshd[5630]: Accepted publickey for core from 4.153.228.146 port 58410 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:20.247750 kernel: audit: type=1101 audit(1769558540.243:907): pid=5630 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.247000 audit[5630]: CRED_ACQ pid=5630 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.249516 sshd-session[5630]: 
pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:20.253778 kernel: audit: type=1103 audit(1769558540.247:908): pid=5630 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.253858 kernel: audit: type=1006 audit(1769558540.247:909): pid=5630 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Jan 28 00:02:20.247000 audit[5630]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe3212990 a2=3 a3=0 items=0 ppid=1 pid=5630 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:20.258340 kernel: audit: type=1300 audit(1769558540.247:909): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe3212990 a2=3 a3=0 items=0 ppid=1 pid=5630 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:20.247000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:20.260388 kernel: audit: type=1327 audit(1769558540.247:909): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:20.261014 systemd-logind[1642]: New session 30 of user core. Jan 28 00:02:20.265958 systemd[1]: Started session-30.scope - Session 30 of User core. Jan 28 00:02:20.268000 audit[5630]: USER_START pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.273771 kernel: audit: type=1105 audit(1769558540.268:910): pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.273857 kernel: audit: type=1103 audit(1769558540.272:911): pid=5634 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.272000 audit[5634]: CRED_ACQ pid=5634 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.612865 sshd[5634]: Connection closed by 4.153.228.146 port 58410 Jan 28 00:02:20.613348 sshd-session[5630]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:20.613000 audit[5630]: USER_END pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.617682 systemd[1]: sshd@28-10.0.6.5:22-4.153.228.146:58410.service: Deactivated successfully. Jan 28 00:02:20.614000 audit[5630]: CRED_DISP pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.619578 systemd[1]: session-30.scope: Deactivated successfully. Jan 28 00:02:20.621972 kernel: audit: type=1106 audit(1769558540.613:912): pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.622162 kernel: audit: type=1104 audit(1769558540.614:913): pid=5630 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:20.617000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.6.5:22-4.153.228.146:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:20.622488 systemd-logind[1642]: Session 30 logged out. Waiting for processes to exit. Jan 28 00:02:20.623470 systemd-logind[1642]: Removed session 30. Jan 28 00:02:25.734423 systemd[1]: Started sshd@29-10.0.6.5:22-4.153.228.146:34878.service - OpenSSH per-connection server daemon (4.153.228.146:34878). Jan 28 00:02:25.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.6.5:22-4.153.228.146:34878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:25.735355 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:02:25.735492 kernel: audit: type=1130 audit(1769558545.733:915): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.6.5:22-4.153.228.146:34878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:02:26.263000 audit[5662]: USER_ACCT pid=5662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.265936 sshd[5662]: Accepted publickey for core from 4.153.228.146 port 34878 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:26.269294 sshd-session[5662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:26.267000 audit[5662]: CRED_ACQ pid=5662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.275610 kernel: audit: type=1101 audit(1769558546.263:916): pid=5662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.275735 kernel: audit: type=1103 audit(1769558546.267:917): pid=5662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.278586 kernel: audit: type=1006 audit(1769558546.267:918): pid=5662 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Jan 28 00:02:26.267000 audit[5662]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0dc89e0 a2=3 a3=0 items=0 ppid=1 pid=5662 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:26.279819 systemd-logind[1642]: New session 31 of user core. Jan 28 00:02:26.283178 kernel: audit: type=1300 audit(1769558546.267:918): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0dc89e0 a2=3 a3=0 items=0 ppid=1 pid=5662 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:26.267000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:26.285870 systemd[1]: Started session-31.scope - Session 31 of User core. 
Jan 28 00:02:26.287468 kernel: audit: type=1327 audit(1769558546.267:918): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:26.291000 audit[5662]: USER_START pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.296000 audit[5666]: CRED_ACQ pid=5666 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.300271 kernel: audit: type=1105 audit(1769558546.291:919): pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.300430 kernel: audit: type=1103 audit(1769558546.296:920): pid=5666 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.635014 sshd[5666]: Connection closed by 4.153.228.146 port 34878 Jan 28 00:02:26.635340 sshd-session[5662]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:26.636000 audit[5662]: USER_END pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.639865 systemd[1]: sshd@29-10.0.6.5:22-4.153.228.146:34878.service: Deactivated successfully. Jan 28 00:02:26.636000 audit[5662]: CRED_DISP pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.642278 systemd[1]: session-31.scope: Deactivated successfully. Jan 28 00:02:26.643051 systemd-logind[1642]: Session 31 logged out. Waiting for processes to exit. 
Jan 28 00:02:26.643930 kernel: audit: type=1106 audit(1769558546.636:921): pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.643990 kernel: audit: type=1104 audit(1769558546.636:922): pid=5662 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:26.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.6.5:22-4.153.228.146:34878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:26.644459 systemd-logind[1642]: Removed session 31. Jan 28 00:02:26.665024 kubelet[2953]: E0128 00:02:26.662903 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:02:28.662448 containerd[1664]: time="2026-01-28T00:02:28.662406008Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:02:28.988549 containerd[1664]: time="2026-01-28T00:02:28.988429559Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:28.989893 containerd[1664]: time="2026-01-28T00:02:28.989846363Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:02:28.990020 containerd[1664]: time="2026-01-28T00:02:28.989944043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:28.990169 kubelet[2953]: E0128 00:02:28.990119 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:02:28.990487 kubelet[2953]: E0128 00:02:28.990223 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:02:28.990487 
kubelet[2953]: E0128 00:02:28.990303 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sv2v9_calico-system(e61938e0-7077-4d81-9b34-6430d54d8b9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:28.990487 kubelet[2953]: E0128 00:02:28.990334 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:02:29.663180 kubelet[2953]: E0128 00:02:29.663125 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:02:30.662284 kubelet[2953]: E0128 00:02:30.662239 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:02:31.663546 containerd[1664]: time="2026-01-28T00:02:31.663119877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:02:31.739782 systemd[1]: Started sshd@30-10.0.6.5:22-4.153.228.146:34880.service - OpenSSH per-connection server daemon (4.153.228.146:34880). Jan 28 00:02:31.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.6.5:22-4.153.228.146:34880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:31.741210 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:02:31.741285 kernel: audit: type=1130 audit(1769558551.739:924): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.6.5:22-4.153.228.146:34880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:02:31.993479 containerd[1664]: time="2026-01-28T00:02:31.993243653Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:31.994830 containerd[1664]: time="2026-01-28T00:02:31.994702417Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:02:31.994830 containerd[1664]: time="2026-01-28T00:02:31.994747978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:31.995297 kubelet[2953]: E0128 00:02:31.994980 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:02:31.995297 kubelet[2953]: E0128 00:02:31.995077 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:02:31.995297 kubelet[2953]: E0128 00:02:31.995215 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-bb4448d88-2hkvv_calico-apiserver(7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:31.995297 kubelet[2953]: E0128 00:02:31.995248 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:02:32.248000 audit[5689]: USER_ACCT pid=5689 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.252790 sshd[5689]: Accepted publickey for core from 4.153.228.146 port 34880 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:02:32.252610 sshd-session[5689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:02:32.249000 audit[5689]: CRED_ACQ pid=5689 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.254797 kernel: audit: type=1101 audit(1769558552.248:925): pid=5689 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.265461 kernel: audit: type=1103 audit(1769558552.249:926): pid=5689 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.265548 kernel: audit: type=1006 audit(1769558552.249:927): pid=5689 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Jan 28 00:02:32.265569 kernel: audit: type=1300 audit(1769558552.249:927): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc08aa650 a2=3 a3=0 items=0 ppid=1 pid=5689 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:32.249000 audit[5689]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc08aa650 a2=3 a3=0 items=0 ppid=1 pid=5689 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:02:32.249000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:32.267240 kernel: audit: type=1327 audit(1769558552.249:927): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:02:32.267773 systemd-logind[1642]: New session 32 of user core. Jan 28 00:02:32.283039 systemd[1]: Started session-32.scope - Session 32 of User core. Jan 28 00:02:32.286000 audit[5689]: USER_START pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.288000 audit[5696]: CRED_ACQ pid=5696 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.294456 kernel: audit: type=1105 audit(1769558552.286:928): pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.294515 kernel: audit: type=1103 audit(1769558552.288:929): pid=5696 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.603648 sshd[5696]: Connection closed by 4.153.228.146 port 34880 Jan 28 00:02:32.602918 sshd-session[5689]: pam_unix(sshd:session): session closed for user core Jan 28 00:02:32.603000 audit[5689]: USER_END pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.611131 systemd[1]: sshd@30-10.0.6.5:22-4.153.228.146:34880.service: Deactivated successfully. Jan 28 00:02:32.603000 audit[5689]: CRED_DISP pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.613255 systemd[1]: session-32.scope: Deactivated successfully. Jan 28 00:02:32.615555 kernel: audit: type=1106 audit(1769558552.603:930): pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.615605 kernel: audit: type=1104 audit(1769558552.603:931): pid=5689 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:02:32.610000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.6.5:22-4.153.228.146:34880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:02:32.623137 systemd-logind[1642]: Session 32 logged out. Waiting for processes to exit. Jan 28 00:02:32.624076 systemd-logind[1642]: Removed session 32. Jan 28 00:02:34.662584 kubelet[2953]: E0128 00:02:34.662539 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:02:39.663648 kubelet[2953]: E0128 00:02:39.663559 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:02:40.661790 containerd[1664]: time="2026-01-28T00:02:40.661720815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:02:41.021045 containerd[1664]: time="2026-01-28T00:02:41.020999720Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 28 00:02:41.022488 containerd[1664]: time="2026-01-28T00:02:41.022435485Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:02:41.022559 containerd[1664]: time="2026-01-28T00:02:41.022507005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:41.022733 kubelet[2953]: E0128 00:02:41.022694 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:02:41.022999 kubelet[2953]: E0128 00:02:41.022754 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:02:41.022999 kubelet[2953]: E0128 00:02:41.022847 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:41.023849 containerd[1664]: time="2026-01-28T00:02:41.023820769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:02:41.368620 containerd[1664]: time="2026-01-28T00:02:41.367887668Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:02:41.370076 containerd[1664]: time="2026-01-28T00:02:41.369954074Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:02:41.370259 containerd[1664]: time="2026-01-28T00:02:41.370012874Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:02:41.370490 kubelet[2953]: E0128 00:02:41.370449 2953 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:02:41.370542 kubelet[2953]: E0128 00:02:41.370500 2953 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:02:41.370580 kubelet[2953]: E0128 00:02:41.370566 2953 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod 
csi-node-driver-bmscm_calico-system(1d6f938d-8e51-4e63-b408-0de368dbd7d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:02:41.370635 kubelet[2953]: E0128 00:02:41.370607 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:02:41.663274 kubelet[2953]: E0128 00:02:41.663155 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:02:42.662283 kubelet[2953]: E0128 00:02:42.662213 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:02:45.663356 kubelet[2953]: E0128 00:02:45.663201 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:02:46.662286 kubelet[2953]: E0128 00:02:46.662166 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:02:53.663609 kubelet[2953]: E0128 00:02:53.663553 2953 pod_workers.go:1324] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:02:53.663609 kubelet[2953]: E0128 00:02:53.663590 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:02:54.661933 kubelet[2953]: E0128 00:02:54.661877 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:02:55.662760 kubelet[2953]: E0128 00:02:55.662674 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:02:59.662142 kubelet[2953]: E0128 00:02:59.662077 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" 
Jan 28 00:03:01.662539 kubelet[2953]: E0128 00:03:01.662481 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:03:04.166890 systemd[1]: cri-containerd-0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d.scope: Deactivated successfully. Jan 28 00:03:04.167226 systemd[1]: cri-containerd-0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d.scope: Consumed 40.960s CPU time, 111.5M memory peak. Jan 28 00:03:04.169306 containerd[1664]: time="2026-01-28T00:03:04.169271620Z" level=info msg="received container exit event container_id:\"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\" id:\"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\" pid:3283 exit_status:1 exited_at:{seconds:1769558584 nanos:167885495}" Jan 28 00:03:04.170000 audit: BPF prog-id=146 op=UNLOAD Jan 28 00:03:04.172790 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:03:04.172850 kernel: audit: type=1334 audit(1769558584.170:933): prog-id=146 op=UNLOAD Jan 28 00:03:04.170000 audit: BPF prog-id=150 op=UNLOAD Jan 28 00:03:04.174457 kernel: audit: type=1334 audit(1769558584.170:934): prog-id=150 op=UNLOAD Jan 28 00:03:04.189792 systemd[1790]: Created slice background.slice - User Background Tasks Slice. Jan 28 00:03:04.190360 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d-rootfs.mount: Deactivated successfully. Jan 28 00:03:04.191346 systemd[1790]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 28 00:03:04.212746 systemd[1790]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 28 00:03:04.379544 kubelet[2953]: I0128 00:03:04.379491 2953 scope.go:117] "RemoveContainer" containerID="0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d" Jan 28 00:03:04.381769 containerd[1664]: time="2026-01-28T00:03:04.381307952Z" level=info msg="CreateContainer within sandbox \"c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 28 00:03:04.390679 containerd[1664]: time="2026-01-28T00:03:04.390104899Z" level=info msg="Container 3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:03:04.401940 containerd[1664]: time="2026-01-28T00:03:04.401872695Z" level=info msg="CreateContainer within sandbox \"c246844e81815ec093600f62ce0f6e077aa53c198be96c7eab841f3439b9d5d7\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800\"" Jan 28 00:03:04.402470 containerd[1664]: time="2026-01-28T00:03:04.402445137Z" level=info msg="StartContainer for \"3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800\"" Jan 28 00:03:04.403494 containerd[1664]: time="2026-01-28T00:03:04.403462980Z" level=info msg="connecting to shim 3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800" address="unix:///run/containerd/s/d2ad2c05e2e2d28b0102df5b46a3e072f419d4f1b4535737cc41b2f217ce93f1" protocol=ttrpc version=3 Jan 28 00:03:04.427119 systemd[1]: Started cri-containerd-3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800.scope - libcontainer container 3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800. Jan 28 00:03:04.435000 audit: BPF prog-id=256 op=LOAD Jan 28 00:03:04.437000 audit: BPF prog-id=257 op=LOAD Jan 28 00:03:04.439294 kernel: audit: type=1334 audit(1769558584.435:935): prog-id=256 op=LOAD Jan 28 00:03:04.439342 kernel: audit: type=1334 audit(1769558584.437:936): prog-id=257 op=LOAD Jan 28 00:03:04.439378 kernel: audit: type=1300 audit(1769558584.437:936): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.437000 audit[5757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.437000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.447130 kernel: audit: type=1327 audit(1769558584.437:936): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.447282 kernel: audit: type=1334 audit(1769558584.437:937): prog-id=257 op=UNLOAD Jan 28 00:03:04.437000 audit: BPF prog-id=257 op=UNLOAD Jan 28 00:03:04.437000 audit[5757]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.451800 kernel: audit: type=1300 audit(1769558584.437:937): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.437000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.455934 kernel: audit: type=1327 audit(1769558584.437:937): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.456089 kernel: audit: type=1334 audit(1769558584.441:938): prog-id=258 op=LOAD Jan 28 00:03:04.441000 audit: BPF prog-id=258 op=LOAD Jan 28 00:03:04.441000 audit[5757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.441000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.441000 audit: BPF prog-id=259 op=LOAD Jan 28 00:03:04.441000 audit[5757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.441000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.441000 audit: BPF prog-id=259 op=UNLOAD Jan 28 00:03:04.441000 audit[5757]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.441000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.441000 audit: BPF prog-id=258 op=UNLOAD Jan 28 00:03:04.441000 audit[5757]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.441000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.441000 audit: BPF prog-id=260 op=LOAD Jan 28 00:03:04.441000 audit[5757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=3134 pid=5757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:04.441000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361316265333362333634303261333838383831666462356530616162 Jan 28 00:03:04.472232 containerd[1664]: time="2026-01-28T00:03:04.472194952Z" level=info msg="StartContainer for \"3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800\" returns successfully" Jan 28 00:03:04.634370 kubelet[2953]: E0128 00:03:04.634326 2953 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.6.5:40604->10.0.6.48:2379: read: connection timed out" Jan 28 00:03:05.337967 systemd[1]: cri-containerd-34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9.scope: Deactivated successfully. Jan 28 00:03:05.338318 systemd[1]: cri-containerd-34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9.scope: Consumed 4.727s CPU time, 67.4M memory peak. Jan 28 00:03:05.337000 audit: BPF prog-id=261 op=LOAD Jan 28 00:03:05.337000 audit: BPF prog-id=83 op=UNLOAD Jan 28 00:03:05.341207 containerd[1664]: time="2026-01-28T00:03:05.341158945Z" level=info msg="received container exit event container_id:\"34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9\" id:\"34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9\" pid:2787 exit_status:1 exited_at:{seconds:1769558585 nanos:340849264}" Jan 28 00:03:05.341000 audit: BPF prog-id=98 op=UNLOAD Jan 28 00:03:05.341000 audit: BPF prog-id=102 op=UNLOAD Jan 28 00:03:05.362903 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9-rootfs.mount: Deactivated successfully. 
Jan 28 00:03:05.385714 kubelet[2953]: I0128 00:03:05.384152 2953 scope.go:117] "RemoveContainer" containerID="34fad8d09f806caf5b5780989e785602f8c4a27f518bb59a325e3aa295559be9" Jan 28 00:03:05.387545 containerd[1664]: time="2026-01-28T00:03:05.387364607Z" level=info msg="CreateContainer within sandbox \"f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 28 00:03:05.404741 containerd[1664]: time="2026-01-28T00:03:05.404570260Z" level=info msg="Container e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:03:05.420936 containerd[1664]: time="2026-01-28T00:03:05.420854910Z" level=info msg="CreateContainer within sandbox \"f4cde5f3478c209444195ecde63beaba0d6b60bc6ab8123b9337b26b9b96e21f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27\"" Jan 28 00:03:05.421702 containerd[1664]: time="2026-01-28T00:03:05.421676993Z" level=info msg="StartContainer for \"e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27\"" Jan 28 00:03:05.422866 containerd[1664]: time="2026-01-28T00:03:05.422837516Z" level=info msg="connecting to shim e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27" address="unix:///run/containerd/s/6280a52ce344780ff047bc0b07da428ba86951bd109fc5da4e2306008644126e" protocol=ttrpc version=3 Jan 28 00:03:05.444981 systemd[1]: Started cri-containerd-e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27.scope - libcontainer container e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27. Jan 28 00:03:05.455000 audit: BPF prog-id=262 op=LOAD Jan 28 00:03:05.456000 audit: BPF prog-id=263 op=LOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=263 op=UNLOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=264 op=LOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=265 op=LOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=265 op=UNLOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=264 op=UNLOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.456000 audit: BPF prog-id=266 op=LOAD Jan 28 00:03:05.456000 audit[5804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2619 pid=5804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:05.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537663730303932333532643365656363616639663861346634643435 Jan 28 00:03:05.483004 containerd[1664]: time="2026-01-28T00:03:05.482948541Z" level=info msg="StartContainer for \"e7f70092352d3eeccaf9f8a4f4d4515961f8dc6c303d211e3162abf8dfb81a27\" returns successfully" Jan 28 00:03:05.665267 kubelet[2953]: E0128 00:03:05.665153 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38" Jan 28 00:03:06.662899 kubelet[2953]: E0128 00:03:06.662807 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sv2v9" podUID="e61938e0-7077-4d81-9b34-6430d54d8b9f" Jan 28 00:03:08.662284 kubelet[2953]: E0128 00:03:08.662226 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-bmscm" podUID="1d6f938d-8e51-4e63-b408-0de368dbd7d7" Jan 28 00:03:09.661834 kubelet[2953]: E0128 00:03:09.661792 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-jlz2t" podUID="cca7bd93-1a7d-448c-ad36-6b956cecc82e" Jan 28 00:03:10.324791 systemd[1]: cri-containerd-fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9.scope: Deactivated successfully. Jan 28 00:03:10.324000 audit: BPF prog-id=267 op=LOAD Jan 28 00:03:10.325148 systemd[1]: cri-containerd-fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9.scope: Consumed 3.743s CPU time, 25.8M memory peak. 
Jan 28 00:03:10.327245 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 28 00:03:10.327310 kernel: audit: type=1334 audit(1769558590.324:955): prog-id=267 op=LOAD Jan 28 00:03:10.324000 audit: BPF prog-id=88 op=UNLOAD Jan 28 00:03:10.328219 kernel: audit: type=1334 audit(1769558590.324:956): prog-id=88 op=UNLOAD Jan 28 00:03:10.329143 containerd[1664]: time="2026-01-28T00:03:10.329109519Z" level=info msg="received container exit event container_id:\"fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9\" id:\"fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9\" pid:2796 exit_status:1 exited_at:{seconds:1769558590 nanos:328588278}" Jan 28 00:03:10.328000 audit: BPF prog-id=103 op=UNLOAD Jan 28 00:03:10.328000 audit: BPF prog-id=107 op=UNLOAD Jan 28 00:03:10.331815 kernel: audit: type=1334 audit(1769558590.328:957): prog-id=103 op=UNLOAD Jan 28 00:03:10.331883 kernel: audit: type=1334 audit(1769558590.328:958): prog-id=107 op=UNLOAD Jan 28 00:03:10.351850 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9-rootfs.mount: Deactivated successfully. Jan 28 00:03:10.410139 kubelet[2953]: I0128 00:03:10.410107 2953 scope.go:117] "RemoveContainer" containerID="fe9e64336f1543bd833a4b647252a721779182d39d4374069705c4025dc5b0f9" Jan 28 00:03:10.412310 containerd[1664]: time="2026-01-28T00:03:10.412274212Z" level=info msg="CreateContainer within sandbox \"f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 28 00:03:10.422764 containerd[1664]: time="2026-01-28T00:03:10.422195322Z" level=info msg="Container 358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:03:10.430810 containerd[1664]: time="2026-01-28T00:03:10.430772628Z" level=info msg="CreateContainer within sandbox \"f200ea53b83c58812861d48194f3de263e3863b6bce167d564f42ba2772872fe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd\"" Jan 28 00:03:10.431529 containerd[1664]: time="2026-01-28T00:03:10.431303510Z" level=info msg="StartContainer for \"358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd\"" Jan 28 00:03:10.432551 containerd[1664]: time="2026-01-28T00:03:10.432525833Z" level=info msg="connecting to shim 358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd" address="unix:///run/containerd/s/9c667f694dadb34d4804ddbb332cbba113438327055098393e17246dafe329be" protocol=ttrpc version=3 Jan 28 00:03:10.455337 systemd[1]: Started cri-containerd-358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd.scope - libcontainer container 358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd. 
Jan 28 00:03:10.464000 audit: BPF prog-id=268 op=LOAD Jan 28 00:03:10.472068 kernel: audit: type=1334 audit(1769558590.464:959): prog-id=268 op=LOAD Jan 28 00:03:10.472178 kernel: audit: type=1334 audit(1769558590.465:960): prog-id=269 op=LOAD Jan 28 00:03:10.472205 kernel: audit: type=1300 audit(1769558590.465:960): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.465000 audit: BPF prog-id=269 op=LOAD Jan 28 00:03:10.465000 audit[5851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.475688 kernel: audit: type=1327 audit(1769558590.465:960): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.465000 audit: BPF prog-id=269 op=UNLOAD Jan 28 00:03:10.476983 kernel: audit: type=1334 audit(1769558590.465:961): prog-id=269 op=UNLOAD Jan 28 00:03:10.465000 audit[5851]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.480244 kernel: audit: type=1300 audit(1769558590.465:961): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.466000 audit: BPF prog-id=270 op=LOAD Jan 28 00:03:10.466000 audit[5851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.466000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.470000 audit: BPF prog-id=271 op=LOAD Jan 28 00:03:10.470000 audit[5851]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.470000 audit: BPF prog-id=271 op=UNLOAD Jan 28 00:03:10.470000 audit[5851]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.470000 audit: BPF prog-id=270 op=UNLOAD Jan 28 00:03:10.470000 audit[5851]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.470000 audit: BPF prog-id=272 op=LOAD Jan 28 00:03:10.470000 audit[5851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2655 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:03:10.470000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335386531653164336234353561343931616134326538633264386636 Jan 28 00:03:10.507068 containerd[1664]: time="2026-01-28T00:03:10.507007260Z" level=info msg="StartContainer for \"358e1e1d3b455a491aa42e8c2d8f6b5d7d6d547f2d4700830fc5e848add0f8cd\" returns successfully" Jan 28 00:03:12.661933 kubelet[2953]: E0128 00:03:12.661874 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-fcc6fc97b-npt4s" podUID="2e0df84e-16dc-494b-a3d3-1071788a0777" Jan 28 00:03:14.635676 kubelet[2953]: E0128 00:03:14.635637 2953 controller.go:195] "Failed to update lease" err="the server was unable to 
return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4593-0-0-n-485d202ac1)" Jan 28 00:03:14.979763 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec Jan 28 00:03:15.662292 kubelet[2953]: E0128 00:03:15.662249 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-bb4448d88-2hkvv" podUID="7ec7f1f1-9073-42e6-9a8a-d6f3b105d21c" Jan 28 00:03:15.671382 systemd[1]: cri-containerd-3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800.scope: Deactivated successfully. Jan 28 00:03:15.671959 containerd[1664]: time="2026-01-28T00:03:15.671813147Z" level=info msg="received container exit event container_id:\"3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800\" id:\"3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800\" pid:5769 exit_status:1 exited_at:{seconds:1769558595 nanos:671579746}" Jan 28 00:03:15.681490 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 28 00:03:15.681593 kernel: audit: type=1334 audit(1769558595.678:967): prog-id=256 op=UNLOAD Jan 28 00:03:15.678000 audit: BPF prog-id=256 op=UNLOAD Jan 28 00:03:15.678000 audit: BPF prog-id=260 op=UNLOAD Jan 28 00:03:15.682759 kernel: audit: type=1334 audit(1769558595.678:968): prog-id=260 op=UNLOAD Jan 28 00:03:15.693417 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800-rootfs.mount: Deactivated successfully. 
Jan 28 00:03:16.428851 kubelet[2953]: I0128 00:03:16.428815 2953 scope.go:117] "RemoveContainer" containerID="0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d" Jan 28 00:03:16.429445 kubelet[2953]: I0128 00:03:16.429402 2953 scope.go:117] "RemoveContainer" containerID="3a1be33b36402a388881fdb5e0aab14e9a23d28307a4ed999a5c206e46dff800" Jan 28 00:03:16.430039 kubelet[2953]: E0128 00:03:16.429839 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-zsnf5_tigera-operator(19eb2c97-7f89-4c70-952d-90de4520251e)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-zsnf5" podUID="19eb2c97-7f89-4c70-952d-90de4520251e" Jan 28 00:03:16.430449 containerd[1664]: time="2026-01-28T00:03:16.430420531Z" level=info msg="RemoveContainer for \"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\"" Jan 28 00:03:16.436019 containerd[1664]: time="2026-01-28T00:03:16.435940468Z" level=info msg="RemoveContainer for \"0c38f0804988c1469ab2fea08380aaa3a6974cfa142820caf9104519e931225d\" returns successfully" Jan 28 00:03:19.662657 kubelet[2953]: E0128 00:03:19.662594 2953 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-79c89656f-55nfw" podUID="a85ad95c-92af-4836-ae16-c3e124882e38"